by Ranjan Dailata
Disclaimer: This template is only available on n8n self-hosted, as it makes use of the MCP Client community node.

Who this is for? The Scrape Web Data with Bright Data and MCP Automated AI Agent workflow is built for professionals who need to automate large-scale, intelligent data extraction by utilizing the Bright Data MCP Server and Google Gemini. This solution is ideal for:

Data Analysts - Who require structured, enriched datasets for analysis and reporting.
Marketing Researchers - Seeking fresh market intelligence from dynamic web sources.
Product Managers - Who want competitive product and feature insights from various websites.
AI Developers - Aiming to feed web data into downstream machine learning models.
Growth Hackers - Looking for high-quality data to fuel campaigns, research, or strategic targeting.

What problem is this workflow solving? Manually scraping websites, cleaning raw HTML, and generating useful insights from it is slow, error-prone, and hard to scale. This workflow solves these problems by:

Automating complex web data extraction through Bright Data's MCP Server.
Reducing the human effort needed to clean, parse, and analyze unstructured web content.
Allowing seamless integration into further automation processes.

What this workflow does? This n8n workflow performs the following steps:

Trigger: Start manually.
Input URL(s): Specify the URL(s) to scrape.
Web Scraping (Bright Data): Use Bright Data's MCP Server tools to scrape the web data in Markdown and HTML format.
Store / Output: Save results to disk and send a webhook notification.

Setup

Please make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
Please make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine (a local launch check is sketched after this entry).
Sign up at Bright Data.
Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
In n8n, configure the credentials for the MCP Client (STDIO) account with the Bright Data MCP Server as shown below. Make sure to copy the Bright Data API token into the Environments textbox as API_TOKEN=<your-token>.
Update the LinkedIn person and company URLs in the workflow.
Update the Webhook HTTP Request node with the webhook endpoint of your choice.
Update the file name and path used to persist results on disk.

How to customize this workflow to your needs

Different Inputs: Instead of static URLs, accept URLs dynamically via webhook or form submissions.
Outputs: Update the webhook endpoints to send the response to Slack channels, Airtable, Notion, CRM systems, etc.
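As a quick sanity check before wiring the MCP Client (STDIO) credential, here is a minimal sketch (an assumption, not part of the template) that launches the Bright Data MCP server locally with your token. The command and environment variable mirror what the n8n credential expects:

```ts
// Quick local probe: does the Bright Data MCP server start with your token?
// Assumes Node 18+; "npx -y @brightdata/mcp" mirrors the command/args you
// would enter in the n8n MCP Client (STDIO) credential.
import { spawn } from "node:child_process";

const proc = spawn("npx", ["-y", "@brightdata/mcp"], {
  // API_TOKEN is the same key you paste into the Environments textbox.
  env: { ...process.env, API_TOKEN: "<your-token>" },
  stdio: ["pipe", "pipe", "inherit"],
});

proc.stdout.on("data", (chunk: Buffer) => console.log(chunk.toString()));
setTimeout(() => proc.kill(), 10_000); // stop the probe after 10 seconds
```

If the process prints MCP handshake output rather than an authentication error, the same token should work inside n8n.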
by ScrapeOps
Amazon Product Price Tracker

This workflow automatically monitors Amazon product prices, tracks price changes, and sends alerts when significant price fluctuations occur. Built with ScrapeOps' structured data API, it provides a reliable, maintenance-free solution for price tracking without worrying about anti-bot measures or complex selectors.

What This Workflow Does

Monitors multiple Amazon products simultaneously using their ASINs
Calculates both absolute and percentage price changes (see the sketch after this entry)
Sends customizable email alerts when prices cross defined thresholds
Maintains a historical record of all price data for trend analysis
Updates a Google Sheet with the latest price information

Prerequisites

A ScrapeOps API key (register at https://scrapeops.io)
Google account for Google Sheets integration
SMTP email configuration for alerts

Setup Instructions

Spreadsheet Setup
Make a copy of the template spreadsheet: https://docs.google.com/spreadsheets/d/1hRv-TBXrpN6rkIU65WorttNHt-IPWas_An0sF4Of39U
Add your Amazon product ASINs in the "Products to Monitor" sheet
Set your desired alert thresholds for price increases/decreases

Workflow Configuration
Add your ScrapeOps API key to the "Setup" node
Update the spreadsheet URL in the "Setup" node with YOUR copy
Configure your email settings for notifications
Adjust the schedule frequency as needed (default: hourly)

How It Works

The workflow reads product ASINs from your Google Sheet, fetches current pricing data via ScrapeOps' Amazon Product API, calculates price changes, updates your spreadsheet, and sends alerts when price movements exceed your defined thresholds. Unlike traditional web scrapers that break when websites change, this solution uses ScrapeOps' reliable API, which handles all the complexity of Amazon data extraction, ensuring consistent results without maintenance.

Additional Notes

This workflow is ideal for deal hunters, price comparison services, and e-commerce analytics
The alerting system can be extended to additional channels like Slack or Telegram
ScrapeOps handles all anti-bot measures, proxy management, and parsing complexities
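To make the price-change logic concrete, here is a minimal TypeScript sketch of the calculation the workflow applies per product. The ASIN and threshold values are illustrative, not taken from the template; in the workflow they come from the "Products to Monitor" sheet:

```ts
// Minimal sketch of per-ASIN price-change evaluation.
interface PriceCheck {
  asin: string;
  previousPrice: number;   // last price stored in the sheet
  currentPrice: number;    // price just fetched via the API
  alertThresholdPct: number; // e.g. 5 means alert on a +/-5% move
}

function evaluate(p: PriceCheck) {
  const absoluteChange = p.currentPrice - p.previousPrice;
  const percentChange = (absoluteChange / p.previousPrice) * 100;
  // Alert on moves in either direction that cross the threshold.
  const shouldAlert = Math.abs(percentChange) >= p.alertThresholdPct;
  return { asin: p.asin, absoluteChange, percentChange, shouldAlert };
}

// Illustrative values only.
console.log(evaluate({ asin: "B000000000", previousPrice: 49.99, currentPrice: 44.99, alertThresholdPct: 5 }));
// -> absoluteChange: -5, percentChange: ~-10, shouldAlert: true
```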
by Ranjan Dailata
Disclaimer: This template is only available on n8n self-hosted, as it makes use of the MCP Client community node.

Who this is for? The Chat Conversations with Bright Data MCP Search Engines & Google Gemini workflow is designed for users who need real-time, AI-enhanced conversations powered by live search engine results. This workflow is tailored for:

Data Analysts - Who want live, search-based data fused with AI reasoning.
Marketing Researchers - Seeking up-to-the-minute market or competitor insights via conversational AI.
Product Managers - Exploring user needs, market trends, and competitor analysis in real time.
AI Developers - Building dynamic applications that combine live search data with intelligent conversation agents.
Growth Hackers - Who need fast, conversational research tools for campaign ideation, outreach, or content creation.

What problem is this workflow solving? Traditional chatbots and AI systems often rely on static, outdated data. This workflow enables AI agents to fetch live search engine data and converse intelligently about it, making interactions dynamic, accurate, and highly contextual. It closes three major gaps:

Outdated Knowledge: Regular chatbots lack up-to-date information from live web searches.
Manual Search Fatigue: Manually searching for information and interpreting it is time-consuming.
Context Bridging: Connecting search results into meaningful, conversational replies requires human-level reasoning.

What this workflow does?

Accepts a user's conversational query input.
Triggers a search request to Bright Data's MCP Search Engines API (Google, Bing, etc.) based on the query.
Waits for the search task to complete.
Retrieves real-time search results.
Feeds the search results and original question into Google Gemini (sketched after this entry).
Generates a human-like, contextually accurate AI response combining live information and conversational flow.
Outputs the response back into a chat app.

Pre-conditions

Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post - model-context-protocol.
You need a Bright Data account and the setup described in the Setup section below.
You need a Google Gemini API key. Visit Google AI Studio.
You need to install the Bright Data MCP Server @brightdata/mcp.
You need to install n8n-nodes-mcp.

Setup

Please make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
Please make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine. Also, complete the "Account Setup" described at the @brightdata/mcp URL.
Sign up at Bright Data.
Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
In n8n, configure the credentials for the MCP Client (STDIO) account with the Bright Data MCP Server as shown below. Make sure to copy the Bright Data Web Unlocker API token into the Environments textbox as API_TOKEN=<your-token>.
Update the HTTP Request for Webhook Notification node to send the webhook notification for chat responses.

How to customize this workflow to your needs

Change Search Engine: Add or remove the Search Engine MCP tools based on Bright Data MCP Server updates.
Expand Outputs: Send AI chat responses to Slack, Discord, custom chat UIs, WhatsApp, or CRM systems. Store conversation logs in a database (PostgreSQL, MongoDB, etc.) for future audits or training.
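To make the "feed search results plus question into Gemini" step concrete, here is a hedged TypeScript sketch assuming Google's Generative Language REST API. The model name and prompt wording are assumptions for illustration, not the template's exact node configuration:

```ts
// Sketch: fuse live search results with the user's question via Gemini.
// Assumes Node 18+ (global fetch) and a GEMINI_API_KEY env variable.
const GEMINI_KEY = process.env.GEMINI_API_KEY!;

async function answerWithSearchContext(question: string, searchResults: string): Promise<string> {
  // Prompt structure is illustrative; tune it to your conversational style.
  const prompt =
    `Using only the live search results below, answer the question conversationally.\n\n` +
    `Search results:\n${searchResults}\n\nQuestion: ${question}`;

  const res = await fetch(
    // Model name is an assumption; use whichever Gemini model your key can access.
    `https://generativelanguage.googleapis.com/v1beta/models/gemini-1.5-flash:generateContent?key=${GEMINI_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
    },
  );
  const data = await res.json();
  return data.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
}
```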
by Eduard
This workflow creates a documentation system for n8n instances using Docsify.js. It serves a dynamic documentation website that allows users to:

View an overview of all workflows in a tabular format
Filter workflows by tags
Access automatically generated documentation for each workflow
Edit documentation with a live Markdown preview
Visualize workflow structures using Mermaid.js diagrams

> 📺 Check out the short 2-min demonstration on LinkedIn. Don't forget to connect!

🔧 Key Components

Main Documentation Portal
Serves a Docsify-powered website (a config sketch follows this entry)
Provides a navigation sidebar with workflow tags
Displays workflow status, creation date, and documentation links

Documentation Generator
Uses a GPT model to auto-generate workflow descriptions
Creates Mermaid.js diagrams of workflow structures
Maintains a consistent documentation format

Live Editor
Split-screen Markdown editor with preview
Real-time Mermaid diagram rendering
Save/Cancel functionality

⚙️ Technical Details

Environment Setup
Requires write access to the specified project directory
Uses environment variables for n8n instance URL configuration
Implements webhook endpoints for serving documentation

⚠️ Security Considerations
> Note: The current implementation doesn't include authentication for editing. Consider adding authentication for production use.

Dependencies
Docsify.js for documentation rendering
Mermaid.js for workflow visualization
OpenAI GPT for documentation generation

🔍 Part of the n8n Observability Series

This workflow is part of a broader series focused on n8n instance observability. Check out these related workflows:

Workflow Dashboard - Get comprehensive analytics of your n8n instance
Visualize Your n8n Workflows with Mermaid.js - Create beautiful workflow visualizations

Each workflow in this series helps you better understand and manage your n8n automation ecosystem!
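For orientation, here is a minimal sketch of the kind of Docsify runtime configuration such a portal serves. The option values are illustrative assumptions, and in a real deployment this lives in an inline script in index.html rather than a TypeScript module:

```ts
// Illustrative Docsify runtime config (normally an inline <script> in index.html).
// Option names are standard Docsify settings; values here are assumptions.
declare global {
  interface Window { $docsify: Record<string, unknown>; }
}

window.$docsify = {
  name: "n8n Workflow Documentation",
  loadSidebar: true,  // pulls _sidebar.md, i.e. the tag-based navigation
  subMaxLevel: 2,     // heading depth shown in the sidebar
  search: "auto",     // requires the Docsify search plugin
};

export {};
```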
by Ranjan Dailata
Who this is for? The Structured Data Extract & Data Mining workflow is crafted for researchers, content analysts, SEO strategists, and AI developers who need to transform semi-structured web data (like markdown content or scraped HTML) into actionable structured datasets. It is ideal for:

**Content Analysts** - Organizing and mining large volumes of markdown or HTML content.
**SEO & Trend Researchers** - Exploring topics by location and category.
**AI Engineers & NLP Developers** - Looking to automate insight extraction from unstructured inputs.
**Growth Marketers** - Tracking topic-level trends for strategic campaigns.
**Automation Specialists** - Streamlining workflows from scrape to storage.

What problem is this workflow solving? Extracting insights from markdown or HTML documents typically requires manual review, formatting, and parsing. This becomes unscalable when dealing with large datasets or when a real-time response is needed. Additionally, trend and topic extraction usually involves external tools, custom scripts, and inconsistent formatting. This workflow solves:

Automatic text extraction from markdown or structured content.
Location- and category-based trend mining with semantic grouping.
AI-driven topic extraction and summarization.
Real-time notification via webhook with rich structured payloads.
Persistent storage of mined data to disk for audits or further processing.

What this workflow does

Receives input: Sets the URL for the data extraction and analysis.
Uses Bright Data's Web Unlocker to extract content from relevant sites (a request sketch follows this entry).
A Markdown/Text Extractor node parses the content into clean plaintext.
The cleaned data is passed to Google Gemini to: identify trends by location and category, extract key topics and themes, and format the response into structured JSON.
The structured insights are sent via Webhook Notification to external systems (e.g., Slack, web apps, Zapier).
The final output is saved to disk.

Setup

Sign up at Bright Data.
Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
A Google Gemini API key (or access through Vertex AI or proxy).
Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs

**Update Source**: Update the workflow input to read from Google Sheets or Airtable to dynamically track multiple brands or topics.
**Gemini Prompt Customization**: Extract trends within a custom category (e.g., e-commerce design patterns in the US), output topics with popularity metrics, or structure the output as per your database schema (e.g., [{ topic, trend_score, location }]).
**Webhook Output**: Send notifications to Slack (with AI summaries in rich blocks), internal APIs (for use in dashboards), or Zapier/Make (for multi-step automation).
**Persistence**: Save output to remote FTP or SFTP storage, Amazon S3, Google Cloud Storage, etc.
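As a reference for the Web Unlocker step, here is a hedged TypeScript sketch assuming Bright Data's /request endpoint. The zone name and target URL are placeholders, and the template itself performs this call through an n8n HTTP Request node with the Header Auth credential described above:

```ts
// Sketch: fetch a page through a Bright Data Web Unlocker zone.
// Assumes Node 18+ and a BRIGHTDATA_TOKEN env variable (the Web Unlocker token).
const BRIGHTDATA_TOKEN = process.env.BRIGHTDATA_TOKEN!;

async function unlock(url: string, zone: string): Promise<string> {
  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${BRIGHTDATA_TOKEN}`, // the Header Auth value in n8n
      "Content-Type": "application/json",
    },
    // "raw" returns the page body as-is; the Markdown/Text Extractor cleans it afterwards.
    body: JSON.stringify({ zone, url, format: "raw" }),
  });
  return res.text();
}

// Placeholder zone name and URL, matching the Set URL and Bright Data Zone node.
unlock("https://example.com/brand-page", "web_unlocker1").then((html) => console.log(html.length));
```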
by Gavin
This template gives you the ability to monitor all uplinks in your Meraki Dashboard and then alert your team in the method you prefer. This example sends a Teams notification to our Dispatch channel. Setup will take roughly 30 minutes to 1 hour with the template provided. The most time-intensive steps are obtaining a Meraki API key, which I cover, and setting up the Teams node, for which n8n has good documentation. Tutorial & explanation: https://www.youtube.com/watch?v=JvaN0dNwRNU
by phil
AI-Powered SEO Keyword Research Workflow with n8n

> Automates comprehensive keyword research for content creation.

Table of Contents
Introduction
Workflow Architecture
NocoDB Integration
Data Flow
Core Components
Setup Requirements
Possible Improvements

Introduction

This n8n workflow automates SEO keyword research using AI and data-driven analytics. It combines OpenAI's language models with DataForSEO's analytics to generate comprehensive keyword strategies for content creation. The workflow is triggered by a webhook from NocoDB, processes the input data through multiple stages, and returns a detailed content brief with optimized keywords.

Workflow Architecture

The workflow follows a structured process:

Input Collection: Receives data via webhook from NocoDB
Topic Expansion: Generates keywords using AI
Keyword Metrics Analysis: Gathers search volume, CPC, and difficulty metrics
Competitor Analysis: Analyzes competitor content for ranking keywords
Final Strategy Creation: Combines all data to generate a comprehensive keyword strategy
Output Storage: Saves results back to NocoDB and sends notifications

NocoDB Integration

Database Structure

The workflow integrates with two tables in NocoDB:

Input Table Schema: collects the input parameters for the keyword research.

| Field Name | Type | Description |
| --------------- | ------------- | ---------------------------------------------------------------------------- |
| ID | Auto Number | Unique identifier |
| Primary Topic | Text | The main keyword/topic to research |
| Competitor URLs | Text | Comma-separated list of competitor websites |
| Target Audience | Single Select | Description of the target audience (Solopreneurs, Marketing Managers, etc.) |
| Content Type | Single Select | Type of content (Blog, Product page, etc.) |
| Location | Single Select | Target geographic location |
| Language | Single Select | Target language for keywords |
| Status | Single Select | Workflow status (Pending, Started, Done) |
| Start Research | Checkbox | Activates the workflow when set to true |

Output Table Schema: stores the generated keyword strategy.

| Field Name | Type | Description |
| ------------------ | ----------- | ------------------------------------------------ |
| ID | Auto Number | Unique identifier |
| primary_topic_used | Text | The topic that was researched |
| report_content | Long Text | The complete keyword strategy in Markdown format |
| generatedAt | Datetime | Automatically generated by NocoDB |

Webhook Settings
NocoDB webhook settings (see the template's screenshot).

Data Flow

The workflow handles data in the following sequence:

Webhook Trigger: Receives input from NocoDB when a new keyword research request is created
Field Extraction: Extracts primary topic, competitor URLs, audience, and other parameters
AI Topic Expansion: Uses OpenAI to generate related keywords, categorized by type and intent
Keyword Analysis: Sends primary keywords to DataForSEO to get search volume, CPC, and difficulty
Competitor Research: Analyzes competitor pages to identify their keyword rankings
Strategy Generation: Combines all data to create a comprehensive keyword strategy
Storage & Notification: Saves the strategy to NocoDB and sends a notification to Slack

Core Components

1. Topic Expansion
This component uses OpenAI and a structured output parser to generate:
20 primary keywords
30 long-tail keywords with search intent
15 question-based keywords
10 related topics
2. DataForSEO Integration
Two API endpoints are used (a request sketch appears at the end of this entry):
**Search Volume & CPC**: Gets monthly search volume and cost-per-click data
**Keyword Difficulty**: Evaluates how difficult it would be to rank for each keyword

3. Competitor Analysis
This component:
Analyzes competitor URLs to identify which keywords they rank for
Identifies content gaps or opportunities
Determines the search intent their content targets

4. Final Keyword Strategy
The AI-generated strategy includes:
Top 10 primary keywords with metrics
15 long-tail opportunities with low competition
5 question-based keywords to address in content
Content structure recommendations
3 potential content titles optimized for SEO

Setup Requirements

To use this workflow, you'll need:
n8n Instance: Either cloud or self-hosted
NocoDB Account: For data input and storage
API Keys: OpenAI API key, DataForSEO API credentials, Slack API token (for notifications)
Database Setup: Create the required tables in NocoDB as described above

Possible Improvements

The workflow could be enhanced with the following improvements:

Enhanced Keyword Strategy
Add topic clustering to group related keywords
Enhance the final output with more specific content structure suggestions
Include word count recommendations for each content section

Additional Data Sources
Integrate Google Search Console data for existing content optimization
Add Google Trends data to identify rising topics
Include sentiment analysis for different keyword groups

Improved Competitor Analysis
Analyze content length and structure from top-ranking pages
Identify common backlink sources for competitor content
Extract content headings to better understand content organization

Automation Enhancements
Add scheduling capabilities to run updates on existing content
Implement content performance tracking over time
Create alert thresholds for changes in keyword difficulty or search volume

Example Output

Here is an example output the workflow generated based on the following inputs.

Inputs:
Primary Topic: AI Automation
Competitor URLs: n8n.io, zapier.com, make.com
Target Audience: Small Business Owners
Content Type: Landing Page
Location: United States
Language: English

Output: Final Keyword Strategy

The workflow provides a powerful automation for content marketers and SEO specialists to develop data-driven keyword strategies with minimal manual effort.

> Original Workflow: AI-Powered SEO Keyword Research Automation - The vibe Marketer
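To illustrate the first of the two DataForSEO endpoints, here is a hedged TypeScript sketch assuming the keywords_data "live" search-volume endpoint with Basic authentication. The location and language values mirror the NocoDB input fields, and the workflow itself issues this call from an HTTP Request node:

```ts
// Sketch: fetch search volume and CPC for a batch of keywords from DataForSEO.
// Assumes Node 18+ and DFS_LOGIN / DFS_PASSWORD env variables (Basic auth).
const auth = Buffer.from(`${process.env.DFS_LOGIN}:${process.env.DFS_PASSWORD}`).toString("base64");

async function searchVolume(keywords: string[]) {
  const res = await fetch(
    "https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live",
    {
      method: "POST",
      headers: { Authorization: `Basic ${auth}`, "Content-Type": "application/json" },
      // DataForSEO accepts an array of task objects; values mirror the input table.
      body: JSON.stringify([
        { keywords, location_name: "United States", language_name: "English" },
      ]),
    },
  );
  const data = await res.json();
  // Each result item carries search_volume, cpc, and competition fields.
  return data.tasks?.[0]?.result ?? [];
}

searchVolume(["ai automation", "workflow automation"]).then(console.log);
```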
by Dr. Firas
👉 Build a Phone Agent to qualify outbound leads and schedule inbound calls

Who is this for?

This workflow is designed for sales teams, call centers, and businesses handling both outbound and inbound lead calls who want to automate their qualification, follow-up, and call documentation process without manual intervention. It's ideal for teams using Google Sheets, RetellAI, OpenAI, and Gmail as part of their tech stack.

Real-World Use Cases

🛍 E-commerce – Instantly handle product FAQs and order status checks, 24/7.
🏬 Retail Stores – Share store hours, directions, and return policies without lifting a finger.
🍽 Restaurants – Take reservations or answer menu questions automatically.
💼 Service Providers – Book appointments or consultations while you focus on your craft.
📞 Any Local Business – Deliver friendly, consistent phone support — no live agent required.

What problem is this workflow solving?

Managing lead calls at scale can be chaotic—between scheduling outbound qualification calls, handling inbound appointment requests, and making sure every call is documented and followed up. This workflow automates the entire process, reducing human error and saving time by:

✅ Sending reminders to reps for outbound calls
✅ Automatically placing calls with RetellAI
✅ Handling inbound calls and checking caller details
✅ Generating and emailing call summaries automatically

What this workflow does

This n8n template connects Google Sheets, RetellAI, OpenAI, and Gmail into a seamless workflow:

Outbound Lead Qualification Workflow
Triggers when a new lead is added to Google Sheets
Sends an SMS notification to remind the rep to call in 5 minutes
(Optional) Waits 5 minutes
Initiates an automated call to the lead via RetellAI (see the sketch after this entry)

Inbound Call Appointment Scheduler
Receives inbound calls from RetellAI (via webhook)
Checks if the caller's number exists in Google Sheets
Responds to RetellAI with a success or error message

Post-Call Workflow
Receives post-call data from RetellAI
Filters only analyzed calls
Updates the lead's record in Google Sheets
Uses OpenAI to generate a call summary
Emails the summary to a team inbox or rep

Setup

✅ You need an active RetellAI API key. Sign up for RetellAI, create an agent, and set the webhook URLs (n8n_call for call events). Purchase a Twilio phone number and link it to the agent.
✅ Your Google Sheet must have a column for phone numbers (e.g., "Phone")
✅ Gmail account connected and authorized in n8n
✅ OpenAI API key added to your environment variables or credentials

Configure your Google Sheets node with the correct spreadsheet ID and range
Add your RetellAI API key to the HTTP request nodes
Connect your Gmail account in the Gmail node
Add your OpenAI key in the OpenAI node

👉 See full setup guide here: Notion Documentation

How to customize this workflow to your needs

**Change SMS content**: Edit the text in the "Send SMS reminder" node to match your team's tone
**Modify call wait time**: Enable and adjust the "Wait 5 minutes" node to any delay you prefer
**Add CRM integration**: Replace or extend the Google Sheets node to update your CRM instead of a spreadsheet
**Customize call summary prompts**: Edit the prompt sent to OpenAI to change the summary style or add extra insights
**Send email to different recipients**: Change the recipient address in the Gmail node or make it dynamic from the lead record

Need help customizing? Contact me for consulting and support: Linkedin
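For context on the "initiates an automated call via RetellAI" step, here is a hedged TypeScript sketch. The endpoint path and field names are assumptions based on Retell's public API and may differ by version; the from_number must be the Twilio number linked to your agent:

```ts
// Sketch: place an outbound qualification call through RetellAI.
// Endpoint path and body fields are assumptions; verify against your API version.
async function placeQualificationCall(leadPhone: string) {
  const res = await fetch("https://api.retellai.com/v2/create-phone-call", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.RETELL_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      from_number: "+15550001234", // placeholder: your Twilio number linked to the agent
      to_number: leadPhone,        // the lead's phone from the Google Sheet
    }),
  });
  // Returns a call object whose id is later matched by the post-call webhook.
  return res.json();
}
```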
by Akhil Varma Gadiraju
n8n Workflow: Sync Workflows with GitLab

How It Works

This workflow ensures that your self-hosted n8n workflows are version-controlled in a GitLab repository. It compares each current workflow from n8n with its stored counterpart in GitLab. If any differences are detected, the GitLab file is updated with the latest version.

Core Logic:
Retrieve Workflows – Fetch all workflows from the n8n REST API.
Compare with GitLab – For each workflow, fetch the corresponding file from GitLab and compare the JSON.
Update if Changed – If differences exist, commit the updated workflow to GitLab using its API (sketched after this entry).

Setup

Before using the workflow, ensure the following:

Prerequisites:
**n8n**: Self-hosted instance with access to the /rest/workflows API.
**GitLab**: A repository where workflows will be stored, and a Personal Access Token (PAT) with api and write_repository permissions.
**n8n Nodes Required**: HTTP Request (to call the n8n and GitLab APIs), Code or Function nodes (for diffing and formatting), and looping (SplitInBatches or similar).

Configuration:
Set environment variables or workflow credentials for:
GITLAB_TOKEN
GITLAB_REPO
GITLAB_BRANCH (e.g., main)
GITLAB_FILE_PATH_PREFIX (e.g., n8n-workflows/)

How to Use

Import the workflow into your n8n instance.
Configure GitLab API credentials: set the GitLab PAT as a header in the HTTP Request node: Private-Token: {{ $env.GITLAB_TOKEN }}
Map workflows to GitLab paths: use the workflow name or ID to create the file path. Example: n8n-workflows/workflow-name.json
Trigger the workflow: manually, or on a schedule (e.g., daily).
Review commits in GitLab: each updated workflow will be committed with a message like: "Update workflow: Sample Workflow"

Disclaimer

This workflow does not handle merge conflicts or manual edits made directly in GitLab. Always ensure proper coordination if multiple sources are modifying workflows. Only structural changes are tracked; non-functional metadata (like timestamps or IDs) may trigger false positives unless filtered. Use at your own risk. Test in a safe environment before applying to production workflows.
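The "Update if Changed" commit corresponds to a single call to GitLab's repository files API. Here is a minimal sketch using the same environment variables as the Configuration list above; note that PUT assumes the file already exists, while the first commit of a new workflow would use POST instead:

```ts
// Sketch: commit an updated workflow JSON to GitLab (the HTTP Request node's call).
// Assumes Node 18+ and the GITLAB_* env variables from the Configuration section.
async function commitWorkflow(name: string, workflowJson: object) {
  // GITLAB_REPO may be a numeric project ID or a "group/project" path.
  const project = encodeURIComponent(process.env.GITLAB_REPO!);
  const filePath = encodeURIComponent(`${process.env.GITLAB_FILE_PATH_PREFIX ?? ""}${name}.json`);

  const res = await fetch(
    `https://gitlab.com/api/v4/projects/${project}/repository/files/${filePath}`,
    {
      method: "PUT", // PUT updates an existing file; use POST to create it initially
      headers: {
        "Private-Token": process.env.GITLAB_TOKEN!,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        branch: process.env.GITLAB_BRANCH ?? "main",
        content: JSON.stringify(workflowJson, null, 2),
        commit_message: `Update workflow: ${name}`,
      }),
    },
  );
  return res.json();
}
```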
by Joseph
(Image Generation → Hosting → Video Generation)

This workflow is designed for creators, automation enthusiasts, and indie hackers who want to generate image-based videos automatically using AI tools — at a low cost.

⚙️ Workflow Overview

This automation performs the following steps:

Trigger (schedule or manual)
Generate an image using Flux (choose between two APIs)
Upload the image to Kraken.io to get a public URL (see the sketch after this entry)
Send the image to Runway ML (choose between two APIs) to generate a video
Receive the video as a URL — ready for posting, download, or further automation

🛠️ Step-by-Step Setup

🖼️ Flux (Image Generation)
You can use either of the following providers:

Option 1: Flux by Black Forest Labs (Direct API)
🔑 Get your API key here: https://docs.bfl.ml/
Paste your API key in the HTTP Request node named Flux (Blackforest)
You can customize prompts or styles inside the JSON body

Option 2: Flux via RapidAPI
🔑 Subscribe and get your key here: https://rapidapi.com/poorav925/api/ai-text-to-image-generator-flux-free-api/playground/apiendpoint_e38039ee-1912-4ef9-b4d4-270d72fca851
Enter your RapidAPI key in the X-RapidAPI-Key header
Optional: tweak prompts, style, or resolution inside the JSON body

🐙 Kraken.io (Hosting the Image Publicly)
Runway ML requires the image to be publicly accessible. We use Kraken.io to host the generated image and return a public URL.
🔑 Get your API credentials: https://kraken.io/account/api-credentials
Setup:
Copy your API Key and API Secret
Open the Kraken Upload node in n8n
Replace the placeholders with your credentials
The node uploads your image and returns a public image URL for Runway to use

🎬 RunwayML (Video Generation)
You also have two options here:

Option 1: Runway Official API
🔑 Get your credentials at: https://dev.runwayml.com/
Use the public image URL from Kraken in the JSON body
Paste your Bearer token in the Authorization header
Customize other settings like video length, style, FPS, etc.

Option 2: Runway via RapidAPI
🔑 Subscribe and get your key here: https://rapidapi.com/fortunehoppers/api/runwayml/playground/apiendpoint_93c8554d-8097-40cd-8252-3d4dec9c0e68
Paste your RapidAPI key in the request header
Customize prompt and generation options in the body
Use the Kraken-generated image URL as the input source

📤 What to Do with the Video

Once the video is generated, you'll get a direct video URL. You can:
Save it to Google Sheets or Notion
Send it via email
Trigger a YouTube upload automation
Or download it manually for editing and reposting

💡 Optional Tips & Notes

You can schedule this workflow to generate AI videos daily or weekly
Combine it with a Google Sheet of prompts for bulk automation
Try using a consistent visual style or theme for better branding
This workflow is lightweight and affordable — perfect for indie projects or experimental content generation
Great for shorts, quote visuals, music loops, AI art promos, etc.

🔗 Resources

Flux (Black Forest Labs) Docs
Flux on RapidAPI
RunwayML Official Docs
Runway on RapidAPI
Kraken.io API Dashboard

🙋 Need Help?

Feel free to reach out:
🐦 Twitter: @juppfy
📧 Email: joseph@uppfy.com
If you'd like to hire me for custom n8n workflows or product automations, don't hesitate to get in touch.
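To make the Kraken.io hosting step concrete, here is a hedged TypeScript sketch assuming the v1 upload API in wait mode. The multipart field names follow Kraken's documented format, and the returned kraked_url is the public URL handed to Runway:

```ts
// Sketch: upload a local image to Kraken.io and get back a public URL.
// Assumes Node 18+ (global fetch/FormData/Blob) and KRAKEN_KEY / KRAKEN_SECRET env vars.
import { readFile } from "node:fs/promises";

async function hostImage(path: string): Promise<string> {
  const form = new FormData();
  form.append(
    "data",
    JSON.stringify({
      auth: { api_key: process.env.KRAKEN_KEY, api_secret: process.env.KRAKEN_SECRET },
      wait: true, // block until Kraken finishes and returns the hosted URL
    }),
  );
  form.append("upload", new Blob([await readFile(path)]), "frame.png");

  const res = await fetch("https://api.kraken.io/v1/upload", { method: "POST", body: form });
  const data = await res.json();
  return data.kraked_url; // the public image URL used in the Runway request body
}
```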
by Max aka Mosheh
How it works: The n8n flow grabs the needed IDs, fetches the record's current links, adds your new one, and sends a single HTTP request to NocoDB (sketched below) to update the record's linked entries. Set up steps: Plan for about 10 minutes of setup if you're already running n8n and NocoDB. You'll need to copy/paste table IDs, set up your HTTP node, and test once. No coding, just copy IDs.
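A hedged sketch of that single HTTP request, assuming NocoDB's v2 Data API for linked records; the table ID and link field ID placeholders are the values you copy from NocoDB, and the endpoint shape should be verified against your NocoDB version:

```ts
// Sketch: append one linked record via NocoDB's v2 Data API.
// Assumes Node 18+ and NOCODB_URL / NOCODB_TOKEN env variables.
async function linkRecord(recordId: number, newLinkId: number) {
  // <tableId> and <linkFieldId> are the IDs you copy/paste from NocoDB.
  const url = `${process.env.NOCODB_URL}/api/v2/tables/<tableId>/links/<linkFieldId>/records/${recordId}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "xc-token": process.env.NOCODB_TOKEN!, "Content-Type": "application/json" },
    // In the v2 API this appends the link; older APIs required sending the full
    // merged list, which is why the flow fetches current links first.
    body: JSON.stringify([{ Id: newLinkId }]),
  });
  return res.json();
}
```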
by Don Jayamaha Jr
📈 Get daily and on-demand Tesla (TSLA) trading signals via Telegram—powered by GPT-4.1 and real-time market data. This is the central AI supervisor that orchestrates seven sub-agents for technical analysis, price pattern recognition, and news sentiment. Reports are delivered in structured, Telegram-ready HTML (a delivery sketch follows this entry), optimized for traders seeking fast, intelligent decision-making signals.

⚠️ This master agent requires 7 connected sub-workflows to function. One of them, the News & Sentiment Agent, also requires a DeepSeek Chat API key for language processing.

🔌 Required Sub-Workflows

You must download and publish the following workflows:
Tesla Financial Market Data Analyst Tool
Tesla News and Sentiment Analyst Tool (requires DeepSeek Chat API key)
Tesla 15min Indicators Tool
Tesla 1hour Indicators Tool
Tesla 1day Indicators Tool
Tesla 1hour & 1day Klines Tool
Tesla Quant Technical Indicators Webhooks Tool (requires Alpha Vantage Premium API key)

📍 See all tools at: 🔗 https://n8n.io/creators/don-the-gem-dealer/

🔍 What This Agent Does

Listens to your trading query via Telegram
Calls the Financial Analyst and News & Sentiment Analyst
These agents aggregate: RSI, MACD, BBANDS, SMA, EMA, ADX; candlestick pattern + volume divergence analysis; news summaries and sentiment scoring via DeepSeek Chat
GPT-4.1 composes the final structured TSLA trade report with: spot and leverage setups, signal rationale, confidence score, sentiment tag, and news summary

🧠 Output Example

TSLA Trading Report (Daily Summary)
Spot Trade • Action: Buy • Entry: 172.45 • TP: 182.00 • SL: 169.80 • Signal: RSI bounce + Bullish Engulfing • Sentiment: Neutral
Leveraged Position • Position: Long • Leverage: 3x • TP: 186 • SL: 170 • Confidence: High (83/100)
📰 Top News • Tesla Model Y delivery surge – Electrek • Options market pricing in upside – Bloomberg • FSD delayed in Canada – TeslaNorth

🛠️ Setup Instructions

1. Import All 8 Workflows. Ensure all sub-agents above are published in your n8n instance.
2. Create Your Telegram Bot. Use @BotFather to generate the token and connect it to the trigger/send nodes.
3. Connect OpenAI GPT-4.1. Add your OpenAI credentials for GPT-4.1 in the designated node.
4. Add DeepSeek Chat API Key. Sign up at https://deepseek.com and insert your DeepSeek Chat credentials in the News Agent.
5. Add Alpha Vantage Premium API Key. Sign up at https://www.alphavantage.co/premium/ and use HTTP Header Auth for the webhook-based indicator fetchers.
6. Replace Telegram ID. Update the placeholder <<replace your ID here>> with your actual Telegram numeric ID in the auth node.

📌 Included Sticky Notes

✅ Telegram Bot Setup
✅ Agent Routing & Memory
✅ Financial vs. Sentiment Trigger Flow
✅ Report Formatting (HTML)
✅ API Requirements (GPT-4.1, DeepSeek, Alpha Vantage)
✅ Troubleshooting & Licensing

🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 For support: LinkedIn – Don Jayamaha

🚀 Deploy the Tesla Quant Trading AI system with GPT-4.1, DeepSeek Chat, and Alpha Vantage Premium—right into Telegram. All 8 workflows are required.

🎥 Tesla Quant AI Agent – Live Demo: Experience the power of the Tesla Quant Trading AI Agent in action.
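For reference, the final Telegram delivery amounts to one sendMessage call with HTML parse mode. A minimal sketch (the report string is illustrative; in the workflow the Telegram node handles this):

```ts
// Sketch: deliver the GPT-4.1 report as Telegram-ready HTML.
// Assumes Node 18+ and TELEGRAM_BOT_TOKEN / TELEGRAM_CHAT_ID env variables.
async function sendReport(reportHtml: string) {
  const token = process.env.TELEGRAM_BOT_TOKEN!;
  const res = await fetch(`https://api.telegram.org/bot${token}/sendMessage`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      chat_id: process.env.TELEGRAM_CHAT_ID, // your numeric Telegram ID from the auth node
      text: reportHtml,                      // e.g. "<b>TSLA Trading Report</b> ..."
      parse_mode: "HTML",                    // Telegram renders the tags the report uses
    }),
  });
  return res.json();
}

sendReport("<b>TSLA Trading Report</b>\nAction: Buy | Entry: 172.45").then(console.log);
```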