by Eduard
This workflow creates a documentation system for n8n instances using Docsify.js. It serves a dynamic documentation website that allows users to:

- View an overview of all workflows in a tabular format
- Filter workflows by tags
- Access automatically generated documentation for each workflow
- Edit documentation with a live Markdown preview
- Visualize workflow structures using Mermaid.js diagrams

> 📺 Check out the short 2-min demonstration on LinkedIn. Don't forget to connect!

## 🔧 Key Components

**Main Documentation Portal**
- Serves a Docsify-powered website
- Provides a navigation sidebar with workflow tags
- Displays workflow status, creation date, and documentation links

**Documentation Generator**
- Uses GPT model to auto-generate workflow descriptions
- Creates Mermaid.js diagrams of workflow structures (see the sketch below)
- Maintains consistent documentation format

**Live Editor**
- Split-screen Markdown editor with preview
- Real-time Mermaid diagram rendering
- Save/Cancel functionality

## ⚙️ Technical Details

**Environment Setup**
- Requires write access to the specified project directory
- Uses environment variables for n8n instance URL configuration
- Implements webhook endpoints for serving documentation

**⚠️ Security Considerations**

> Note: The current implementation doesn't include authentication for editing. Consider adding authentication for production use.

**Dependencies**
- Docsify.js for documentation rendering
- Mermaid.js for workflow visualization
- OpenAI GPT for documentation generation

## 🔍 Part of the n8n Observability Series

This workflow is part of a broader series focused on n8n instance observability. Check out these related workflows:

- Workflow Dashboard - Get comprehensive analytics of your n8n instance
- Visualize Your n8n Workflows with Mermaid.js - Create beautiful workflow visualizations

Each workflow in this series helps you better understand and manage your n8n automation ecosystem!
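The Mermaid diagram generation can be done directly from a workflow's JSON definition. Below is a minimal Code-node sketch of that idea, assuming the standard n8n workflow export shape (a `nodes` array plus a `connections` map); the template's actual generator logic may differ.

```javascript
// Minimal sketch: turn an n8n workflow export into a Mermaid flowchart.
// Assumes the standard export shape:
// { nodes: [{ name, type }], connections: { [name]: { main: [[{ node }]] } } }
function workflowToMermaid(workflow) {
  const lines = ['graph TD'];
  // Give each node a safe Mermaid id derived from its index.
  const ids = new Map(workflow.nodes.map((n, i) => [n.name, `n${i}`]));
  for (const node of workflow.nodes) {
    lines.push(`  ${ids.get(node.name)}["${node.name}"]`);
  }
  for (const [source, outputs] of Object.entries(workflow.connections ?? {})) {
    for (const branch of outputs.main ?? []) {
      for (const target of branch ?? []) {
        lines.push(`  ${ids.get(source)} --> ${ids.get(target.node)}`);
      }
    }
  }
  return lines.join('\n');
}

// Assumes the incoming item holds a workflow export under $json.
return [{ json: { mermaid: workflowToMermaid($json) } }];
```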
by Ranjan Dailata
## Who this is for?

The Structured Data Extract & Data Mining workflow is crafted for researchers, content analysts, SEO strategists, and AI developers who need to transform semi-structured web data (like markdown content or scraped HTML) into actionable structured datasets. It is ideal for:

- **Content Analysts** - Organizing and mining large volumes of markdown or HTML content.
- **SEO & Trend Researchers** - Exploring topics by location and category.
- **AI Engineers & NLP Developers** - Looking to automate insight extraction from unstructured inputs.
- **Growth Marketers** - Tracking topic-level trends for strategic campaigns.
- **Automation Specialists** - Streamlining workflows from scrape to storage.

## What problem is this workflow solving?

Extracting insights from markdown or HTML documents typically requires manual review, formatting, and parsing. This becomes unscalable when dealing with large datasets or when a real-time response is needed. Additionally, trend and topic extraction usually involves external tools, custom scripts, and inconsistent formatting. This workflow solves:

- Automatic text extraction from markdown or structured content.
- Location- and category-based trend mining with semantic grouping.
- AI-driven topic extraction and summarization.
- Real-time notification via webhook with rich structured payloads.
- Persistent storage of mined data to disk for audits or further processing.

## What this workflow does

1. Receives input: sets the URL for the data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from relevant sites.
3. A Markdown/Text Extractor node parses the content into clean plaintext.
4. The cleaned data is passed to Google Gemini to identify trends by location and category, extract key topics and themes, and format the response into structured JSON.
5. The structured insights are sent via Webhook Notification to external systems (e.g., Slack, web apps, Zapier).
6. The final output is saved to disk.

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

- **Update Source**: Update the workflow input to read from Google Sheet or Airbase for dynamically tracking multiple brands or topics.
- **Gemini Prompt Customization**:
  - Extract trends within a custom category (e.g., e-commerce design patterns in the US).
  - Output topics with popularity metrics.
  - Structure the output as per your database schema (e.g., `[{ topic, trend_score, location }]`) - see the sketch after this section.
- **Webhook Output**: Send notifications to Slack (with AI summaries in rich blocks), internal APIs (for use in dashboards), or Zapier/Make (for multi-step automation).
- **Persistence**: Save output to remote FTP or SFTP storage, Amazon S3, Google Cloud Storage, etc.
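For the `[{ topic, trend_score, location }]` shape mentioned above, a small Code node between Gemini and the webhook can normalize the model's response. A minimal sketch, assuming Gemini's structured output arrives under `$json.topics` (a hypothetical field name; match it to your Gemini node's actual output):

```javascript
// Minimal sketch: normalize Gemini's structured output into the target schema.
// The incoming field name ($json.topics) is an assumption for illustration.
const records = ($json.topics ?? []).map((t) => ({
  topic: String(t.topic ?? '').trim(),
  trend_score: Number(t.trend_score ?? 0),
  location: t.location ?? 'unknown',
}));

// Emit one n8n item per record, ready for the Webhook Notification node.
return records.map((r) => ({ json: r }));
```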
by Ranjan Dailata
## Who this is for

The Google Trend Data Extract & Summarization workflow is ideal for trend researchers, digital marketers, content strategists, and AI developers who want to automate the extraction, summarization, and distribution of Google Trends data. This end-to-end solution helps transform trend signals into human-readable insights and delivers them across multiple channels. It is built for:

- **Market Researchers** - Tracking trends by topic or region.
- **Content Strategists** - Identifying content opportunities from trending data.
- **SEO Analysts** - Monitoring search volume and shifts in keyword popularity.
- **Growth Hackers** - Reacting quickly to real-time search behavior.
- **AI & Automation Engineers** - Creating automated trend monitoring systems.

## What problem is this workflow solving?

Google Trends data can provide rich insights into user interests, but the raw data is not always structured or easily interpretable at scale. Manually extracting, cleaning, and summarizing trends from multiple regions or categories is time-consuming. This workflow solves the following problems:

- Automates the conversion of markdown or scraped HTML into clean textual input.
- Transforms unstructured data into a structured format ready for processing.
- Uses AI summarization to generate easy-to-read insights from Google Trends.
- Distributes summaries via email and webhook notifications.
- Persists responses to disk for archiving, auditing, or future analytics.

## What this workflow does

1. Receives input: sets a URL for the data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from the relevant site.
3. Markdown to Textual Data Extractor: converts markdown content into plaintext using n8n's Function or Markdown nodes (see the sketch after this section).
4. Structured Data Extract: parses the plaintext into structured JSON suitable for AI processing.
5. Summarize Google Trends: sends structured data to Google Gemini with a summarization prompt to extract key takeaways.
6. Send Summary via Gmail: composes an email with the AI-generated summary and sends it to a designated recipient.
7. Persist to Disk: writes the AI-structured data to disk.
8. Webhook Notification: sends the summarized response to an external system (e.g., Slack, Notion, Zapier) using a webhook.

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Obtain a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone for setting the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

- **Update Source**: Update the workflow input to read from Google Sheet or Airbase, etc.
- **Gemini Prompt Tuning**: Customize prompts to extract summaries like "Summarize the most significant trend shifts" or "Generate content ideas from the trending search topics".
- **Email Personalization**: Configure the Gmail node to use dynamic subject lines like "Weekly Google Trends Summary - {{date}}" and send to multiple stakeholders or mailing lists.
- **File Storage Customization**: Save with timestamps, e.g., trends_summary_2025-04-29.json; extend to S3 or cloud drive integrations.
- **Webhook Use Cases**: Send the summary to internal dashboards, Slack channels, or automation tools like Make, Zapier, etc.
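The "Markdown to Textual Data Extractor" step can be as simple as a regex-based Function/Code node. A rough, deliberately lossy sketch (not the template's exact node) is shown below; the `$json.markdown` input field name is an assumption:

```javascript
// Rough sketch: strip common markdown syntax to approximate plaintext.
function markdownToText(md) {
  return md
    .replace(/```[\s\S]*?```/g, '')           // fenced code blocks
    .replace(/!\[[^\]]*\]\([^)]*\)/g, '')     // images
    .replace(/\[([^\]]*)\]\([^)]*\)/g, '$1')  // links -> link text only
    .replace(/^#{1,6}\s*/gm, '')              // heading markers
    .replace(/[*_~`>]/g, '')                  // emphasis, quotes, inline code
    .replace(/\n{3,}/g, '\n\n')               // collapse extra blank lines
    .trim();
}

return [{ json: { text: markdownToText($json.markdown ?? '') } }];
```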
by Gavin
This template gives you the ability to monitor all uplinks in your Meraki Dashboard and alert your team through the method you prefer. This example posts a Teams notification to our Dispatch channel.

Setup will probably take around 30 minutes to 1 hour with the provided template. The most time-intensive steps are getting a Meraki API key (which I go over) and setting up the Teams node (which n8n has good documentation for). A rough sketch of the alert-filtering logic is shown below.

Tutorial & explanation: https://www.youtube.com/watch?v=JvaN0dNwRNU
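As an illustration of the alert logic, a Code node can reduce the uplink status response to just the problem devices before notifying Teams. This sketch assumes a response shape like Meraki's organization appliance uplink statuses (an array of devices, each with an `uplinks` list carrying an `interface` and `status`); adjust it to the endpoint you actually poll.

```javascript
// Rough sketch: keep only devices with at least one uplink that is not active.
// The response shape (serial, networkId, uplinks[].interface/status) is assumed.
const devices = $input.all().map((item) => item.json);

const problems = devices
  .filter((d) => (d.uplinks ?? []).some((u) => u.status !== 'active'))
  .map((d) => ({
    serial: d.serial,
    networkId: d.networkId,
    downUplinks: (d.uplinks ?? [])
      .filter((u) => u.status !== 'active')
      .map((u) => `${u.interface}: ${u.status}`),
  }));

// One item per problem device; feed these into the Teams notification node.
return problems.map((p) => ({ json: p }));
```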
by phil
# AI-Powered SEO Keyword Research Workflow with n8n

> Automates comprehensive keyword research for content creation.

## Table of Contents

- Introduction
- Workflow Architecture
- NocoDB Integration
- Data Flow
- Core Components
- Setup Requirements
- Possible Improvements

## Introduction

This n8n workflow automates SEO keyword research using AI and data-driven analytics. It combines OpenAI's language models with DataForSEO's analytics to generate comprehensive keyword strategies for content creation. The workflow is triggered by a webhook from NocoDB, processes the input data through multiple stages, and returns a detailed content brief with optimized keywords.

## Workflow Architecture

The workflow follows a structured process:

1. Input Collection: Receives data via webhook from NocoDB
2. Topic Expansion: Generates keywords using AI
3. Keyword Metrics Analysis: Gathers search volume, CPC, and difficulty metrics
4. Competitor Analysis: Analyzes competitor content for ranking keywords
5. Final Strategy Creation: Combines all data to generate a comprehensive keyword strategy
6. Output Storage: Saves results back to NocoDB and sends notifications

## NocoDB Integration

### Database Structure

The workflow integrates with two tables in NocoDB.

#### Input Table Schema

This table collects the input parameters for the keyword research:

| Field Name | Type | Description |
| --- | --- | --- |
| ID | Auto Number | Unique identifier |
| Primary Topic | Text | The main keyword/topic to research |
| Competitor URLs | Text | Comma-separated list of competitor websites |
| Target Audience | Single Select | Description of the target audience (Solopreneurs, Marketing Managers, etc.) |
| Content Type | Single Select | Type of content (Blog, Product page, etc.) |
| Location | Single Select | Target geographic location |
| Language | Single Select | Target language for keywords |
| Status | Single Select | Workflow status (Pending, Started, Done) |
| Start Research | Checkbox | Activates the workflow when set to true |

#### Output Table Schema

This table stores the generated keyword strategy:

| Field Name | Type | Description |
| --- | --- | --- |
| ID | Auto Number | Unique identifier |
| primary_topic_used | Text | The topic that was researched |
| report_content | Long Text | The complete keyword strategy in Markdown format |
| generatedAt | Datetime | Automatically generated by NocoDB |

### Webhook Settings

NocoDB Webhook Settings

## Data Flow

The workflow handles data in the following sequence:

1. Webhook Trigger: Receives input from NocoDB when a new keyword research request is created
2. Field Extraction: Extracts primary topic, competitor URLs, audience, and other parameters
3. AI Topic Expansion: Uses OpenAI to generate related keywords, categorized by type and intent
4. Keyword Analysis: Sends primary keywords to DataForSEO to get search volume, CPC, and difficulty
5. Competitor Research: Analyzes competitor pages to identify their keyword rankings
6. Strategy Generation: Combines all data to create a comprehensive keyword strategy
7. Storage & Notification: Saves the strategy to NocoDB and sends a notification to Slack

## Core Components

### 1. Topic Expansion

This component uses OpenAI and a structured output parser to generate (a sketch of the expected output shape appears at the end of this description):

- 20 primary keywords
- 30 long-tail keywords with search intent
- 15 question-based keywords
- 10 related topics

### 2. DataForSEO Integration

Two API endpoints are used:

- **Search Volume & CPC**: Gets monthly search volume and cost-per-click data
- **Keyword Difficulty**: Evaluates how difficult it would be to rank for each keyword

### 3. Competitor Analysis

This component:

- Analyzes competitor URLs to identify which keywords they rank for
- Identifies content gaps or opportunities
- Determines the search intent their content targets

### 4. Final Keyword Strategy

The AI-generated strategy includes:

- Top 10 primary keywords with metrics
- 15 long-tail opportunities with low competition
- 5 question-based keywords to address in content
- Content structure recommendations
- 3 potential content titles optimized for SEO

## Setup Requirements

To use this workflow, you'll need:

- n8n Instance: Either cloud or self-hosted
- NocoDB Account: For data input and storage
- API Keys: OpenAI API key, DataForSEO API credentials, Slack API token (for notifications)
- Database Setup: Create the required tables in NocoDB as described above

## Possible Improvements

The workflow could be enhanced with the following improvements:

- **Enhanced Keyword Strategy**: Add topic clustering to group related keywords; enhance the final output with more specific content structure suggestions; include word count recommendations for each content section.
- **Additional Data Sources**: Integrate Google Search Console data for existing content optimization; add Google Trends data to identify rising topics; include sentiment analysis for different keyword groups.
- **Improved Competitor Analysis**: Analyze content length and structure from top-ranking pages; identify common backlink sources for competitor content; extract content headings to better understand content organization.
- **Automation Enhancements**: Add scheduling capabilities to run updates on existing content; implement content performance tracking over time; create alert thresholds for changes in keyword difficulty or search volume.

## Example Output

Here is an example output the workflow generated based on the following inputs:

- Primary Topic: AI Automation
- Competitor URLs: n8n.io, zapier.com, make.com
- Target Audience: Small Business Owners
- Content Type: Landing Page
- Location: United States
- Language: English

Output: Final Keyword Strategy

The workflow provides a powerful automation for content marketers and SEO specialists to develop data-driven keyword strategies with minimal manual effort.

> Original Workflow: AI-Powered SEO Keyword Research Automation - The vibe Marketer
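To make the Topic Expansion output parseable, the structured output parser needs a fixed shape for the four keyword groups listed above. A minimal sketch of an example object you could hand to n8n's Structured Output Parser follows; the field names are illustrative assumptions, not the template's exact schema, and only the counts come from the component description:

```javascript
// Illustrative example output for the Topic Expansion parser.
// Field names are assumptions; counts follow the component description above.
const topicExpansionExample = {
  primaryKeywords: ['ai automation'],                // 20 strings in practice
  longTailKeywords: [
    { keyword: 'ai automation for small business', intent: 'commercial' },
  ],                                                 // 30 entries in practice
  questionKeywords: ['what is ai automation?'],      // 15 strings in practice
  relatedTopics: ['workflow orchestration'],         // 10 strings in practice
};

return [{ json: topicExpansionExample }];
```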
by Dr. Firas
# 👉 Build a Phone Agent to qualify outbound leads and schedule inbound calls

## Who is this for?

This workflow is designed for sales teams, call centers, and businesses handling both outbound and inbound lead calls who want to automate their qualification, follow-up, and call documentation process without manual intervention. It's ideal for teams using Google Sheets, RetellAI, OpenAI, and Gmail as part of their tech stack.

## Real-World Use Cases

- 🛍 E-commerce - Instantly handle product FAQs and order status checks, 24/7.
- 🏬 Retail Stores - Share store hours, directions, and return policies without lifting a finger.
- 🍽 Restaurants - Take reservations or answer menu questions automatically.
- 💼 Service Providers - Book appointments or consultations while you focus on your craft.
- 📞 Any Local Business - Deliver friendly, consistent phone support, no live agent required.

## What problem is this workflow solving?

Managing lead calls at scale can be chaotic: between scheduling outbound qualification calls, handling inbound appointment requests, and making sure every call is documented and followed up. This workflow automates the entire process, reducing human error and saving time by:

- ✅ Sending reminders to reps for outbound calls
- ✅ Automatically placing calls with RetellAI
- ✅ Handling inbound calls and checking caller details
- ✅ Generating and emailing call summaries automatically

## What this workflow does

This n8n template connects Google Sheets, RetellAI, OpenAI, and Gmail into a seamless workflow.

**Outbound Lead Qualification Workflow**
1. Triggers when a new lead is added to Google Sheets
2. Sends an SMS notification to remind the rep to call in 5 minutes
3. (Optional) Waits 5 minutes
4. Initiates an automated call to the lead via RetellAI

**Inbound Call Appointment Scheduler**
1. Receives inbound calls from RetellAI (via webhook)
2. Checks if the caller's number exists in Google Sheets (see the sketch after this section)
3. Responds to RetellAI with a success or error message

**Post-Call Workflow**
1. Receives post-call data from RetellAI
2. Filters only analyzed calls
3. Updates the lead's record in Google Sheets
4. Uses OpenAI to generate a call summary
5. Emails the summary to a team inbox or rep

## Setup

- ✅ You need an active RetellAI API key. Sign up for RetellAI, create an agent, and set the webhook URLs (n8n_call for call events). Purchase a Twilio phone number and link it to the agent.
- ✅ Your Google Sheet must have a column for phone numbers (e.g., "Phone")
- ✅ Gmail account connected and authorized in n8n
- ✅ OpenAI API key added to your environment variables or credentials

1. Configure your Google Sheets node with the correct spreadsheet ID and range
2. Add your RetellAI API key to the HTTP Request nodes
3. Connect your Gmail account in the Gmail node
4. Add your OpenAI key in the OpenAI node

👉 See full setup guide here: Notion Documentation

## How to customize this workflow to your needs

- **Change SMS content**: Edit the text in the "Send SMS reminder" node to match your team's tone
- **Modify call wait time**: Enable and adjust the "Wait 5 minutes" node to any delay you prefer
- **Add CRM integration**: Replace or extend the Google Sheets node to update your CRM instead of a spreadsheet
- **Customize call summary prompts**: Edit the prompt sent to OpenAI to change the summary style or add extra insights
- **Send email to different recipients**: Change the recipient address in the Gmail node or make it dynamic from the lead record

Need help customizing? Contact me for consulting and support: LinkedIn
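The inbound lookup step boils down to matching the caller's number against the sheet. A minimal Code-node sketch is below, assuming the Google Sheets node feeds in rows with a `Phone` column and the RetellAI webhook supplies the caller under `$json.from_number` (both field names are assumptions; check your actual payloads):

```javascript
// Minimal sketch: check whether the inbound caller exists in the lead sheet.
// `from_number`, the `Phone` column, and the 'Google Sheets' node name
// are assumptions for illustration.
const caller = String($json.from_number ?? '').replace(/[^\d+]/g, '');

const rows = $('Google Sheets').all().map((item) => item.json);
const match = rows.find(
  (row) => String(row.Phone ?? '').replace(/[^\d+]/g, '') === caller
);

// Respond to RetellAI with a success or error message.
return [{
  json: match
    ? { status: 'success', lead: match }
    : { status: 'error', message: 'Caller not found in sheet' },
}];
```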
by Solomon
Learn how to build an MCP Server and Client in n8n with official nodes.

> ⚠ Requires n8n version 1.88.0 or higher.

In this example, we use Google Calendar and custom functions as two separate MCP Servers, demonstrating how to integrate both native and custom tools.

## How it works

- The AI Agent connects to two MCP Servers.
- Each MCP Trigger (Server) generates a URL exposing its tools.
- This URL is used by an MCP Client linked to the AI Agent.

Whenever you make changes to the tools, there's no need to modify the MCP Client. It automatically keeps the AI Agent informed on how to use each tool, even if you change them over time. That's the power of MCP 🙌

## Who is this template for

Anyone looking to use MCP with their AI Agents.

## How to set up

Instructions are included within the workflow itself.

Check out my other templates 👉 https://n8n.io/creators/solomon/
by Immanuel
# Automated Raw Materials Inventory Management with Google Sheets, Supabase, and Gmail using n8n Webhooks

## Description

### What Problem Does This Solve? 🛠️

This workflow automates raw materials inventory management for businesses, eliminating manual stock updates, delayed material issue approvals, and missed low stock alerts. It ensures real-time stock tracking, streamlined approvals, and timely notifications.

Target audience: Small to medium-sized businesses, inventory managers, and n8n users familiar with Google Sheets, Supabase, and Gmail integrations.

### What Does It Do? 🌟

- Receives raw material data and issue requests via form submissions.
- Updates stock levels in Google Sheets and Supabase.
- Manages approvals for material issue requests with email notifications.
- Detects low stock levels and sends alerts via Gmail.
- Maintains data consistency across Google Sheets and Supabase.

### Key Features

- Real-time stock updates from form submissions.
- Automated approval process for material issuance.
- Low stock detection with Gmail notifications.
- Dual storage in Google Sheets and Supabase for redundancy.
- Error handling for robust data validation.

## Setup Instructions

### Prerequisites

- **n8n Instance**: Self-hosted or cloud n8n instance.
- **API Credentials**:
  - Google Sheets API: Credentials from Google Cloud Console with Sheets scope, stored in n8n credentials.
  - Supabase API: API key and URL from your Supabase project, stored in n8n credentials (do not hardcode in nodes).
  - Gmail API: Credentials from Google Cloud Console with Gmail scope.
- **Forms**: A form (e.g., Google Form) to submit raw material receipts and issue requests, configured to send data to n8n webhooks.

### Installation Steps

1. **Import the Workflow**: Copy the workflow JSON from the "Template Code" section (to be provided) and import it into n8n via "Import from File" or "Import from URL".
2. **Configure Credentials**: Add API credentials in n8n's Credentials section for Google Sheets, Supabase, and Gmail, then assign them to the respective nodes. For example: in the Append Raw Materials node, use Google Sheets credentials ({{ $credentials.GoogleSheets }}); in the Current Stock Update node, use Supabase credentials ({{ $credentials.Supabase }}); in the Send Low Stock Email Alert node, use Gmail credentials.
3. **Set Up Nodes**:
   - Webhook nodes (Receive Raw Materials Webhook, Receive Material Issue Webhook): configure webhook URLs and link them to your form submissions.
   - Approval email (Send Approval Request): customize the HTML email template if needed.
   - Low stock alerts (Send Low Stock Email Alert, Send Low Stock Email After Issue): configure recipient email addresses.
4. **Test the Workflow**: Submit a test form for a raw material receipt and verify stock updates in Google Sheets/Supabase. Then submit a material issue request, approve/reject it, and confirm stock updates and notifications.

## How It Works

### High-Level Steps

1. Receive Raw Materials: Processes form submissions for raw material receipts.
2. Update Stock: Updates stock levels in Google Sheets and Supabase.
3. Handle Issue Requests: Processes material issue requests via forms.
4. Manage Approvals: Sends approval requests and processes decisions.
5. Monitor Stock Levels: Detects low stock and sends Gmail alerts.

Detailed node descriptions are available in the sticky notes within the workflow screenshot (to be provided). Below is a summary of key actions; a sketch of the core stock arithmetic appears at the end of this description.

### Raw Materials Receiving and Stock Update

- **Receive Raw Materials Webhook**: Receives raw material data from a form submission.
- **Standardize Raw Material Data**: Maps form data into a consistent format.
- **Calculate Total Price**: Computes Total Price (Quantity Received * Unit Price).
- **Append Raw Materials**: Records the receipt in Google Sheets.
- **Check Quantity Received Validity**: Ensures Quantity Received is valid.
- **Lookup Existing Stock**: Retrieves current stock for the Product ID.
- **Check If Product Exists**: Branches based on Product ID existence.
- **Calculate Updated Current Stock**: Adds Quantity Received to stock (True branch).
- **Update Current Stock**: Updates stock in Google Sheets (True branch).
- **Retrieve Updated Stock for Check**: Retrieves updated stock for the low stock check.
- **Detect Low Stock Level**: Flags if stock is below the minimum.
- **Trigger Low Stock Alert**: Triggers an email if stock is low.
- **Send Low Stock Email Alert**: Sends a low stock alert via Gmail.
- **Add New Product to Stock**: Adds a new product to stock (False branch).
- **Current Stock Update**: Updates the Supabase Current Stock table.
- **New Row Current Stock**: Inserts the new product into Supabase.
- **Search Current Stock**: Retrieves Supabase stock records.
- **New Record Raw**: Inserts the raw material record into Supabase.
- **Format Response**: Removes duplicates from the Supabase response.
- **Combine Stock Update Branches**: Merges branches for existing/new products.

### Material Issue Request and Approval

- **Receive Material Issue Webhook**: Receives the issue request from a form submission.
- **Standardize Data**: Normalizes request data and adds an Approval Link.
- **Validate Issue Request Data**: Ensures Quantity Requested is valid.
- **Verify Requested Quantity**: Validates Product ID and Submission ID.
- **Append Material Request**: Records the request in Google Sheets.
- **Check Available Stock for Issue**: Retrieves current stock for the request.
- **Prepare Approval**: Checks stock sufficiency for the request.
- **Send Approval Request**: Emails the approver with Approve/Reject options.
- **Receive Approval Response**: Captures the approver's decision via webhook.
- **Format Approval Response**: Processes approval data with the Approval Date.
- **Verify Approval Data**: Validates the approval response.
- **Retrieve Issue Request Details**: Retrieves the original request from Google Sheets.
- **Process Approval Decision**: Branches based on the approval action.
- **Get Stock for Issue Update**: Retrieves stock before the update (Approved).
- **Deduct Issued Stock**: Reduces stock by the Approved Quantity (Approved).
- **Update Stock After Issue**: Updates stock in Google Sheets (Approved).
- **Retrieve Stock After Issue**: Retrieves updated stock for the low stock check.
- **Detect Low Stock After Issue**: Flags low stock after issuance.
- **Trigger Low Stock Alert After Issue**: Triggers an email if stock is low.
- **Send Low Stock Email After Issue**: Sends a low stock alert via Gmail.
- **Update Issue Request Status**: Updates the request status (Approved/Rejected).
- **Combine Stock Lookup Results**: Merges stock lookup branches.
- **Create Record Issue**: Inserts the issue request into Supabase.
- **Search Stock by Product ID**: Retrieves Supabase stock records.
- **Issues Table Update**: Updates the Supabase Materials Issued table.
- **Update Current Stock**: Updates Supabase stock after issuance.
- **Combine Issue Lookup Branches**: Merges issue lookup branches.
- **Search Issue by Submission ID**: Retrieves Supabase issue records.

## Customization Tips

- **Expand Storage Options**: Add nodes to store data in other databases (e.g., Airtable) alongside Google Sheets and Supabase.
- **Modify Approval Email**: Update the Send Approval Request node to customize the HTML email template (e.g., adjust styling or add branding).
- **Alternative Notifications**: Add nodes to send low stock alerts via other platforms (e.g., Slack or Telegram).
- **Adjust Low Stock Threshold**: Modify the Detect Low Stock Level node to change the Minimum Stock Level (default: 50).
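Two of the calculations above are simple enough to show inline. A minimal sketch of the total-price and low-stock logic, assuming the field names used in the node descriptions (`Quantity Received`, `Unit Price`, `Current Stock`, `Minimum Stock Level`):

```javascript
// Minimal sketch of the arithmetic behind Calculate Total Price and
// Detect Low Stock Level. Field names mirror the node descriptions above.
const qtyReceived = Number($json['Quantity Received'] ?? 0);
const unitPrice = Number($json['Unit Price'] ?? 0);
const currentStock = Number($json['Current Stock'] ?? 0);
const minStock = Number($json['Minimum Stock Level'] ?? 50); // default: 50

return [{
  json: {
    ...$json,
    'Total Price': qtyReceived * unitPrice,
    'Updated Stock': currentStock + qtyReceived,
    lowStock: currentStock + qtyReceived < minStock, // drives the Gmail alert
  },
}];
```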
by Don Jayamaha Jr
📈 Get daily and on-demand Tesla (TSLA) trading signals via Telegram, powered by GPT-4.1 and real-time market data. This is the central AI supervisor that orchestrates seven sub-agents for technical analysis, price pattern recognition, and news sentiment. Reports are delivered in structured Telegram-ready HTML, optimized for traders seeking fast, intelligent decision-making signals.

> ⚠️ This master agent requires 7 connected sub-workflows to function. One of them, the News & Sentiment Agent, also requires a DeepSeek Chat API key for language processing.

## 🔌 Required Sub-Workflows

You must download and publish the following workflows:

1. Tesla Financial Market Data Analyst Tool
2. Tesla News and Sentiment Analyst Tool (requires a DeepSeek Chat API key)
3. Tesla 15min Indicators Tool
4. Tesla 1hour Indicators Tool
5. Tesla 1day Indicators Tool
6. Tesla 1hour & 1day Klines Tool
7. Tesla Quant Technical Indicators Webhooks Tool (requires an Alpha Vantage Premium API key)

📍 See all tools at: 🔗 https://n8n.io/creators/don-the-gem-dealer/

## 🔍 What This Agent Does

- Listens to your trading query via Telegram
- Calls the Financial Analyst and News & Sentiment Analyst; these agents aggregate RSI, MACD, BBANDS, SMA, EMA, ADX; candlestick pattern + volume divergence analysis; and news summaries with sentiment scoring via DeepSeek Chat
- GPT-4.1 composes the final structured TSLA trade report with spot and leverage setups, signal rationale, confidence score, sentiment tag, and a news summary (see the formatting sketch after this description)

## 🧠 Output Example

TSLA Trading Report (Daily Summary)

Spot Trade
• Action: Buy
• Entry: 172.45
• TP: 182.00
• SL: 169.80
• Signal: RSI bounce + Bullish Engulfing
• Sentiment: Neutral

Leveraged Position
• Position: Long
• Leverage: 3x
• TP: 186
• SL: 170
• Confidence: High (83/100)

📰 Top News
• Tesla Model Y delivery surge - Electrek
• Options market pricing in upside - Bloomberg
• FSD delayed in Canada - TeslaNorth

## 🛠️ Setup Instructions

1. Import All 8 Workflows: Ensure all sub-agents above are published in your n8n instance.
2. Create Your Telegram Bot: Use @BotFather to generate the token and connect it to the trigger/send nodes.
3. Connect OpenAI GPT-4.1: Add your OpenAI credentials for GPT-4.1 in the designated node.
4. Add a DeepSeek Chat API Key: Sign up at https://deepseek.com and insert your DeepSeek Chat credentials in the News Agent.
5. Add an Alpha Vantage Premium API Key: Sign up at https://www.alphavantage.co/premium/ and use HTTP Header Auth for the webhook-based indicator fetchers.
6. Replace Telegram ID: Update the placeholder <<replace your ID here>> with your actual Telegram numeric ID in the auth node.

## 📌 Included Sticky Notes

- ✅ Telegram Bot Setup
- ✅ Agent Routing & Memory
- ✅ Financial vs. Sentiment Trigger Flow
- ✅ Report Formatting (HTML)
- ✅ API Requirements (GPT-4.1, DeepSeek, Alpha Vantage)
- ✅ Troubleshooting & Licensing

## 🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.

🔗 For support: LinkedIn - Don Jayamaha

🚀 Deploy the Tesla Quant Trading AI system with GPT-4.1, DeepSeek Chat, and Alpha Vantage Premium, right into Telegram. All 8 workflows are required.

🎥 Tesla Quant AI Agent - Live Demo: experience the power of the Tesla Quant Trading AI Agent in action.
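Telegram's HTML parse mode supports only a small tag set (b, i, u, s, code, pre, a), so the report builder has to stick to those. A rough sketch of composing the summary shown above follows; every field on the input object is an assumption based on the sample report, not the template's actual schema:

```javascript
// Rough sketch: build a Telegram-ready HTML report from the agents' output.
// All fields on `r` are assumptions derived from the sample report above.
const r = $json;

const html = [
  '<b>TSLA Trading Report (Daily Summary)</b>',
  '',
  '<b>Spot Trade</b>',
  `• Action: ${r.action}`,
  `• Entry: ${r.entry} | TP: ${r.takeProfit} | SL: ${r.stopLoss}`,
  `• Signal: ${r.signal}`,
  `• Sentiment: ${r.sentiment}`,
  '',
  `<b>Confidence:</b> ${r.confidence}/100`,
].join('\n');

// Send this text with parse_mode set to HTML in the Telegram node.
return [{ json: { text: html, parse_mode: 'HTML' } }];
```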
by Don Jayamaha Jr
📊 This AI sub-agent aggregates Tesla (TSLA) trading signals across multiple timeframes using real-time technical indicators and candlestick behavior. It is a core component of the Tesla Quant Trading AI system. Powered by GPT-4.1, it consolidates 15-minute, 1-hour, and 1-day indicators, adds candlestick pattern data, and produces a unified JSON signal for downstream use by the master agent.

> ⚠️ This agent is not standalone. It is triggered by the Tesla Quant Trading AI Agent via Execute Workflow.
> 🧠 Requires: 4 connected sub-agents and an Alpha Vantage Premium API key.

## 🔌 Required Sub-Workflows

To use this workflow, you must install:

1. Tesla 15min Indicators Tool
2. Tesla 1hour Indicators Tool
3. Tesla 1day Indicators Tool
4. Tesla 1hour and 1day Klines Tool
5. Tesla Quant Technical Indicators Webhooks Tool (provides Alpha Vantage data)

## 🧠 What This Agent Does

- Fetches pre-cleaned 20-point JSON outputs from the 4 sub-agents listed above
- Analyzes each timeframe individually:
  - 15m: momentum and short-term setups
  - 1h: confirmation of emerging trends
  - 1d: macro positioning and trend alignment
  - Klines: candlestick reversal patterns and volume divergence
- Generates a structured final signal in JSON with:
  - Trading stance: Buy, Sell, Hold, or Cautious
  - Confidence score (0.0–1.0)
  - Multi-timeframe indicator breakdown
  - Candlestick and volume divergence annotations

## 📋 Sample Output

```json
{
  "summary": "TSLA momentum is weakening short-term. 1h MACD shows bearish crossover, RSI declining. 1d candles confirm potential reversal setup.",
  "signal": "Cautious Sell",
  "confidence": 0.81,
  "multiTimeframeInsights": {
    "15m": { "RSI": 68.3, "MACD": { "macd": 0.53, "signal": 0.61 }, ... },
    "1h": { "RSI": 65.0, "MACD": { "macd": -0.32, "signal": 0.11 }, ... },
    "1d": { "BBANDS": { ... }, ... },
    "candlestickPatterns": { "1h": "Doji", "1d": "Bearish Engulfing" },
    "volumeDivergence": { "1h": "Bearish", "1d": "Neutral" }
  }
}
```

## 🛠️ Setup Instructions

1. Import this workflow into n8n and name it: Tesla_Financial_Market_Data_Analyst_Tool
2. Add the required API credentials: Alpha Vantage Premium (via HTTP Query Auth) and OpenAI GPT-4.1 for reasoning and synthesis
3. Link the required sub-agents: connect the 4 tool workflows listed above to their respective Tool Workflow nodes, and connect the webhook provider for data fetches
4. Set up as a sub-agent: this workflow must be triggered using Execute Workflow from the parent agent, passing in message (optional context) and sessionId (used for memory continuity)

## 🧾 Sticky Notes Provided

- 📘 Tesla Financial Market Data Analyst - Core logic overview
- 📈 15m / 1h / 1d Tool Notes - Indicator lists + use cases
- 🕯️ Klines Tool Note - Candlestick and volume divergence patterns
- 🧠 GPT Reasoning Note - GPT-4.1 handles final synthesis
- 🧩 Sub-Workflow Trigger - Proper integration with the parent agent
- 🧠 Memory Buffer - Maintains session context across evaluations

## 🔒 Licensing & Support

© 2025 Treasurium Capital Limited Company. The logic, prompt design, and multi-agent architecture are proprietary and IP-protected.

For support or collaboration inquiries:
🔗 Don Jayamaha - LinkedIn
🔗 n8n Creator Profile

🚀 Unify your Tesla trading logic across timeframes: automated, AI-powered, and built for scalpers and swing traders.
by Marian Tcaciuc
# Manage Calendar with Voice & Text Commands using GPT-4, Telegram & Google Calendar

This n8n workflow transforms your Telegram bot into a personal AI calendar assistant, capable of understanding both voice and text commands in Romanian, and managing your Google Calendar using the GPT-4 model via LangChain. Whether you want to create, update, fetch, or delete events, you can simply speak or write your request to your Telegram bot, and the assistant takes care of the rest.

## 🚀 Features

- Voice command support using Telegram voice messages (.ogg)
- Transcription using OpenAI Whisper
- Natural language understanding with GPT-4 via LangChain
- Google Calendar integration: ✅ Create Events, 🔁 Update Events, ❌ Delete Events, 📅 Fetch Events
- Responses sent back via Telegram

## 🛠️ Step-by-Step Setup Instructions

### 1. Create a Telegram Bot

- Go to @BotFather on Telegram.
- Send /newbot and follow the instructions.
- Save the Bot Token.

### 2. Configure the Telegram Trigger Node

- Paste the Telegram token into the Telegram Trigger and Telegram nodes.
- Set updates to ["message"].

### 3. Set up OpenAI Credentials

- Get an OpenAI API key from https://platform.openai.com
- Create a credential in n8n for OpenAI. This is used for both transcription and AI reasoning.

### 4. Set up Google Calendar

- In Google Cloud Console: enable the Google Calendar API, set up OAuth2 credentials, and add your n8n redirect URI (usually https://yourdomain/rest/oauth2-credential/callback).
- Create a credential in n8n using Google Calendar OAuth2.
- Grant access to your calendar (e.g., "Family" calendar).

## ⚙️ Customization Options

- 🗣️ **Change Language or Locale**: The transcription node uses "en" for English. Change to another locale if needed.
- ✏️ **Edit Prompt**: You can modify the prompt in the AI Agent node to include your name, work schedule, or specific behavior expectations.
- 📆 **Change Calendar Logic**: Adjust time ranges or filters in the Get Events node, and add custom logic before Create Event (e.g., validation or conflict checks; see the sketch after this section).

## 📚 Helpful Tips

- Make sure n8n has HTTPS enabled to receive Telegram updates.
- You can test the flow first using only text, then voice.
- Use AI memory or vector stores (like Supabase) if you want context-aware planning in the future.
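If you add validation before Create Event, a small Code node can check the fetched events for overlaps first. A minimal sketch, assuming the Get Events node returns Google Calendar items with `start.dateTime`/`end.dateTime` and the new event's bounds arrive as `$json.start`/`$json.end` (the input field names and the 'Get Events' node name are assumptions):

```javascript
// Minimal sketch: reject the new event if it overlaps an existing one.
// Input field names and the 'Get Events' node name are assumptions.
const newStart = new Date($json.start);
const newEnd = new Date($json.end);

const events = $('Get Events').all().map((item) => item.json);
const conflict = events.find((e) => {
  const start = new Date(e.start?.dateTime ?? e.start?.date);
  const end = new Date(e.end?.dateTime ?? e.end?.date);
  return newStart < end && newEnd > start; // standard interval-overlap test
});

return [{
  json: conflict
    ? { ok: false, reason: `Conflicts with "${conflict.summary}"` }
    : { ok: true },
}];
```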
by Belgacem Dhiflaoui
## Description

### What Problem Does This Solve? 🛠️

This workflow automates the process of extracting key information from resumes received as email attachments and storing that data in a structured format within a Supabase database. It eliminates the manual effort of reviewing each resume, identifying relevant details, and entering them into a database. This streamlines the hiring process, making it faster and more efficient for recruiters and HR professionals.

Target audience: Recruiters, HR departments, and talent acquisition teams.

### What Does It Do? 🌟

- Monitors a designated email inbox for new messages with resume attachments.
- Extracts key information such as name, contact details, education, work experience, and skills from the attached resumes.
- Cleans and formats the extracted data.
- Stores the processed data securely in a Supabase database.

### Key Features 📋

- Automatic email monitoring for resume attachments.
- Intelligent data extraction from various resume formats (e.g., PDF, DOC, DOCX).
- Customizable data fields to capture specific information.
- Seamless integration with Supabase for data storage.
- Uses OpenRouter to streamline API key management for services such as AI-powered parsing.

## Setup Instructions

### Prerequisites ⚙️

- **n8n Instance**: Self-hosted or cloud instance of n8n.
- **Email Account**: Gmail account with Gmail API access for receiving resumes.
- **Supabase Account**: A Supabase project with a database/table ready to store extracted resume data. You'll need the Supabase URL and API key.
- **OpenRouter Account**: For managing AI model API keys centrally when using LLM-based resume parsing.

### Installation Steps 📦

1. **Import the Workflow**:
   - Copy the exported workflow JSON.
   - Import it into your n8n instance via "Import from File" or "Import from URL".
2. **Configure Credentials**:
   - In n8n > Credentials, add credentials for:
     - Email account (Gmail API): provide the Client ID and Client Secret from the Google Cloud Platform.
     - Supabase: provide the Supabase URL and the anon public API key.
     - OpenRouter (optional): add your OpenRouter API key for use with any AI-powered resume parsing nodes.
   - Assign these credentials to their respective nodes: Gmail Trigger → email credentials; Supabase Insert → Supabase credentials; AI parsing node → OpenRouter credentials.
3. **Set Up the Supabase Table**:
   - Create a table in Supabase with columns such as: name, email, phone, education, experience, skills, received_date, etc.
   - Make sure the field names align with the structure used in your workflow (see the sketch after these instructions).
4. **Customize Nodes**:
   - Parsing node(s): modify the workflow to use an OpenAI model directly for field extraction, replacing the Basic LLM Chain node that utilizes OpenRouter.
5. **Test the Workflow**:
   - Send a test email with a resume attachment.
   - Check n8n's execution log to confirm the workflow triggered, parsed the data, and inserted it into Supabase.
   - Verify data integrity in your Supabase table.

## How It Works

### High-Level Workflow 🔍

1. Email Monitoring: Triggered when a new email with an attachment is received (via the Gmail API).
2. Attachment Check: Verifies the email contains at least one attachment.
3. Prepare Data: Extracts the attachment and prepares it for analysis.
4. Data Extraction: Uses an OpenRouter-powered LLM (if configured) to extract structured information from the resume.
5. Data Storage: The structured information is saved into the Supabase database.

### Node Names and Actions (Example)

- **Gmail Trigger**: Triggers when a new email is received.
- **IF**: Checks whether the received email includes any attachments.
- **Get Attachments**: Retrieves attachments from the triggering email.
- **Prepare Data**: Prepares the attachment content for processing.
- **Basic LLM Chain**: Uses an AI model via OpenRouter to extract relevant resume data and returns it as structured fields.
- **Supabase-Insert**: Inserts the structured resume data into your Supabase database.
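Before the Supabase insert, the LLM chain's output has to line up with the table columns listed in the setup. A minimal sketch of that mapping, assuming the chain returns its parsed result under `$json.output` (a common Basic LLM Chain shape, but verify against your node's actual output):

```javascript
// Minimal sketch: shape the extracted resume fields to match the Supabase
// table columns (name, email, phone, education, experience, skills,
// received_date). The $json.output field name is an assumption.
const parsed = typeof $json.output === 'string'
  ? JSON.parse($json.output) // the chain may return a JSON string
  : ($json.output ?? {});

return [{
  json: {
    name: parsed.name ?? null,
    email: parsed.email ?? null,
    phone: parsed.phone ?? null,
    education: parsed.education ?? null,
    experience: parsed.experience ?? null,
    skills: Array.isArray(parsed.skills)
      ? parsed.skills.join(', ')
      : parsed.skills ?? null,
    received_date: new Date().toISOString(),
  },
}];
```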