by Davi Saranszky Mesquita
Log errors and avoid sending too many emails

Use case

Most of the time, it's necessary to log all errors that occur. However, in some cases a scheduled task or a service consuming excessive resources might trigger a surge of errors. To address this, we can log all errors but limit alerts to a maximum of one notification every 5 minutes.

What this workflow does

This workflow can be configured to receive error events, or you can integrate it before your own error-handling logic. If used as the primary error handler, note that this flow will only add a database log entry and take no further action. You'll need to add your own alerts (e.g., email or push notifications). Below is an example of a notification setup I prefer to use.

At the end, there's an error cleanup option. This feature is particularly useful in development environments.

If you already have an error-handling workflow, you can call this one as a sub-workflow. Its final steps include cleanup logic to reset the execution state and terminate the workflow.

Setup

Verify all Postgres nodes and credentials when using the 'Error Handling Sample'.

How to adjust it to your needs

1) You can set this workflow as a sub-workflow within your existing error-handling setup.
2) Alternatively, you can add the "Error Handling Sample" at the end of this workflow, which sends email and push notifications.

Configuration Requirements

⚠️ You must create a database table for this to work! DDL of this sample:

```sql
create table p1gq6ljdsam3x1m."N8Err"
(
    id         serial primary key,
    created_at timestamp,
    updated_at timestamp,
    created_by varchar,
    updated_by varchar,
    nc_order   numeric,
    title      text,
    "URL"      text,
    "Stack"    text,
    json       json,
    "Message"  text,
    "LastNode" text
);

alter table p1gq6ljdsam3x1m."N8Err" owner to postgres;

create index "N8Err_order_idx" on p1gq6ljdsam3x1m."N8Err" (nc_order);
```

by Davi Saranszky Mesquita
https://www.linkedin.com/in/mesquitadavi/
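To make the 5-minute throttle concrete, here is a minimal sketch of the gate logic as it might run in an n8n Code node (TypeScript-style). The `lastAlertAt` input is an assumption for illustration; in the workflow it would come from a Postgres node selecting the most recent alert row in "N8Err".

```typescript
// Hypothetical helper: decide whether a new alert may be sent.
// lastAlertAt is the created_at of the most recent alert row (or null if none).
const ALERT_WINDOW_MS = 5 * 60 * 1000; // 5 minutes

function shouldSendAlert(lastAlertAt: Date | null, now: Date = new Date()): boolean {
  if (lastAlertAt === null) return true; // no alert sent yet: send one
  return now.getTime() - lastAlertAt.getTime() >= ALERT_WINDOW_MS;
}

// Example: last alert went out 2 minutes ago, so this one is suppressed.
console.log(shouldSendAlert(new Date(Date.now() - 2 * 60 * 1000))); // false
```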
by Krishna Kumar Eswaran
🧠 Problem This Solves

For developers and creators, consistently posting quality content on LinkedIn can be time-consuming. This workflow automates the process by:

- Fetching the latest Dev.to articles
- Posting them to LinkedIn twice daily
- Preventing duplicates using Airtable
- Sending success alerts to Telegram

This ensures you're always active on LinkedIn, with zero manual effort.

👥 Who This Template Is For

- Developers who want to build their presence on LinkedIn
- Tech creators or solo founders looking to grow an audience
- Community/page managers who want regular, curated content
- Busy professionals aiming for consistent LinkedIn engagement without doing it manually

⚙️ Workflow Breakdown

This automation runs twice a day (9:00 AM and 7:00 PM) and performs the following steps:

1. Fetches Dev.to articles based on a tag
2. Checks Airtable to avoid reposting the same article
3. Posts to LinkedIn if it's new
4. Sends a Telegram message after posting successfully

🧩 Step-by-Step Setup Instructions

✅ 1. Airtable Configuration

Create a new base in Airtable with just one table and one column:

- Table Name: PostedArticles
- Column: ArticleID (Single line text – stores the unique ID of each Dev.to article posted)

This column is used to track posted articles and prevent duplicates.

🔗 2. Dev.to API Setup

Use the following endpoint in the HTTP Request node:

https://dev.to/api/articles?tag=YOUR_TAG_HERE&per_page=10

Replace YOUR_TAG_HERE with a tag like android, webdev, ai, etc.

💬 3. Telegram Bot Setup

- Open @BotFather in Telegram and create a new bot
- Save the bot token
- Get your chat ID using @userinfobot or via the Telegram API
- Add a Telegram node in n8n using this token and chat ID

This will notify you when a post is successfully published.

🧾 4. LinkedIn Setup

- Create a LinkedIn Developer App
- Use OAuth2 to connect it in n8n
- Choose to post on either a user profile or a company page

🧱 5. n8n Workflow Structure

Here's the basic structure of the workflow:

1. Cron Node – Triggers at 9:00 AM and 7:00 PM daily
2. HTTP Request – Fetches latest articles from Dev.to
3. Airtable Search – Checks if ArticleID already exists
4. IF Node – Filters new vs. already-posted articles
5. LinkedIn Post – Publishes new article
6. Airtable Create – Saves the new ArticleID
7. Telegram Message – Sends success confirmation

🛠️ Customization Tips

- Change the Dev.to tag in the API URL
- Modify LinkedIn post format (add hashtags, emojis, personal notes)
- Adjust posting times in the Cron node
- Use additional filters (e.g., only post articles with a cover image or certain word count)
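To illustrate the fetch-and-dedup portion of the structure above (HTTP Request, Airtable Search, and IF node), here is a minimal TypeScript sketch. The `postedIds` set stands in for the ArticleID column in Airtable, and the tag is a placeholder:

```typescript
// Sketch: fetch the latest Dev.to articles for a tag and keep only unposted ones.
// postedIds stands in for the Airtable ArticleID lookup (assumption for illustration).
async function getNewArticles(tag: string, postedIds: Set<number>) {
  const res = await fetch(`https://dev.to/api/articles?tag=${tag}&per_page=10`);
  if (!res.ok) throw new Error(`Dev.to API error: ${res.status}`);
  const articles: Array<{ id: number; title: string; url: string }> = await res.json();
  // Equivalent of the Airtable Search + IF node: drop anything already posted.
  return articles.filter((a) => !postedIds.has(a.id));
}

// Usage: getNewArticles("webdev", new Set([123, 456])).then(console.log);
```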
by JPres
👥 Who Is This For?

Sales and marketing teams seeking efficient, hands‑free generation of personalized slide decks for each prospect from CSV lead lists.

🛠 What Problem Does This Solve?

Manually editing presentation decks for large lead lists is slow and error‑prone. This workflow fully automates:

- Importing and parsing CSV lead data
- Logging leads and outputs in Google Sheets
- Duplicating a master Slides template per lead
- Injecting lead‑specific variables into slides

🔄 Node‑by‑Node Breakdown

| Step | Node | Purpose |
| ---- | ---- | ------- |
| 1 | New Leads Arrived | Detect new CSV uploads in Drive |
| 2 | File Type? | Filter for .csv files only |
| 3 | Download by ID | Download the CSV content |
| 4 | Create new Sheet | Create a Google Sheet to record lead data |
| 5 | Combine Empty New Document with CSV Data | Structure each lead record for slide creation |
| 6 | Merge Data for new Lead Document | Map template placeholders to lead values |
| 7 | Get all Leads | Retrieve sheet rows to iterate through each lead |
| 8 | MoveToLeadListFolder | Move processed CSV to an archive folder |
| 9 | Copy Slides Template | Make a copy of the master Slides deck |
| 10 | Create Custom Presentation | Replace placeholders in the copied deck with lead data |
| 11 | Add Presentation ID to Lead | Write the generated presentation URL back into the Sheet |

⚙️ Pre‑conditions / Requirements

- n8n with Google Drive, Sheets, and Slides credentials
- A master Google Slides deck with placeholder tokens (e.g. {{Name}}, {{Company}})
- A Drive folder for incoming CSV lead files

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure the New Leads Arrived node to watch your CSV folder.
3. Enter your Google credentials in the Drive, Sheets, and Slides nodes.
4. Specify the master Slides template ID in the Copy Slides Template node.
5. In Create Custom Presentation, map slide tokens to sheet column names.
6. Disable "Keep Binary Data" in Copy Slides Template to conserve memory.
7. Upload a sample CSV (with headers like Name, Company, Metric) to test.

🎨 How to Customize

- Add or remove variables by editing the CSV headers and updating the mapping in Merge Data for new Lead Document.
- Insert an AI/natural‑language node before slide creation to generate more advanced and personalized text blocks.
- Use SplitInBatches to throttle API calls and avoid rate‑limit errors.
- Add error‑handling branches to capture and log failed operations.

🔐 Security and Privacy

- The workflow uses placeholder variables for file and folder IDs, so no actual IDs are exposed in the template.
- Ensure OAuth scopes are limited to only the required Google APIs.
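For a sense of what the Create Custom Presentation step does under the hood, here is a hedged sketch that turns one lead record into Google Slides replaceAllText requests. The lead fields are illustrative, and the request shape follows the public Slides batchUpdate API; the node may assemble this differently:

```typescript
// Sketch: turn one lead record into a Google Slides batchUpdate payload that
// swaps {{Token}} placeholders for lead values. Field names are illustrative.
type Lead = Record<string, string>;

function buildReplaceRequests(lead: Lead) {
  return Object.entries(lead).map(([field, value]) => ({
    replaceAllText: {
      containsText: { text: `{{${field}}}`, matchCase: true },
      replaceText: value,
    },
  }));
}

// Usage: POST { requests } to
// https://slides.googleapis.com/v1/presentations/{presentationId}:batchUpdate
const requests = buildReplaceRequests({ Name: "Ada Lovelace", Company: "Acme Corp" });
console.log(JSON.stringify({ requests }, null, 2));
```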
by JPres
👥 Who Is This For?

Content creators, marketing teams, and channel managers who want a simple, hands‑off solution to upload videos and automatically generate optimized metadata from video transcripts.

🛠 What Problem Does This Solve?

Manual video uploads with proper metadata creation are time‑consuming and repetitive. This workflow fully automates:

- Monitoring a specific Google Drive folder for new video uploads
- Seamless YouTube upload processing
- Transcript extraction for context understanding
- AI‑powered generation of titles, descriptions, and tags
- Metadata application to uploaded videos without manual intervention

🔄 Node‑by‑Node Breakdown

| Step | Node Purpose |
| ---- | ------------ |
| 1 | New Video? (Trigger) – Monitors specified Google Drive folder |
| 2 | Download New Video – Retrieves the video file from Google Drive |
| 3 | Upload to YouTube – Uploads the video to YouTube with initial settings |
| 4 | Get Transcript – Extracts transcript from the uploaded video |
| 5 | Adjust Transcript Format – Formats raw transcript for processing |
| 6 | Create Description – Generates SEO‑optimized description |
| 7 | YT Tags (Message Model) – Creates relevant tags based on content |
| 8 | YT Title (Message Model) – Generates compelling title |
| 9 | Define File Path Upload Format (Optional) – Structures data paths |
| 10 | Update Video's Metadata – Applies generated title, description, tags |

⚙️ Pre‑conditions / Requirements

- n8n with Google Drive and YouTube API credentials configured (stored as n8n credentials/variables; no hard‑coded IDs)
- Dedicated Google Drive folder for video uploads
- YouTube channel with proper upload permissions
- AI service access for transcript processing and metadata generation
- Sufficient storage for temporary video handling

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure Google Drive credentials; reference the folder ID via an n8n variable (do not hard‑code it).
3. Set up YouTube API credentials with upload and edit permissions.
4. Specify the target Google Drive folder ID in the New Video? trigger node (via variable).
5. Configure AI service credentials for transcript and metadata generation.
6. Adjust message templates for title, description, and tag creation.
7. Test with a small video file before production use.

🎨 How to Customize

- Modify AI prompts to match your channel's tone and style.
- Add conditional logic based on video categories or naming conventions.
- Implement notification systems to alert when uploads complete.
- Create custom metadata templates for different content types.
- Include timestamps or chapter markers based on transcript analysis.
- Add social media sharing nodes to announce new uploads.

⚠️ Important Notes

- Video quality is preserved through the upload process.
- Consider YouTube API quotas when handling multiple uploads.
- Transcript quality affects metadata generation results.
- Videos are initially uploaded without visibility adjustments.
- Processing time depends on video length and transcript complexity.

🔐 Security and Privacy

- Store API credentials and folder IDs as n8n credentials/variables—remove any hard‑coded tokens or IDs.
- Video files are processed temporarily and not stored permanently.
- Limit Google Drive folder access to authorized users only.
- Manage YouTube upload permissions carefully (use OAuth/service accounts).
- Ensure compliance with organizational data‑handling policies.
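The final step in the breakdown, applying the generated metadata, amounts to one call to the public YouTube Data API v3 videos.update endpoint. Here is a hedged sketch; the token, video ID, generated fields, and categoryId are placeholders, and the template's node may configure this differently:

```typescript
// Sketch: apply AI-generated metadata to an uploaded video via the
// YouTube Data API v3 videos.update endpoint. All values are placeholders.
async function updateVideoMetadata(
  accessToken: string,
  videoId: string,
  meta: { title: string; description: string; tags: string[] },
) {
  const res = await fetch(
    "https://www.googleapis.com/youtube/v3/videos?part=snippet",
    {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        id: videoId,
        // snippet updates must include title and categoryId alongside the rest.
        snippet: { ...meta, categoryId: "28" }, // 28 = Science & Technology (example)
      }),
    },
  );
  if (!res.ok) throw new Error(`YouTube API error: ${res.status}`);
  return res.json();
}
```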
by Krishna Kumar Eswaran
🧠 Problem This Solves

Manually sharing Medium articles to LinkedIn daily can be repetitive and time-consuming. This automation:

- Fetches the latest Medium articles based on a tag (e.g., android)
- Posts them on LinkedIn twice daily
- Uses Airtable to prevent duplicates
- Sends a confirmation to Telegram once posted

Stay consistently active on LinkedIn without lifting a finger.

👥 Who This Template Is For

- Developers who write or follow Medium content
- Tech creators or founders looking to grow an audience
- Community or page managers needing regular curated posts
- Busy professionals who want hands-free LinkedIn engagement

⚙️ Workflow Breakdown

This automation runs at 9:00 AM and 7:00 PM daily and performs these steps:

1. Fetch articles from MediumAPI.com by tag
2. Check Airtable to prevent reposting the same article
3. Post on LinkedIn if it's new
4. Store the article ID in Airtable
5. Send a Telegram message after successful posting

🧾 Step-by-Step Setup Instructions

✅ 1. Airtable Configuration

Create a base with:

- Table Name: PostedArticles
- Column: ArticleID (Single line text – to track posted articles)

🔗 2. MediumAPI Setup

- Go to https://mediumapi.com
- Sign up and generate your API key from the dashboard
- Use this API endpoint in an HTTP node:

GET https://mediumapi.com/api/tag/YOUR_TAG/latest
Headers: Authorization: Bearer YOUR_API_KEY

Replace YOUR_TAG with a topic like android, ai, webdev, etc.

💬 3. Telegram Bot Setup

- Go to @BotFather and create a new bot
- Save the bot token
- Use @userinfobot to get your Telegram chat ID
- Add a Telegram node in n8n with the token + chat ID

🔗 4. LinkedIn Setup

- Create a LinkedIn Developer App
- Connect it via OAuth2 in n8n
- Choose to post on your profile or company page

🧱 5. n8n Workflow Structure

| Node Type | Description |
| --------- | ----------- |
| Cron | Triggers the flow twice a day |
| HTTP Request | Fetches articles from MediumAPI.com |
| Airtable Search | Checks if the article ID already exists |
| IF Node | Skips duplicates |
| LinkedIn Post | Publishes to your LinkedIn profile/page |
| Airtable Create | Stores the posted article ID |
| Telegram Node | Sends success notification |

🛠️ Customization Tips

- Change the tag in the API URL to match your niche
- Add hashtags or personal comments to the LinkedIn message
- Schedule different posting times in the Cron node
- Filter Medium posts based on length or title keywords (optional)
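As a concrete illustration of the final Telegram step, here is a minimal sketch using the Telegram Bot API's sendMessage method. The token, chat ID, and message text are placeholders for your own values:

```typescript
// Sketch: send a success notification after a LinkedIn post goes out.
// botToken and chatId are placeholders for your own credentials.
async function notifyTelegram(botToken: string, chatId: string, articleTitle: string) {
  const res = await fetch(`https://api.telegram.org/bot${botToken}/sendMessage`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      chat_id: chatId,
      text: `✅ Posted to LinkedIn: ${articleTitle}`,
    }),
  });
  if (!res.ok) throw new Error(`Telegram API error: ${res.status}`);
}
```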
by Niklas Hatje
Use case

When collecting new leads via a form, you need to follow up on new submissions. Often this requires a lot of manual work: reviewing each submission, checking if it meets your criteria, and then reaching out. With this workflow you can do all of that fully automatically and save a lot of your valuable time.

What this workflow does

This workflow runs every time you receive a new submission from an n8n form. It first filters out typical personal emails (such as Gmail, Hotmail, Yahoo, etc.) before enriching the submission via Clearbit. It then checks if the submitter's company is a B2B company with more than 499 employees. If it is, it sends an email to the user via Gmail.

Setup

1. Add the Clearbit and Gmail credentials.
2. Click on Test Workflow.
3. Enter your own email (which needs to be a business email to work) in the form.
4. Check your email.
5. Once you're happy, don't forget to activate this workflow.

How to adjust this template

- Replace the form trigger with your form provider of choice (e.g., Typeform, SurveyMonkey, Google Forms, etc.)
- Adjust the criteria to your needs via the If node
- Adjust the email you're sending in the Gmail node
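To make the first filtering step concrete, here is a minimal sketch of the personal-email check that runs before the Clearbit enrichment. The domain list is a small illustrative subset, not the workflow's actual list:

```typescript
// Sketch: drop submissions from free/personal email providers before enrichment.
// The domain list is an illustrative subset; extend it to suit your needs.
const PERSONAL_DOMAINS = new Set(["gmail.com", "hotmail.com", "yahoo.com", "outlook.com"]);

function isBusinessEmail(email: string): boolean {
  const domain = email.split("@")[1]?.toLowerCase();
  return domain !== undefined && !PERSONAL_DOMAINS.has(domain);
}

console.log(isBusinessEmail("jane@acme.io"));   // true:  enrich via Clearbit
console.log(isBusinessEmail("jane@gmail.com")); // false: filtered out
```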
by Milorad Filipović
How it works

It's very important to come prepared to sales calls. This often means a lot of manual research about the person you're calling with. This workflow delivers a summary of the latest social media activity (LinkedIn + X) for businesses you are about to interact with each day.

- **Scans Your Calendar**: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies, based on each attendee's email address.
- **Fetches Latest Posts**: For each identified company, it fetches recent LinkedIn and X posts and summarizes them using AI to deliver a quick overview for a busy sales rep.
- **Delivers Insights**: You receive personalized emails via Gmail, each dedicated to a company you're meeting with that day, containing a reminder of the meeting and a summary of the company's recent social media activity.

Setup steps

The workflow requires you to have the following accounts set up in their respective nodes:

- Google Calendar
- Gmail
- Clearbit
- OpenAI

Besides those, you will need an account on the RapidAPI platform and subscribe to the following APIs:

- Fresh LinkedIn Profile Data
- Twitter

Email example
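The calendar scan hinges on turning attendee email addresses into company domains for the Clearbit lookup. Here is a minimal sketch of that step; the attendee shape follows the Google Calendar API's event.attendees field, and the own-domain filter is an assumed detail:

```typescript
// Sketch: derive unique company domains from a meeting's attendee list.
// Attendee shape follows the Google Calendar API's event.attendees field.
interface Attendee { email: string; }

function companyDomains(attendees: Attendee[], myDomain: string): string[] {
  const domains = attendees
    .map((a) => a.email.split("@")[1]?.toLowerCase())
    .filter((d): d is string => !!d && d !== myDomain); // skip your own org
  return [...new Set(domains)]; // one Clearbit lookup per company
}

console.log(companyDomains(
  [{ email: "ceo@acme.com" }, { email: "me@mycorp.com" }],
  "mycorp.com",
)); // ["acme.com"]
```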
by Yang
Who is this for?

This template is for sales teams, agencies, or local service providers who want to quickly generate cold outreach lists and automatically call local businesses with a Vapi AI assistant. It's perfect for automating cold calls from scraped local listings with no manual dialing or research.

What problem is this workflow solving?

Finding leads and initiating outreach calls can be time-consuming. This workflow automates the process: it scrapes business listings from Google Maps using Dumpling AI, extracts phone numbers, filters out incomplete data, formats the numbers, and uses Vapi to make outbound AI-powered calls. Every call is logged in Google Sheets for follow-up and tracking.

What this workflow does

1. Starts manually and pulls search queries (e.g., "plumbers in Austin") from Google Sheets.
2. Sends each query to Dumpling AI's Google Maps scraping endpoint.
3. Splits the returned business data into individual leads.
4. Extracts key info like business name, website, and phone number.
5. Filters to only keep leads with valid phone numbers.
6. Formats phone numbers for Vapi dialing (adds +1; see the sketch below).
7. Calls each business using Vapi AI.
8. Logs each successful call in a Google Sheet.

Setup

Google Sheets Setup

- Create a sheet with business search queries in the first column (e.g., best+restaurants+in+Chicago).
- Make sure the tab name is set and authorized in your credentials.
- Connect your Google Sheets account in the Get Search Keywords from Google Sheets node.

Dumpling AI Setup

- Go to dumplingai.com.
- Generate an API key and connect it as a header token in the Scrape Google Map Businesses using Dumpling AI node.

Vapi Setup

- Sign into Vapi and create an assistant.
- Get your assistantId and phoneNumberId.
- Insert these into the JSON payload of the Initiate Vapi AI Call to Business node.
- Add your Vapi API key to the credentials section.

Call Logging

Create another tab in your sheet (e.g., "leads") with these headers:

- company name
- phone number
- website

This will be used in the Log Called Business Info to Sheet node.

How to customize this workflow to your needs

- Modify the business search terms in your Google Sheet to target specific industries or locations.
- Add filters to exclude certain businesses based on ratings, keywords, or location.
- Update your Vapi assistant script to match the type of outreach or pitch you're using.
- Add additional integrations (e.g., CRM logging, Slack notifications, follow-up emails).
- Change the trigger to run on a schedule or webhook instead of manually.

Nodes and Functions Breakdown

- Start Workflow Manually: Initiates the automation manually for testing or controlled runs.
- Get Search Keywords from Google Sheets: Reads search phrases from the spreadsheet.
- Scrape Google Map Businesses using Dumpling AI: Sends each search query to Dumpling AI and receives matching local business data.
- Split Each Business Result: Breaks the returned array of businesses into individual records for processing.
- Extract Business Name, Phone and Website: Extracts title, phone, and website from each business record.
- Filter Valid Phone Numbers Only: Ensures only entries with a phone number move forward.
- Format Phone Number for Calling: Adds a +1 country code and strips non-numeric characters.
- Initiate Vapi AI Call to Business: Uses the business name and number to initiate a Vapi AI outbound call.
- Log Called Business Info to Sheet: Appends business details into a Google Sheet for tracking.

Notes

- You must have valid API keys and authorized connections for Dumpling AI, Google Sheets, and Vapi.
- Make sure to handle API rate limits if you're running the workflow on large datasets.
- This workflow is optimized for US-based leads (+1 country code); adjust the formatting node if calling internationally.
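Since the Format Phone Number for Calling step is easy to get wrong, here is a minimal sketch of the normalization it describes: stripping non-digits and prefixing the +1 US country code. The exact rules are assumed from the node description above:

```typescript
// Sketch: normalize a scraped US phone number into E.164 form for Vapi dialing.
// Assumed logic per the node description: strip non-digits, prefix +1.
function toE164US(raw: string): string | null {
  const digits = raw.replace(/\D/g, "");          // "(512) 555-0199" -> "5125550199"
  if (digits.length === 10) return `+1${digits}`; // standard 10-digit US number
  if (digits.length === 11 && digits.startsWith("1")) return `+${digits}`;
  return null; // unexpected length: let the filter node drop it
}

console.log(toE164US("(512) 555-0199")); // "+15125550199"
console.log(toE164US("12345"));          // null
```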
by Don Jayamaha Jr
⏱️ Analyze Tesla (TSLA) short-term market structure and momentum using 6 technical indicators on the 15-minute timeframe.

This AI agent tool is part of the Tesla Quant Trading AI Agent system. It is designed to detect intraday shifts in volatility, trend strength, and potential reversal signals.

⚠️ Not standalone. This agent is triggered via Execute Workflow by the Tesla Financial Market Data Analyst Tool.

🔌 Requires:

- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

📊 What It Does

This workflow pulls the latest 20 data points for 6 key technical indicators from a webhook-powered source, then uses GPT-4.1 to interpret market momentum and structure.

Connected Indicators:

- **RSI (Relative Strength Index)**
- **MACD (Moving Average Convergence Divergence)**
- **BBANDS (Bollinger Bands)**
- **SMA (Simple Moving Average)**
- **EMA (Exponential Moving Average)**
- **ADX (Average Directional Index)**

The output is a structured JSON with:

- Market summary
- Timeframe (15m)
- Indicator values

📋 Sample Output

```json
{
  "summary": "TSLA shows fading momentum. RSI dropped below 60, MACD is flattening, and BBANDS are tightening. Expect short-term consolidation.",
  "timeframe": "15m",
  "indicators": {
    "RSI": 58.3,
    "MACD": { "macd": -0.020, "signal": -0.018, "histogram": -0.002 },
    "BBANDS": { "upper": 183.10, "lower": 176.70, "middle": 179.90, "close": 177.60 },
    "SMA": 178.20,
    "EMA": 177.70,
    "ADX": 19.6
  }
}
```

🧠 Agent Components

| Module | Role |
| ------ | ---- |
| Webhook Data Node | Calls the /15minData endpoint for Alpha Vantage indicators |
| LangChain Agent | Parses indicator payloads and generates reasoning |
| OpenAI GPT-4.1 | Powers the AI logic to interpret technical structure |
| Memory Module | Maintains session consistency for multi-agent calls |

🛠️ Setup Instructions

1. Import the workflow into n8n and name it: Tesla_15min_Indicators_Tool
2. Configure the webhook source: install and publish the Tesla_Quant_Technical_Indicators_Webhooks_Tool, and ensure /15minData is publicly reachable (or tunnel-enabled).
3. Add credentials: Alpha Vantage API Key (HTTP Query Auth) and OpenAI GPT-4.1 (OpenAI Chat Model).
4. Link as a sub-agent. This workflow is not triggered manually; it is executed via Execute Workflow by the 👉 Tesla_Financial_Market_Data_Analyst_Tool, passing in message (optional) and sessionId (for short-term memory linkage).

📌 Sticky Notes Summary

- 🟢 Trigger Integration – Receives sessionId and message from the parent
- 🟡 Webhook Fetcher – Pulls Alpha Vantage data from /15minData
- 🧠 GPT-4.1 Reasoning – Produces structured JSON insight
- 🔵 Session Memory – Maintains evaluation flow across tools
- 📘 Tool Description – Explains indicator use and AI output format

🔒 Licensing & Author

© 2025 Treasurium Capital Limited Company. All logic, formatting, and agent design are protected under copyright. No resale or public re-use without permission.

Created by: Don Jayamaha
Creator profile: https://n8n.io/creators/don-the-gem-dealer/

🚀 Build faster intraday Tesla trading models using clean 15-minute indicator insights—processed by AI. Required by the Tesla Financial Market Data Analyst Tool.
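For context on what the webhook layer does under the hood, here is a hedged sketch of pulling recent 15-minute RSI datapoints straight from Alpha Vantage. The time_period of 14 and the response-parsing details are assumptions; the template itself routes this through the Webhooks Tool rather than calling Alpha Vantage directly:

```typescript
// Sketch: fetch recent 15-minute RSI values for TSLA from Alpha Vantage.
// time_period=14 and the response key are assumptions; the template fetches
// this via the /15minData webhook rather than calling Alpha Vantage directly.
async function fetchRSI(apiKey: string, points = 20) {
  const url =
    "https://www.alphavantage.co/query?function=RSI&symbol=TSLA" +
    `&interval=15min&time_period=14&series_type=close&apikey=${apiKey}`;
  const res = await fetch(url);
  const data = await res.json();
  const series: Record<string, { RSI: string }> = data["Technical Analysis: RSI"];
  return Object.entries(series)
    .slice(0, points) // assumes newest entries come first in the response
    .map(([ts, v]) => ({ timestamp: ts, rsi: Number(v.RSI) }));
}
```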
by Don Jayamaha Jr
🕒 Evaluate Tesla (TSLA) price action and market structure on the 1-hour timeframe using 6 real-time indicators.

This sub-agent is designed to feed mid-term technical insights into the Tesla Financial Market Data Analyst Tool. It uses GPT-4.1 to interpret Alpha Vantage indicator data delivered via secure webhooks.

⚠️ This workflow is not standalone and is executed via Execute Workflow.

🔌 Requires:

- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

🔧 Connected Indicators

This tool fetches and analyzes the latest 20 datapoints for:

- **RSI (Relative Strength Index)**
- **MACD (Moving Average Convergence Divergence)**
- **BBANDS (Bollinger Bands)**
- **SMA (Simple Moving Average)**
- **EMA (Exponential Moving Average)**
- **ADX (Average Directional Index)**

📋 Sample Output

```json
{
  "summary": "TSLA is gaining strength on the 1-hour chart. RSI is rising, MACD has crossed bullish, and BBANDS are widening.",
  "timeframe": "1h",
  "indicators": {
    "RSI": 62.1,
    "BBANDS": { "upper": 176.90, "lower": 169.70, "middle": 173.30, "close": 176.30 },
    "SMA": 174.20,
    "EMA": 175.60,
    "ADX": 27.5,
    "MACD": { "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  }
}
```

🧠 Agent Components

| Component | Role |
| --------- | ---- |
| 1hour Data | Pulls Alpha Vantage indicator data via webhook |
| Tesla 1hour Indicators Agent | Interprets signals using a structured GPT-4.1 prompt |
| OpenAI Chat Model | GPT-4.1 LLM performs the analysis |
| Simple Memory | Maintains session context |

🛠️ Setup Instructions

1. Import the workflow into n8n and name it: Tesla_1hour_Indicators_Tool
2. Install the webhook fetcher tool. 👉 Required: Tesla_Quant_Technical_Indicators_Webhooks_Tool. This agent expects the /1hourData webhook to return pre-cleaned data.
3. Add credentials: Alpha Vantage Premium API Key (via HTTP Query Auth) and OpenAI GPT-4.1.
4. Configure for sub-agent use. Triggered only via Execute Workflow from the 👉 Tesla Financial Market Data Analyst Tool, with inputs message (optional) and sessionId (required for memory linkage).

📌 Sticky Notes Overview

- 🟢 Trigger Setup – Activated only by the parent agent
- 📊 1h Webhook Fetcher – Calls Alpha Vantage via a secured endpoint
- 🧠 AI Agent Summary – Interprets trend/momentum from indicator data
- 🔗 GPT Model Notes – GPT-4.1 parses and explains technical alignment
- 📘 Documentation Sticky – Embedded in the canvas with a full walkthrough

🔐 Licensing & Support

© 2025 Treasurium Capital Limited Company. This tool is part of a proprietary multi-agent AI architecture. No commercial reuse or redistribution permitted.

🔗 Author: Don Jayamaha
🔗 Templates: https://n8n.io/creators/don-the-gem-dealer/

🚀 Detect TSLA trend shifts and validate setups with 1-hour technical clarity—powered by Alpha Vantage + GPT-4.1. This tool is required by the Tesla Financial Market Data Analyst Tool.
by Don Jayamaha Jr
📉 Detect key candlestick reversal patterns and volume divergence on Tesla (TSLA) using GPT-4.1 and real-time OHLCV data.

This AI agent evaluates 1-hour and 1-day candles and is an essential part of the Tesla Financial Market Data Analyst Tool. It identifies signals like Doji, Engulfing, Hammer, and volume anomalies to support trade entry and exit logic.

⚠️ Not a standalone template — must be triggered by the Tesla Financial Market Data Analyst Tool.

🔐 Requires:

- Alpha Vantage Premium API Key
- OpenAI GPT-4.1 access

🔍 What This Agent Does

Calls Alpha Vantage to fetch:

- 🕐 1-hour OHLCV data
- 📅 1-day OHLCV data

GPT-4.1 evaluates:

- 📊 Candlestick patterns like Doji, Engulfing, Shooting Star
- 🔄 Volume divergence (price/volume inconsistency)

Returns a structured JSON output like:

```json
{
  "summary": "Bearish signs detected on 1-day chart. A shooting star formed on high volume while RSI is elevated. Volume divergence seen on 1h chart as price rises but volume weakens.",
  "candlestickPatterns": { "1h": "None", "1d": "Shooting Star" },
  "volumeDivergence": { "1h": "Bearish", "1d": "None" },
  "ohlcv": {
    "1h": { "close": 174.1, "volume": 1430000, "high": 175.0, "low": 173.8 },
    "1d": { "close": 188.3, "volume": 21234000, "high": 189.9, "low": 183.7 }
  }
}
```

🛠️ Setup Instructions

1. Import the workflow and name it: Tesla_1hour_and_1day_Klines_Tool
2. Install dependencies: ✅ the Tesla Financial Market Data Analyst Tool (this is the trigger parent).
3. Add required credentials: Alpha Vantage Premium (via HTTP Query Auth) and OpenAI GPT-4.1 (via OpenAI credentials).
4. Verify web access. This tool fetches data live from Alpha Vantage: /query?function=TIME_SERIES_INTRADAY&interval=60min and /query?function=TIME_SERIES_DAILY
5. Run via the Execute Workflow trigger. This tool activates only when called by the Financial Analyst Agent, with inputs message (optional) and sessionId (used for memory continuity).

🧠 Agent Architecture

| Component | Description |
| --------- | ----------- |
| Candlestick Data Hour | Fetches 60min TSLA candles via Alpha Vantage |
| Candlestick Data Day | Fetches daily TSLA candles via Alpha Vantage |
| OpenAI Chat Model | GPT-4.1 reasoning engine for pattern detection |
| Simple Memory | Maintains short-term logic context |
| Tesla Klines Agent | LangChain AI agent analyzing both candles and volume |

📌 Sticky Notes Overview

- 📘 Workflow Purpose
- 🧠 Short-Term Memory Notes
- 🔍 1h/1d Data Fetch Logic
- 📉 Candlestick Pattern Types Detected
- 📊 Volume Divergence Definitions
- 🤖 GPT-4.1 Prompt Configuration

🔐 Licensing & Support

© 2025 Treasurium Capital Limited Company. Logic, pattern reasoning, and prompt structure are proprietary IP.

🔗 Don Jayamaha – LinkedIn
🔗 n8n Creator Profile

🚀 Automate technical edge: detect TSLA candle reversals and volume anomalies with precision using GPT-4.1 and Alpha Vantage. Required by the Tesla Financial Market Data Analyst Tool.
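To ground the pattern definitions, here is a hedged sketch of rule-based checks for two of the named patterns, Doji and Shooting Star, from a single OHLC candle. The thresholds are common textbook conventions, not values from the template, which delegates the classification to GPT-4.1:

```typescript
// Sketch: textbook rule-of-thumb checks for two reversal candles.
// Thresholds are conventional assumptions; the template itself lets GPT-4.1
// classify patterns rather than using hard-coded rules.
interface Candle { open: number; high: number; low: number; close: number; }

function isDoji(c: Candle): boolean {
  const range = c.high - c.low;
  if (range === 0) return false;
  // Body is a tiny fraction of the total range.
  return Math.abs(c.close - c.open) / range < 0.1;
}

function isShootingStar(c: Candle): boolean {
  const body = Math.abs(c.close - c.open);
  const upperWick = c.high - Math.max(c.open, c.close);
  const lowerWick = Math.min(c.open, c.close) - c.low;
  // Long upper wick, small body near the low, minimal lower wick.
  return upperWick > 2 * body && lowerWick < body;
}

console.log(isShootingStar({ open: 188.0, high: 190.5, low: 187.8, close: 188.3 })); // true
```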
by Mutasem
Use Case

Track all Linear tickets in Google Sheets. Useful if you want to do some custom analysis but don't want to pay for Linear's Plus features (Linear Insights), or if your analysis isn't covered by Insights.

Setup

1. Add a Linear API header key.
2. Add Google Sheets credentials.
3. Update which teams to get tickets from in the GraphQL nodes.
4. Update which Google Sheets page to write all the tickets to. You only need to add one column, id, in the sheet; the Google Sheets node in automatic mapping mode will handle adding the rest of the columns.
5. Set any custom data on each ticket.
6. Activate the workflow 🚀

How to adjust this template

Set any custom fields you want to get out of this, which you can quickly do in n8n.
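As a reference for step 3, here is a hedged sketch of the kind of GraphQL request the workflow sends to Linear to pull a team's tickets. The team ID is a placeholder and the selected fields are an illustrative subset of what the actual nodes request:

```typescript
// Sketch: fetch a team's issues from Linear's GraphQL API.
// teamId is a placeholder; the field selection is an illustrative subset.
async function fetchLinearIssues(apiKey: string, teamId: string) {
  const query = `
    query Issues($teamId: String!) {
      team(id: $teamId) {
        issues(first: 50) {
          nodes { id identifier title createdAt state { name } }
        }
      }
    }`;
  const res = await fetch("https://api.linear.app/graphql", {
    method: "POST",
    // Linear personal API keys go directly in the Authorization header.
    headers: { "Content-Type": "application/json", Authorization: apiKey },
    body: JSON.stringify({ query, variables: { teamId } }),
  });
  const { data } = await res.json();
  return data.team.issues.nodes; // one row per ticket for the Sheet
}
```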