by Ranjan Dailata
Who this is for

Extract & Summarize Indeed Company Info is an automated workflow that extracts Indeed company profile information using Bright Data Web Unlocker, transforms it with Google Gemini's LLM, and forwards the transformed response, along with the summary, to a specified webhook for downstream use.

This workflow is tailored for:

- Recruiters and HR teams looking to assess companies quickly during talent sourcing.
- Job seekers researching potential employers and needing summarized company insights.
- Market researchers and analysts monitoring competitors or industry players.

What problem is this workflow solving?

Searching and evaluating company profiles on Indeed manually is time-consuming and inefficient, especially when dealing with large volumes of companies. Manually browsing, copying, and summarizing company descriptions, reviews, and ratings from Indeed hinders productivity and limits real-time insights. This workflow solves this by:

- Automating the extraction of company details from Indeed using Bright Data Web Unlocker.
- Summarizing the raw data with Google Gemini's language model for a quick, human-readable overview.
- Sending the transformed response with the summary to a chosen endpoint, such as Slack, Notion, Airtable, or a custom webhook.

What this workflow does

This automated pipeline does the following:

1. Scrapes Indeed company profile pages (e.g., ratings, description, reviews) using Bright Data's Web Unlocker.
2. Transforms the scraped content into structured JSON using n8n's built-in tools.
3. Summarizes and extracts meaningful insights using Google Gemini's large language model.
4. Forwards the summarized data to a specified webhook or app for real-time access, storage, or analysis.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions (see the request sketch at the end of this description).
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication).
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the search query and Bright Data zone in the Set Indeed Search Query node.
6. Update the Webhook Notifier with the webhook endpoint of your choice.

How to customize this workflow to your needs

This workflow is built to be flexible, whether you're a recruiter, market researcher, entrepreneur, or data analyst. Here's how you can adapt it to fit your specific use case:

- **Changing the data source**: Replace the Indeed search input with other job or business listing platforms if needed (e.g., Glassdoor, Crunchbase).
- **Refining the LLM prompt**: Tailor the Gemini prompt to transform or summarize the Indeed company information in a specific format.
- **Routing the output to different destinations**: Send summaries or the transformed response to Google Sheets, Airtable, or CRMs like HubSpot or Salesforce.
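For orientation, here is a minimal JavaScript sketch of what the scraping step does. The endpoint and body fields follow Bright Data's request API; the zone name and target URL are placeholder assumptions, not values from this template:

```js
// What the scraping step does, expressed as plain JavaScript.
// Zone name and target URL are placeholders - use your own values.
const res = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_BRIGHT_DATA_API_KEY', // matches the Header Auth credential
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'your_web_unlocker_zone',                    // the zone created in setup step 2
    url: 'https://www.indeed.com/cmp/Example-Company', // Indeed profile to scrape
    format: 'raw',                                     // raw HTML for downstream parsing
  }),
});
const html = await res.text(); // handed to the Gemini transformation step
```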
by Yang
👥 Who is this for?

This workflow is ideal for virtual assistants, researchers, developers, automation specialists, and data analysts who need to regularly extract and organize structured product information (like books) from a website. It’s especially useful for those working with catalog-based websites who want to automate extraction and delivery of clean, sorted data.

🧩 What problem is this solving?

Manually copying product listings like book titles and prices from a website into a spreadsheet is slow and repetitive. This automation solves that problem by scraping content using Dumpling AI, extracting the right data using CSS selectors, and formatting it into a clean CSV file that is sent to your email—all triggered automatically when a new URL is added to Google Sheets.

⚙️ What this workflow does

This template automates an entire content scraping and delivery process:

1. Watches a Google Sheet for new URLs
2. Scrapes the HTML content of the given webpage using Dumpling AI
3. Uses CSS selectors in the HTML node to extract each book from the page
4. Splits the HTML array into individual items
5. Extracts the book title and price from each HTML block
6. Sorts the books in descending order based on price
7. Converts the sorted data to a CSV file
8. Sends the CSV via email using Gmail

🛠️ Setup

Google Sheets

- Create a sheet titled something like URLs
- Add your product listing URLs (e.g., http://books.toscrape.com)
- Connect the Google Sheets trigger node to your sheet
- Ensure you have proper credentials connected

Dumpling AI

- Create an account at Dumpling AI and generate your API key
- Set the HTTP method to POST and pass the URL dynamically from the Google Sheet
- Use Header Auth to include your API key in the request header
- Make sure "cleaned": "True" is included in the body for optimized HTML output

HTML Node

- The first HTML node extracts the main book container blocks using: .row > li
- The second HTML node parses out the individual fields:
  - title: h3 > a (via the title attribute)
  - price: .price_color

Sort Node

- Sorts books by price in descending order
- Note: the price is extracted as a string; ensure it's parsable if you plan to use numeric filtering later (see the sketch after this section)

Convert to CSV

- The JSON data is passed into a Convert node and transformed into a CSV file

Gmail

- Sends the CSV as an attachment to a designated email

🔄 How to customize this workflow

- **Extract more data**: Add more CSS selectors in the second HTML node to pull fields like author, availability, or product links
- **Switch destinations**: Replace Gmail with Slack, Google Drive, Dropbox, or another platform
- **Adjust sorting**: Sort alphabetically or based on another extracted value
- **Use a different source**: As long as the site structure is consistent, this can scrape any listing-like page
- **Trigger differently**: Use a webhook, form submission, or schedule trigger instead of Google Sheets

⚠️ Dependencies and Notes

- This workflow uses Dumpling AI to perform the web scraping. It requires an API key and uses credits per request.
- The HTML node depends on valid CSS selectors. If the site layout changes, the selectors may need to be updated.
- Ensure you’re not scraping content from websites that prohibit automated scraping.
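Because the price arrives as text (e.g., "£51.77"), numeric sorting or filtering needs a parsing step. Here is a minimal sketch of an optional Code node that could sit before the Sort node, assuming the default n8n item shape and a field named price:

```js
// Optional Code node: coerce the scraped price string into a number
// so sorting and filtering can compare values numerically.
return $input.all().map((item) => {
  const raw = String(item.json.price ?? '');
  const numeric = parseFloat(raw.replace(/[^0-9.]/g, '')); // drop currency symbols etc.
  return {
    json: { ...item.json, priceNumeric: Number.isNaN(numeric) ? null : numeric },
  };
});
```

Sorting on priceNumeric then behaves numerically rather than lexicographically.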
by Paulo Ramirez
Upload your CRM contacts to telli and schedule AI voice-agent calls

Introduction to telli and AI Voice-Agent Calls

telli is an innovative platform that provides AI-powered voice agents capable of making calls and performing tasks tailored to specific customer use cases. These AI voice-agents can handle a wide range of communication tasks, from appointment scheduling to customer support, with remarkable efficiency and natural conversation flow.

This template is designed for businesses and organizations looking to automate their outbound calling processes using telli's AI voice-agents in conjunction with Airtable as their CRM. It solves the problem of manual call scheduling and data transfer between your CRM and calling system, saving time and reducing human error.

Prerequisites

- telli account
- Airtable base with contact information
- n8n instance

Step-by-Step Setup Guide

1. n8n Setup: Create a new workflow in n8n and add the Airtable node to connect to your CRM table.
2. telli API Configuration: Log in to your telli dashboard, then locate and copy your API key under telli - Settings - API/Webhooks.
3. Workflow Configuration: Add two HTTP Request nodes to your n8n workflow. Set the "Authorization" header in both POST requests, replacing the value with your telli API key. Configure the first request to use the /add-contact endpoint and the second to use the /schedule-call endpoint.
4. Data Mapping: Map the relevant fields from your Airtable node to the telli API requests.
5. Testing and Activation: Run a test execution of your workflow. Once satisfied with the results, activate the workflow.

API Endpoint Details

Add Contact Endpoint

- **URL**: https://api.telli.com/v1/add-contact
- **Method**: POST
- **Headers**: Authorization: YOUR-API-KEY, Content-Type: application/json
- **Payload**:

```json
{
  "external_contact_id": "string",
  "salutation": "string",
  "first_name": "string",
  "last_name": "string",
  "phone_number": "string",
  "email": "jsmith@example.com",
  "contact_details": {},
  "timezone": "string"
}
```

Schedule Call Endpoint

- **URL**: https://api.telli.com/v1/schedule-call
- **Method**: POST
- **Headers**: Authorization: YOUR-API-KEY, Content-Type: application/json
- **Payload**:

```json
{
  "contact_id": TELLI-CONTACT-ID,
  "agent_id": "string",
  "max_retry_days": 123,
  "call_details": {
    "message": "Hello, this is your friendly reminder!",
    "questions": [
      {
        "fieldName": "email",
        "neededInformation": "email of the customer",
        "exampleQuestion": "What is your email address?",
        "responseFormat": "email string"
      }
    ]
  },
  "override_from_number": "string"
}
```

Use Cases

This template is versatile and can be applied to various scenarios, including:

- **Lead Qualification**: Automatically schedule calls to new leads entered in your CRM.
- **Appointment Reminders**: Set up calls to remind clients of upcoming appointments.
- **Customer Feedback**: Schedule follow-up calls after product deliveries or service completions.

Uploading Multiple Contacts

For bulk operations, you have two options:

1. Loop Node: Include a Loop node in your n8n workflow to process multiple contacts sequentially.
2. Batch Endpoints: Instead of /add-contact and /schedule-call, use telli's batch endpoints:
   - /add-contacts-batch: Add multiple contacts within an array.
   - /schedule-calls-batch: Schedule multiple calls at once.
Example of batch endpoint usage:

```json
{
  "contacts": [
    { "name": "John Doe", "phone": "+1234567890" },
    { "name": "Jane Smith", "phone": "+1987654321" }
  ]
}
```

By leveraging this template, you can seamlessly integrate your Airtable CRM with telli's powerful AI voice-agents, automating your outbound calling process and enhancing your customer communication strategy.
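If your contacts live in Airtable, a Code node can collapse the fetched rows into the contacts array shown above. A rough sketch, assuming Airtable fields named First Name, Last Name, and Phone (adjust to your base):

```js
// Collapse incoming Airtable items into one /add-contacts-batch payload,
// following the batch example above. Airtable field names are assumptions.
const contacts = $input.all().map((item) => ({
  name: `${item.json['First Name'] ?? ''} ${item.json['Last Name'] ?? ''}`.trim(),
  phone: item.json['Phone'],
}));
return [{ json: { contacts } }];
```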
by JPres
👥 Who Is This For?

Sales and marketing teams seeking efficient, hands‑free generation of personalized slide decks for each prospect from CSV lead lists.

🛠 What Problem Does This Solve?

Manually editing presentation decks for large lead lists is slow and error‑prone. This workflow fully automates:

- Importing and parsing CSV lead data
- Logging leads and outputs in Google Sheets
- Duplicating a master Slides template per lead
- Injecting lead‑specific variables into slides

🔄 Node‑by‑Node Breakdown

| Step | Node | Purpose |
| ---- | ---------------------------------------- | -------------------------------------------------------- |
| 1 | New Leads Arrived | Detect new CSV uploads in Drive |
| 2 | File Type? | Filter for .csv files only |
| 3 | Download by ID | Download the CSV content |
| 4 | Create new Sheet | Create a Google Sheet to record lead data |
| 5 | Combine Empty New Document with CSV Data | Structure each lead record for slide creation |
| 6 | Merge Data for new Lead Document | Map template placeholders to lead values |
| 7 | Get all Leads | Retrieve sheet rows to iterate through each lead |
| 8 | MoveToLeadListFolder | Move processed CSV to an archive folder |
| 9 | Copy Slides Template | Make a copy of the master Slides deck |
| 10 | Create Custom Presentation | Replace placeholders in the copied deck with lead data |
| 11 | Add Presentation ID to Lead | Write the generated presentation URL back into the Sheet |

⚙️ Pre‑conditions / Requirements

- n8n with Google Drive, Sheets, and Slides credentials
- A master Google Slides deck with placeholder tokens (e.g. {{Name}}, {{Company}})
- A Drive folder for incoming CSV lead files

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure the New Leads Arrived node to watch your CSV folder.
3. Enter your Google credentials in the Drive, Sheets, and Slides nodes.
4. Specify the master Slides template ID in the Copy Slides Template node.
5. In Create Custom Presentation, map slide tokens to sheet column names (see the sketch at the end of this description).
6. Disable “Keep Binary Data” in Copy Slides Template to conserve memory.
7. Upload a sample CSV (with headers like Name, Company, Metric) to test.

🎨 How to Customize

- Add or remove variables by editing the CSV headers and updating the mapping in Merge Data for new Lead Document.
- Insert an AI/natural‑language node before slide creation to generate more advanced and personalized text blocks.
- Use SplitInBatches to throttle API calls and avoid rate‑limit errors.
- Add error‑handling branches to capture and log failed operations.

🔐 Security and Privacy

- The workflow uses placeholder variables for file and folder IDs, so no actual IDs are exposed in the template.
- Ensure OAuth scopes are limited to only the required Google APIs.
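For context on step 5 above, placeholder replacement in Google Slides maps onto the API's replaceAllText requests. A minimal sketch of a batchUpdate body, with the lead record and token names invented for illustration:

```js
// Illustrative lead record - in the workflow these values come from the sheet row.
const lead = { name: 'Jane Doe', company: 'Acme Inc.' };

// One replaceAllText request per placeholder token in the master deck.
const requests = [
  { replaceAllText: { containsText: { text: '{{Name}}', matchCase: true }, replaceText: lead.name } },
  { replaceAllText: { containsText: { text: '{{Company}}', matchCase: true }, replaceText: lead.company } },
];
// Sent as POST https://slides.googleapis.com/v1/presentations/{presentationId}:batchUpdate
// with body { requests }.
```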
by JPres
👥 Who Is This For?

Content creators, marketing teams, and channel managers who want a simple, hands‑off solution to upload videos and automatically generate optimized metadata from video transcripts.

🛠 What Problem Does This Solve?

Manual video uploads with proper metadata creation are time‑consuming and repetitive. This workflow fully automates:

- Monitoring a specific Google Drive folder for new video uploads
- Seamless YouTube upload processing
- Transcript extraction for context understanding
- AI‑powered generation of titles, descriptions, and tags
- Metadata application to uploaded videos without manual intervention

🔄 Node‑by‑Node Breakdown

| Step | Node Purpose |
|------|---------------------------------------------------------------------|
| 1 | New Video? (Trigger) – Monitors specified Google Drive folder |
| 2 | Download New Video – Retrieves the video file from Google Drive |
| 3 | Upload to YouTube – Uploads the video to YouTube with initial settings |
| 4 | Get Transcript – Extracts transcript from the uploaded video |
| 5 | Adjust Transcript Format – Formats raw transcript for processing |
| 6 | Create Description – Generates SEO‑optimized description |
| 7 | YT Tags (Message Model) – Creates relevant tags based on content |
| 8 | YT Title (Message Model) – Generates compelling title |
| 9 | Define File Path Upload Format (Optional) – Structures data paths |
| 10 | Update Video’s Metadata – Applies generated title, description, tags |

⚙️ Pre‑conditions / Requirements

- n8n with Google Drive and YouTube API credentials configured (stored as n8n credentials/variables; no hard‑coded IDs)
- Dedicated Google Drive folder for video uploads
- YouTube channel with proper upload permissions
- AI service access for transcript processing and metadata generation
- Sufficient storage for temporary video handling

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure Google Drive credentials; reference the folder ID via an n8n variable (do not hard‑code it).
3. Set up YouTube API credentials with upload and edit permissions (the metadata update call is sketched at the end of this description).
4. Specify the target Google Drive folder ID in the New Video? trigger node (via variable).
5. Configure AI service credentials for transcript and metadata generation.
6. Adjust message templates for title, description, and tag creation.
7. Test with a small video file before production use.

🎨 How to Customize

- Modify AI prompts to match your channel’s tone and style.
- Add conditional logic based on video categories or naming conventions.
- Implement notification systems to alert when uploads complete.
- Create custom metadata templates for different content types.
- Include timestamps or chapter markers based on transcript analysis.
- Add social media sharing nodes to announce new uploads.

⚠️ Important Notes

- Video quality is preserved through the upload process.
- Consider YouTube API quotas when handling multiple uploads.
- Transcript quality affects metadata generation results.
- Videos are initially uploaded without visibility adjustments.
- Processing time depends on video length and transcript complexity.

🔐 Security and Privacy

- Store API credentials and folder IDs as n8n Credentials/Variables—remove any hard‑coded tokens or IDs.
- Video files are processed temporarily and not stored permanently.
- Limit Google Drive folder access to authorized users only.
- Manage YouTube upload permissions carefully (use OAuth/service accounts).
- Ensure compliance with organizational data‑handling policies.
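As a reference for the final node, applying the generated metadata corresponds to the YouTube Data API's videos.update call. A sketch of the request body, with illustrative values standing in for the AI-generated fields:

```js
// Body for videos.update (part=snippet). Values here are placeholders;
// in the workflow, title/description/tags come from the AI nodes.
const body = {
  id: 'VIDEO_ID_FROM_UPLOAD_STEP',
  snippet: {
    title: 'AI-generated, SEO-friendly title',
    description: 'AI-generated description with keywords and chapters...',
    tags: ['example-tag-1', 'example-tag-2'],
    categoryId: '22', // videos.update requires a categoryId when updating the snippet
  },
};
// Sent as PUT https://www.googleapis.com/youtube/v3/videos?part=snippet
```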
by Yang
Who is this for?

This template is for sales teams, agencies, or local service providers who want to quickly generate cold outreach lists and automatically call local businesses with a Vapi AI assistant. It’s perfect for automating cold calls from scraped local listings with no manual dialing or research.

What problem is this workflow solving?

Finding leads and initiating outreach calls can be time-consuming. This workflow automates the process: it scrapes business listings from Google Maps using Dumpling AI, extracts phone numbers, filters out incomplete data, formats the numbers, and uses Vapi to make outbound AI-powered calls. Every call is logged in Google Sheets for follow-up and tracking.

What this workflow does

1. Starts manually and pulls search queries (e.g., "plumbers in Austin") from Google Sheets.
2. Sends each query to Dumpling AI’s Google Maps scraping endpoint.
3. Splits the returned business data into individual leads.
4. Extracts key info like business name, website, and phone number.
5. Filters to only keep leads with valid phone numbers.
6. Formats phone numbers for Vapi dialing (adds +1; see the sketch at the end of this description).
7. Calls each business using Vapi AI.
8. Logs each successful call in a Google Sheet.

Setup

Google Sheets Setup

- Create a sheet with business search queries in the first column (e.g., best+restaurants+in+Chicago).
- Make sure the tab name is set and authorized in your credentials.
- Connect your Google Sheets account in the Get Search Keywords from Google Sheets node.

Dumpling AI Setup

- Go to dumplingai.com and generate an API key.
- Connect it as a header token in the Scrape Google Map Businesses using Dumpling AI node.

Vapi Setup

- Sign into Vapi and create an assistant.
- Get your assistantId and phoneNumberId.
- Insert these into the JSON payload of the Initiate Vapi AI Call to Business node.
- Add your Vapi API key to the credentials section.

Call Logging

- Create another tab in your sheet (e.g., “leads”) with these headers: company name, phone number, website.
- This tab is used by the Log Called Business Info to Sheet node.

How to customize this workflow to your needs

- Modify the business search terms in your Google Sheet to target specific industries or locations.
- Add filters to exclude certain businesses based on ratings, keywords, or location.
- Update your Vapi assistant script to match the type of outreach or pitch you’re using.
- Add additional integrations (e.g., CRM logging, Slack notifications, follow-up emails).
- Change the trigger to run on a schedule or webhook instead of manually.

Nodes and Functions Breakdown

- Start Workflow Manually: Initiates the automation manually for testing or controlled runs.
- Get Search Keywords from Google Sheets: Reads search phrases from the spreadsheet.
- Scrape Google Map Businesses using Dumpling AI: Sends each search query to Dumpling AI and receives matching local business data.
- Split Each Business Result: Breaks the returned array of businesses into individual records for processing.
- Extract Business Name, Phone and Website: Extracts title, phone, and website from each business record.
- Filter Valid Phone Numbers Only: Ensures only entries with a phone number move forward.
- Format Phone Number for Calling: Adds a +1 country code and strips non-numeric characters.
- Initiate Vapi AI Call to Business: Uses the business name and number to initiate a Vapi AI outbound call.
- Log Called Business Info to Sheet: Appends business details into a Google Sheet for tracking.

Notes

- You must have valid API keys and authorized connections for Dumpling AI, Google Sheets, and Vapi.
- Make sure to handle API rate limits if you're running the workflow on large datasets.
- This workflow is optimized for US-based leads (+1 country code); adjust the formatting node if calling internationally.
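The Format Phone Number for Calling step can be as small as this Code-node sketch, assuming a phone field on each item and US-only numbers:

```js
// Normalize scraped phone numbers for Vapi dialing (US numbers assumed).
return $input.all().map((item) => {
  const digits = String(item.json.phone ?? '').replace(/\D/g, ''); // strip non-numeric characters
  const withCountry = digits.length === 10 ? `1${digits}` : digits; // prepend US code if missing
  return { json: { ...item.json, phoneFormatted: `+${withCountry}` } };
});
```

For international calling, replace the hard-coded country-code logic with a lookup based on the business's location.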
by Don Jayamaha Jr
⏱️ Analyze Tesla (TSLA) short-term market structure and momentum using 6 technical indicators on the 15-minute timeframe.

This AI agent tool is part of the Tesla Quant Trading AI Agent system. It is designed to detect intraday shifts in volatility, trend strength, and potential reversal signals.

⚠️ Not standalone. This agent is triggered via Execute Workflow by the Tesla Financial Market Data Analyst Tool.

🔌 Requires:

- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

📊 What It Does

This workflow pulls the latest 20 data points for 6 key technical indicators from a webhook-powered source, then uses GPT-4.1 to interpret market momentum and structure.

Connected Indicators:

- **RSI (Relative Strength Index)**
- **MACD (Moving Average Convergence Divergence)**
- **BBANDS (Bollinger Bands)**
- **SMA (Simple Moving Average)**
- **EMA (Exponential Moving Average)**
- **ADX (Average Directional Index)**

The output is a structured JSON with:

- Market summary
- Timeframe (15m)
- Indicator values

📋 Sample Output

```json
{
  "summary": "TSLA shows fading momentum. RSI dropped below 60, MACD is flattening, and BBANDS are tightening. Expect short-term consolidation.",
  "timeframe": "15m",
  "indicators": {
    "RSI": 58.3,
    "MACD": { "macd": -0.020, "signal": -0.018, "histogram": -0.002 },
    "BBANDS": { "upper": 183.10, "lower": 176.70, "middle": 179.90, "close": 177.60 },
    "SMA": 178.20,
    "EMA": 177.70,
    "ADX": 19.6
  }
}
```

🧠 Agent Components

| Module | Role |
| --------------------- | -------------------------------------------------------- |
| Webhook Data Node | Calls /15minData endpoint for Alpha Vantage indicators |
| LangChain Agent | Parses indicator payloads and generates reasoning |
| OpenAI GPT-4.1 | Powers the AI logic to interpret technical structure |
| Memory Module | Maintains session consistency for multi-agent calls |

🛠️ Setup Instructions

1. Import the workflow into n8n and name it: Tesla_15min_Indicators_Tool
2. Configure the webhook source: install and publish the Tesla_Quant_Technical_Indicators_Webhooks_Tool and ensure /15minData is publicly reachable (or tunnel-enabled); a sketch of the fetch appears at the end of this description.
3. Add credentials: Alpha Vantage API Key (HTTP Query Auth) and OpenAI GPT-4.1 (OpenAI Chat Model).
4. Link as a sub-agent. This workflow is not triggered manually; it is executed via Execute Workflow by 👉 Tesla_Financial_Market_Data_Analyst_Tool, which passes in: message (optional) and sessionId (for short-term memory linkage).

📌 Sticky Notes Summary

- 🟢 Trigger Integration – Receives sessionId and message from the parent
- 🟡 Webhook Fetcher – Pulls Alpha Vantage data from /15minData
- 🧠 GPT-4.1 Reasoning – Produces structured JSON insight
- 🔵 Session Memory – Maintains evaluation flow across tools
- 📘 Tool Description – Explains indicator use and AI output format

🔒 Licensing & Author

© 2025 Treasurium Capital Limited Company. All logic, formatting, and agent design are protected under copyright. No resale or public re-use without permission.

Created by: Don Jayamaha
Creator Profile: https://n8n.io/creators/don-the-gem-dealer/

🚀 Build faster intraday Tesla trading models using clean 15-minute indicator insights—processed by AI. Required by the Tesla Financial Market Data Analyst Tool.
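The webhook fetch itself is a plain HTTPS request. A sketch, where the host and the response shape are assumptions to check against your own webhooks tool output:

```js
// What the webhook data node does, expressed as plain JavaScript.
// Host and response shape are placeholders - verify against your webhook output.
const res = await fetch('https://your-n8n-host.example.com/webhook/15minData');
const payload = await res.json();  // latest 20 datapoints per indicator
console.log(payload.RSI);          // e.g. one indicator series handed to the GPT-4.1 agent
```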
by Don Jayamaha Jr
🕒 Evaluate Tesla (TSLA) price action and market structure on the 1-hour timeframe using 6 real-time indicators.

This sub-agent is designed to feed mid-term technical insights into the Tesla Financial Market Data Analyst Tool. It uses GPT-4.1 to interpret Alpha Vantage indicator data delivered via secure webhooks.

⚠️ This workflow is not standalone and is executed via Execute Workflow.

🔌 Requires:

- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

🔧 Connected Indicators

This tool fetches and analyzes the latest 20 datapoints for:

- **RSI (Relative Strength Index)**
- **MACD (Moving Average Convergence Divergence)**
- **BBANDS (Bollinger Bands)**
- **SMA (Simple Moving Average)**
- **EMA (Exponential Moving Average)**
- **ADX (Average Directional Index)**

📋 Sample Output

```json
{
  "summary": "TSLA is gaining strength on the 1-hour chart. RSI is rising, MACD has crossed bullish, and BBANDS are widening.",
  "timeframe": "1h",
  "indicators": {
    "RSI": 62.1,
    "BBANDS": { "upper": 176.90, "lower": 169.70, "middle": 173.30, "close": 176.30 },
    "SMA": 174.20,
    "EMA": 175.60,
    "ADX": 27.5,
    "MACD": { "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  }
}
```

🧠 Agent Components

| Component | Role |
| ------------------------------ | -------------------------------------------------- |
| 1hour Data | Pulls Alpha Vantage indicator data via webhook |
| Tesla 1hour Indicators Agent | Interprets signals using structured GPT-4.1 prompt |
| OpenAI Chat Model | GPT-4.1 LLM performs analysis |
| Simple Memory | Maintains session context |

🛠️ Setup Instructions

1. Import the workflow into n8n and name it: Tesla_1hour_Indicators_Tool
2. Install the webhook fetcher tool. 👉 Required: Tesla_Quant_Technical_Indicators_Webhooks_Tool. This agent expects the /1hourData webhook to return pre-cleaned data (see the trimming sketch at the end of this description).
3. Add credentials: Alpha Vantage Premium API Key (via HTTP Query Auth) and OpenAI GPT-4.1 credentials.
4. Configure for sub-agent use. Triggered only via Execute Workflow from: 👉 Tesla Financial Market Data Analyst Tool. Inputs: message (optional) and sessionId (required for memory linkage).

📌 Sticky Notes Overview

- 🟢 Trigger Setup – Activated only by the parent agent
- 📊 1h Webhook Fetcher – Calls Alpha Vantage via a secured endpoint
- 🧠 AI Agent Summary – Interprets trend/momentum from indicator data
- 🔗 GPT Model Notes – GPT-4.1 parses and explains technical alignment
- 📘 Documentation Sticky – Embedded in the canvas with a full walkthrough

🔐 Licensing & Support

© 2025 Treasurium Capital Limited Company. This tool is part of a proprietary multi-agent AI architecture. No commercial reuse or redistribution permitted.

🔗 Author: Don Jayamaha
🔗 Templates: https://n8n.io/creators/don-the-gem-dealer/

🚀 Detect TSLA trend shifts and validate setups with 1-hour technical clarity—powered by Alpha Vantage + GPT-4.1. This tool is required by the Tesla Financial Market Data Analyst Tool.
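If you build or modify the webhook source yourself, "pre-cleaned" essentially means trimming each Alpha Vantage series to its most recent datapoints. A sketch for RSI; note the top-level series key differs per indicator, and the exact key name is an assumption to verify against your own Alpha Vantage responses:

```js
// Trim an Alpha Vantage indicator response to its 20 most recent datapoints.
// The series key differs per indicator (e.g. 'Technical Analysis: RSI').
const response = $input.first().json;
const series = response['Technical Analysis: RSI'] ?? {};
const latest20 = Object.entries(series)
  .sort(([a], [b]) => b.localeCompare(a)) // ISO-style timestamps sort newest-first this way
  .slice(0, 20)
  .map(([time, values]) => ({ time, ...values }));
return [{ json: { indicator: 'RSI', datapoints: latest20 } }];
```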
by Mutasem
Use Case

Track all Linear tickets in a Google Sheet. Useful if you want to run custom analysis without paying for Linear's Plus features (Linear Insights), or analysis that Insights does not cover.

Setup

1. Add your Linear API header key.
2. Add your Google Sheets credentials.
3. Update which teams to get tickets from in the GraphQL nodes (see the query sketch below).
4. Update which Google Sheets page to write all the tickets to. You only need to add one column, id, in the sheet; the Google Sheets node in automatic mapping mode will handle adding the rest of the columns.
5. Set any custom data on each ticket.
6. Activate the workflow 🚀

How to adjust this template

Add any custom fields you want to extract from each ticket; that's quick to do in n8n.
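For reference, the GraphQL nodes send a query along these lines to Linear's API. The team ID, selected fields, and page size are illustrative; adjust them to whatever you want in your sheet:

```js
// Illustrative Linear GraphQL query - sent as { query } to https://api.linear.app/graphql
// with your API key in the Authorization header. Pagination is omitted for brevity.
const query = `
  query {
    team(id: "YOUR_TEAM_ID") {
      issues(first: 50) {
        nodes {
          id
          title
          createdAt
          state { name }
        }
      }
    }
  }
`;
```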
by Don Jayamaha Jr
📅 Analyze Tesla’s daily trading structure with AI using 6 Alpha Vantage indicators.

This tool evaluates long-term trend health, volatility patterns, and potential reversal signals on the 1-day timeframe. Designed for use within the Tesla Financial Market Data Analyst Tool, this agent helps swing and position traders anchor macro sentiment.

⚠️ Not standalone. Must be executed via Execute Workflow.

🔌 Requires:

- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key
- OpenAI GPT-4.1 credentials

🔍 What It Does

This tool queries a secured webhook (/1dayData) to retrieve real-time, trimmed JSON data for:

- **RSI (Relative Strength Index)**
- **BBANDS (Bollinger Bands)**
- **SMA (Simple Moving Average)**
- **EMA (Exponential Moving Average)**
- **ADX (Average Directional Index)**
- **MACD (Moving Average Convergence Divergence)**

These values are then passed to a LangChain AI Agent powered by GPT-4.1, which returns:

- A 2–3 sentence market condition summary
- Structured indicator values
- A timeframe tag ("1d")

📋 Sample Output

```json
{
  "summary": "TSLA shows consolidation on the daily chart. RSI is neutral, BBANDS are contracting, and MACD is flattening.",
  "timeframe": "1d",
  "indicators": {
    "RSI": 51.3,
    "BBANDS": { "upper": 192.80, "lower": 168.20, "middle": 180.50, "close": 179.90 },
    "SMA": 181.10,
    "EMA": 179.75,
    "ADX": 15.8,
    "MACD": { "macd": -0.25, "signal": -0.20, "histogram": -0.05 }
  }
}
```

🧠 Agent Components

| Component | Description |
| ----------------------------- | -------------------------------------------------- |
| 1day Data (HTTP Node) | Pulls latest data from secured /1dayData webhook |
| OpenAI Chat Model | GPT-4.1 powers the analysis logic |
| Tesla 1day Indicators Agent | LangChain agent performing interpretation |
| Simple Memory | Short-term session continuity |

🛠️ Setup Instructions

1. Import the workflow into n8n and name it: Tesla_1day_Indicators_Tool
2. Add the required credentials: Alpha Vantage Premium (via HTTP Query Auth) and OpenAI GPT-4.1 (Chat Model).
3. Install the webhook fetcher. Required: Tesla Quant Technical Indicators Webhooks Tool; the /1dayData endpoint must be active.
4. Execution context: this tool is only triggered via 👉 Tesla Financial Market Data Analyst Tool. Inputs expected: message (optional context) and sessionId (session memory linkage).

📌 Sticky Notes Overview

- 📘 Tesla 1-Day Indicators Tool – Purpose and integration
- 📡 Webhook Fetcher – Pulls daily Alpha Vantage data via HTTPS
- 🧠 GPT-4.1 Model – Reasoning for trend classification
- 🔗 Sub-Agent Trigger – Used only by the Financial Market Analyst
- 🧠 Memory Buffer – Ensures consistent session logic

🔒 Licensing & Support

© 2025 Treasurium Capital Limited Company. This workflow—including prompts, logic, and formatting—is protected IP.

🔗 Don Jayamaha – LinkedIn
🔗 Creator Profile

🚀 Evaluate long-term Tesla price behavior with AI-enhanced technical analysis—critical for swing trading strategy. Required by the Tesla Financial Market Data Analyst Tool.
by Niklas Hatje
Use Case

When building a product, it's important to discover and eliminate bugs as quickly as possible. Since we use our product a lot at n8n, we wanted to make it as easy as possible for everyone to report bugs with the needed level of detail. That's why we built this workflow, which allows everyone to add bugs to our Linear account easily, directly from Slack.

What this workflow does

This workflow waits for a webhook call from Slack, fired when users use the /bug command on a bot that you will create as part of this template (see the payload sketch at the end of this description). It then adds the bug to Linear using a pre-defined description and a defined label, and notifies the user about the newly added bug.

How to create your Slack bot

1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes.
3. Add the chat:write scope.
4. Head over to Slash Commands and click on Create New Command.
5. Use /bug as the command.
6. Copy the test URL from the Webhook node into Request URL.
7. Add whatever feels best to the description and usage hint.
8. Go to Install App and click install.

Setup

1. Configure your Slack bot using the sticky to the left.
2. Fill the Set me up node. You can find the IDs easily using the Helper nodes section.
3. Make sure to exchange the Request URL in Slack with the Prod URL of the Webhook node before activating this workflow.

How to adjust it to your needs

- Add zero, one, two, or many labels when creating the new ticket.
- Change the Slack message according to your needs.
- Change the default description for a new bug ticket.
- Rename the Slack command as works best for you.

How to enhance this workflow

At n8n we use this workflow in combination with some others. For example, we have the following on top:

- We're using AI to classify the bugs and move them to the right team as soon as they get added to Linear (see this template).
- We also added other commands like /pain and /idea that allow us to submit other things to Notion. You can see the workflow for that here.
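For reference when mapping fields, Slack posts a form-encoded payload to the Request URL whenever someone runs /bug. The fields below are standard slash-command fields, shown with illustrative values:

```js
// Typical fields arriving at the Webhook node from a /bug invocation (values illustrative).
const slashPayload = {
  command: '/bug',
  text: 'Login button unresponsive on Safari', // everything the user typed after /bug
  user_id: 'U0123ABC',
  user_name: 'jane',
  channel_id: 'C0456DEF',
  response_url: 'https://hooks.slack.com/commands/...', // used to message the user back
};
// e.g. the Linear issue title can be built from slashPayload.text.
```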
by Jimleuk
This n8n template demonstrates one approach to customer authentication via chat agents. Unlike approaches where you have to authenticate users before they interact with the agent, this approach allows guest users to authenticate at any time during the session, or not at all.

Note about security: this template is for illustration purposes only and requires much more work to be ready for production!

How it works

- A conversational agent is used for this demonstration. The key component is the Redis node just after the chat trigger, which acts as the session context. For guests, the session item is blank; for customers, the session item is populated with their customer profile.
- The agent is instructed to generate a unique login URL only for guests, when appropriate or upon request. This login URL redirects the guest user to a simple n8n form also hosted in this template.
- The login URL carries the current sessionId as a query parameter as the way to pass this data to the form (see the sketch after this description).
- Once login is successful, the matching session item (by sessionId) is populated with the customer profile. The user can now return to the chat window.
- Back at the agent, when the user sends their next message, the Redis node picks up the session item and the customer profile associated with it. The system prompt is updated with this data, which lets the agent know the user is now a customer.

How to use

- You'll need to update the "auth URL" tool to match the URL of your n8n instance. Better yet, copy the production URL of your form from the trigger.
- Activate the workflow to turn on production mode, which is required for this workflow.
- Implement the authentication logic in step 3. This could be sending the username and password to a PostgreSQL database for validation.

Requirements

- OpenAI for the LLM (feel free to swap in any provider)
- Redis for cache/sessions (again, feel free to swap this out for PostgreSQL or another database)

Customising this workflow

- Consider not populating the session item with the user data, as it can become stale. Instead, just store the userId and instruct the agent to query for details using tools.
- Extend the login URL idea by experimenting with signup URLs or single-use URLs.
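A minimal sketch of how the login URL could be assembled, assuming the form's production path and the sessionId exposed by the chat trigger (both placeholders):

```js
// Build the guest login URL with the current sessionId as a query parameter.
// The form path is the production URL of the n8n form trigger in this template.
const sessionId = $json.sessionId; // provided by the chat trigger
const loginUrl = `https://your-n8n-host.example.com/form/login?sessionId=${encodeURIComponent(sessionId)}`;
// After a successful login, the form branch writes the customer profile to the
// Redis session item keyed by this sessionId, upgrading the guest to a customer.
return [{ json: { loginUrl } }];
```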