by Louis Chan
## How it works
Transform medical documents into structured data using Google Gemini AI with enterprise-grade accuracy.

- Classifies document types (receipts, prescriptions, lab reports, clinical notes)
- Extracts text with 95%+ accuracy using advanced OCR
- Structures data according to medical taxonomy standards
- Supports multiple languages (English, Chinese, auto-detect)
- Tracks processing costs and quality metrics automatically

## Set up steps
**Prerequisites**
- Google Gemini API key (get from Google AI Studio)

**Quick setup**
1. Import this workflow template
2. Configure Google Gemini API credentials in n8n
3. Test with a sample medical document URL
4. Deploy your webhook endpoint

## Usage
Send a POST request to your webhook:

```json
{
  "image_url": "https://example.com/medical-receipt.jpg",
  "expected_type": "financial",
  "language_hint": "auto"
}
```

Get a structured response:

```json
{
  "success": true,
  "result": {
    "documentType": "financial",
    "metadata": {
      "providerName": "Dr. Smith Clinic",
      "createdDate": "2025-01-06",
      "currency": "USD"
    },
    "content": {
      "amount": 150.00,
      "services": [...]
    },
    "quality_metrics": {
      "overall_confidence": 0.95
    }
  }
}
```

## Use cases
**Healthcare organizations**
- Medical billing automation – process receipts and invoices automatically
- Insurance claim processing – extract data from claim documents
- Clinical documentation – digitize patient records and notes
- Data standardization – consistent structured output format

**System integrators**
- EMR integration – connect with existing healthcare systems
- Workflow automation – reduce manual data entry by 90%
- Multi-language support – handle international medical documents
- Quality assurance – built-in confidence scoring and validation

## Supported document types
- **Financial**: medical receipts, bills, insurance claims, invoices
- **Clinical**: medical charts, progress notes, consultation reports
- **Prescription**: prescriptions, medication lists, pharmacy records
- **Administrative**: referrals, authorizations, patient registration
- **Diagnostic**: lab reports, test results, screening reports
- **Legal**: medical certificates, documentation forms
by Don Jayamaha Jr
📅 Analyze Tesla's daily trading structure with AI using 6 Alpha Vantage indicators. This tool evaluates long-term trend health, volatility patterns, and potential reversal signals at the 1-day timeframe. Designed for use within the Tesla Financial Market Data Analyst Tool, this agent helps swing and position traders anchor macro sentiment.

⚠️ Not standalone. Must be executed via Execute Workflow.

🔌 Requires:
- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key
- OpenAI GPT-4.1 credentials

## 🔍 What It Does
This tool queries a secured webhook (`/1dayData`) to retrieve real-time, trimmed JSON data for:

- **RSI** (Relative Strength Index)
- **BBANDS** (Bollinger Bands)
- **SMA** (Simple Moving Average)
- **EMA** (Exponential Moving Average)
- **ADX** (Average Directional Index)
- **MACD** (Moving Average Convergence Divergence)

These values are then passed to a LangChain AI Agent powered by GPT-4.1, which returns:
- A 2–3 sentence market condition summary
- Structured indicator values
- A timeframe tag (`"1d"`)

## 📋 Sample Output
```json
{
  "summary": "TSLA shows consolidation on the daily chart. RSI is neutral, BBANDS are contracting, and MACD is flattening.",
  "timeframe": "1d",
  "indicators": {
    "RSI": 51.3,
    "BBANDS": { "upper": 192.80, "lower": 168.20, "middle": 180.50, "close": 179.90 },
    "SMA": 181.10,
    "EMA": 179.75,
    "ADX": 15.8,
    "MACD": { "macd": -0.25, "signal": -0.20, "histogram": -0.05 }
  }
}
```

## 🧠 Agent Components
| Component | Description |
| --- | --- |
| 1day Data (HTTP node) | Pulls the latest data from the secured `/1dayData` webhook |
| OpenAI Chat Model | GPT-4.1 powers the analysis logic |
| Tesla 1day Indicators Agent | LangChain agent performing interpretation |
| Simple Memory | Short-term session continuity |

## 🛠️ Setup Instructions
1. Import the workflow into n8n and name it `Tesla_1day_Indicators_Tool`.
2. Add the required credentials:
   - Alpha Vantage Premium (via HTTP Query Auth)
   - OpenAI GPT-4.1 (Chat Model)
3. Install the webhook fetcher:
   - Required: Tesla Quant Technical Indicators Webhooks Tool
   - The `/1dayData` endpoint must be active
4. Execution context: this tool is only triggered via the 👉 Tesla Financial Market Data Analyst Tool. Inputs expected (a sample payload appears at the end of this section):
   - `message`: optional context
   - `sessionId`: session memory linkage

## 📌 Sticky Notes Overview
- 📘 Tesla 1-Day Indicators Tool – purpose and integration
- 📡 Webhook Fetcher – pulls daily Alpha Vantage data via HTTPS
- 🧠 GPT-4.1 Model – reasoning for trend classification
- 🔗 Sub-Agent Trigger – used only by the Financial Market Analyst
- 🧠 Memory Buffer – ensures consistent session logic

## 🔒 Licensing & Support
© 2025 Treasurium Capital Limited Company. This workflow—including prompts, logic, and formatting—is protected IP.
🔗 Don Jayamaha – LinkedIn
🔗 Creator Profile

🚀 Evaluate long-term Tesla price behavior with AI-enhanced technical analysis—critical for swing trading strategy. Required by the Tesla Financial Market Data Analyst Tool.
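For wiring the parent agent, a minimal Execute Workflow input matching the two fields listed under Setup might look like this (both values are illustrative):

```json
{
  "message": "Assess TSLA daily trend ahead of earnings",
  "sessionId": "tsla-analyst-session-001"
}
```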
by GYANENDRA DWIVEDI
🚀 WhatsApp Automation Template
Designed & Developed by Infridet Solutions Private Limited

## 🔧 Objective
Automate your lead nurturing and sales process from YouTube/Instagram → Landing Page → CRM → Email → WhatsApp → Sales → Deal Closure using tools like:

- 🌐 WordPress (Landing Page + Fluent Forms)
- 🧾 Google Sheets (Backup Log)
- 📩 FluentCRM (Lead Tagging + Email Sequences)
- 💬 Whinta.com (WhatsApp Messaging API)
- ⚙️ n8n (Workflow Automation Engine)

## 🧩 System Flow Overview
1. Lead Source: YouTube or Instagram CTA
2. Landing Page: built on WordPress with a story-driven design
3. Form Capture: Fluent Forms with dynamic input fields
4. Data Sync:
   - Backup to Google Sheets
   - Push lead to FluentCRM and tag as New Lead
5. Email Sequence:
   - Warm-up emails (1 to 5)
   - Introduce offer or service
6. WhatsApp Outreach:
   - Send personalized message via Whinta
   - Triggered 1 hour after form fill or last email
7. Sales Follow-Up:
   - Sales team handles replies manually
   - CRM tag updated to Customer upon closing

## 📁 Folder Structure (Optional Git/Zip File)
```
📦 WhatsApp-Automation-Infridet/
│
├── whatsapp-automation-n8n.json   # N8N Flowchart Import File
├── email-templates.docx           # Warm-up Email Scripts
├── whinta-api-integration.pdf     # API Documentation
├── crm-tagging-notes.txt          # CRM Tag Setup Details
└── readme.md                      # This Instruction File
```

## 🛠️ Required Integrations & Setup
✅ **Fluent Forms (WordPress)**
- Embed form with Name, Email, Phone
- Enable webhook to n8n: `/lead-capture`

✅ **Google Sheets**
- Use the `n8n-nodes-base.googleSheets` node
- Capture name, email, phone, source, timestamp

✅ **FluentCRM**
- REST API enabled
- Push contact and assign tag New Lead
- Set up email automation via tag trigger

✅ **SMTP Email (Optional)**
- Use Gmail SMTP or Brevo
- Trigger email on form submission

✅ **Whinta.com (WhatsApp API)**
- Send a POST request
- Payload includes phone, message, sender_id (a sample payload appears at the end of this section)
- Customize the message with personalization

## 💬 Sample WhatsApp Message
> Hey {{name}}, Gyan here from Account Craft 👋
> I saw your form submission – would you like help in starting your YouTube journey this week?
> Let me know. I'm just one text away. ✅

## 📧 Sample Email (Warmup Day 1)
> Subject: Welcome to Account Craft 🚀
> Body:
> Hi {{name}},
>
> I'm Gyan from Account Craft. Thanks for joining us!
> Here's what's coming next: exclusive videos, personalized tips, and real support to get your YouTube channel earning.
>
> Let's go!
> – Gyan

## 🔁 CRM Tag Updates
| Action | Tag Assigned |
| ----------------- | ------------ |
| On form fill | New Lead |
| After WhatsApp | Engaged |
| After sale closed | Customer |

## 📌 Final Output
Once completed, the system will:
- Log all leads into a database
- Automatically send emails and WhatsApp messages
- Notify your sales team
- Update lead status without manual entry

> Automation Template Designed & Deployed by
> Infridet Solutions Private Limited
> Smart Integrations. Seamless Business.
> 🌐 www.infridetsolutions.com | 📞 +91-8853354829
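Since only the field names phone, message, and sender_id are specified above, treat the request body below as a sketch; confirm the endpoint, header names, and any extra required fields against the whinta-api-integration.pdf bundled with the template:

```json
{
  "phone": "+919876543210",
  "message": "Hey Rahul, Gyan here from Account Craft 👋 I saw your form submission – would you like help starting your YouTube journey this week?",
  "sender_id": "accountcraft"
}
```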
by Yang
## 👥 Who is this for?
This workflow is ideal for virtual assistants, researchers, developers, automation specialists, and data analysts who need to regularly extract and organize structured product information (like books) from a website. It's especially useful for those working with catalog-based websites who want to automate the extraction and delivery of clean, sorted data.

## 🧩 What problem is this solving?
Manually copying product listings like book titles and prices from a website into a spreadsheet is slow and repetitive. This automation solves that problem by scraping content using Dumpling AI, extracting the right data using CSS selectors, and formatting it into a clean CSV file that is sent to your email—all triggered automatically when a new URL is added to Google Sheets.

## ⚙️ What this workflow does
This template automates an entire content scraping and delivery process:

1. Watches a Google Sheet for new URLs
2. Scrapes the HTML content of the given webpage using Dumpling AI
3. Uses CSS selectors in the HTML node to extract each book from the page
4. Splits the HTML array into individual items
5. Extracts the book title and price from each HTML block
6. Sorts the books in descending order based on price
7. Converts the sorted data to a CSV file
8. Sends the CSV via email using Gmail

## 🛠️ Setup
**Google Sheets**
- Create a sheet titled something like `URLs`
- Add your product listing URLs (e.g., http://books.toscrape.com)
- Connect the Google Sheets trigger node to your sheet
- Ensure you have proper credentials connected

**Dumpling AI**
- Create an account at Dumpling AI
- Generate your API key
- Set the HTTP method to POST and pass the URL dynamically from the Google Sheet
- Use Header Auth to include your API key in the request header
- Make sure `"cleaned": "True"` is included in the body for optimized HTML output (a sample request body appears at the end of this section)

**HTML Node**
- The first HTML node extracts the main book container blocks using: `.row > li`
- The second HTML node parses out the individual fields:
  - title: `h3 > a` (via the title attribute)
  - price: `.price_color`

**Sort Node**
- Sorts books by price in descending order
- Note: price is extracted as a string; ensure it's parsable if you plan to use numeric filtering later

**Convert to CSV**
- The JSON data is passed into a Convert node and transformed into a CSV file

**Gmail**
- Sends the CSV as an attachment to a designated email

## 🔄 How to customize this workflow
- **Extract more data**: add more CSS selectors in the second HTML node to pull fields like author, availability, or product links
- **Switch destinations**: replace Gmail with Slack, Google Drive, Dropbox, or another platform
- **Adjust sorting**: sort alphabetically or based on another extracted value
- **Use a different source**: as long as the site structure is consistent, this can scrape any listing-like page
- **Trigger differently**: use a webhook, form submission, or schedule trigger instead of Google Sheets

## ⚠️ Dependencies and Notes
- This workflow uses Dumpling AI to perform the web scraping. This requires an API key and uses credits per request.
- The HTML node depends on valid CSS selectors. If the site layout changes, the selectors may need to be updated.
- Ensure you're not scraping content from websites that prohibit automated scraping.
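Putting the Dumpling AI step together, the POST body might look like the sketch below. Only the url value (pulled dynamically from the sheet) and the "cleaned": "True" flag come from the setup notes above; the exact endpoint and any additional options should be verified against Dumpling AI's API docs:

```json
{
  "url": "http://books.toscrape.com",
  "cleaned": "True"
}
```

The API key itself goes in a request header via n8n's Header Auth credential rather than in this body.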
by explorium
Explorium Event-Triggered Outreach

This n8n, agent-based workflow automates outbound prospecting by monitoring Explorium event data (e.g., product launches, new office openings, new investments, and more), researching companies, identifying key contacts, and generating tailored sales emails, leveraging the Explorium MCP server.

## Template Workflow Overview

**Node 1: Webhook Trigger**
Purpose: listens for real-time product launch events pushed from Explorium's webhook system.
How it works:
- Explorium sends HTTP POST requests containing event data
- The webhook payload includes company name, business ID, domain, product name, and event type (an illustrative payload appears after the node overview)

Pay attention: product launch is just one example; you can easily enroll in many more meaningful events. To learn about events and how to enroll in them, visit the events documentation.

**Node 2: Company Research Agent**
Agent type: Tools Agent
Purpose: enrich company data after an event occurs.
How it works:
- Uses Explorium MCP via the MCP Client tool to gather additional company data
- Uses Anthropic Claude (Chat Model) to process and interpret company information for downstream personalization

**Node 3: Employee Data Retrieval**
Purpose: retrieve prospect-level data for targeting.
How it works:
- Uses an HTTP Request node to call Explorium's fetch_prospects endpoint
- Filters prospects by:
  - Company business_id
  - Departments: Product, R&D, etc.
  - Seniority levels: owner, cxo, vp, director, senior, manager, partner, etc.
- Pay attention: follow our fetch_prospects documentation for the full list of filters and best practices.
- Limits results to the top 5 relevant employees
- Code nodes handle:
  - Filtering logic
  - Cleaning the API response
  - Formatting data for downstream agents

**Node 4: Conditional Branch - Prospect Data Check**
If node: checks whether prospect data was successfully retrieved.
Logic:
- If prospects found → personalized emails per person
- If no prospects → fall back to a company-level general email

**Node 5A: Email Writer #1 (No Prospect Data)**
Agent type: Tools Agent
Purpose: write a generic outbound email using only company-level research and event info.
Powered by: Anthropic Chat Model

**Node 5B: Loop Over Prospects → Email Writer #2 (Personalized)**
Agent type: Tools Agent
Purpose: write a highly personalized email for each identified employee.
How it works:
- Loops through each individual prospect
- Passes company research + employee data to the LLM agent
- Generates customized emails referencing:
  - Prospect's title & department
  - The product launch
  - A role-relevant Explorium value proposition

**Node 6: Slack Notifications**
Purpose: posts completed emails to an internal Slack channel for review or testing before final deployment.
Future state: can be swapped with an email sequencing platform in production.
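For orientation, an event payload carrying the fields listed under Node 1 might be shaped like this; the field names and values are illustrative, so check the events documentation for the actual schema:

```json
{
  "event_type": "product_launch",
  "business_id": "8adce3ca1cef0c986665a5e383d0b19d",
  "company_name": "Acme Analytics",
  "domain": "acmeanalytics.com",
  "product_name": "Acme Insights 2.0"
}
```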
## Setup Requirements
**Explorium API Access**
- MCP Client credentials for company enrichment and prospect fetching
- A registered webhook for event listening
- Get an Explorium API key

**n8n Configuration**
- Secure environment variables for API keys & webhook secret
- Code nodes configured for JSON transformation, filtering & signature validation

## Customization Options
**Personalization Logic**
- Update LLM prompt instructions to reflect ICP priorities
- Modify email templates based on role, department, or tenure logic
- Adjust fallback behavior when prospect data is unavailable

**API Request Tuning**
- Adjust page_size for the number of prospects retrieved (a sample request body appears after the troubleshooting tips)
- Fine-tune seniority and department filters to match evolving targeting

**Future Expansion**
- Swap Slack notifications for outbound email automation
- Integrate call task assignment directly into the CRM
- Introduce an engagement scoring feedback loop (opens, clicks, replies)

## Troubleshooting Tips
- Validate webhook signature matching to prevent unauthorized requests
- Ensure the correct business_id is passed to the prospect fetching endpoint
- Confirm business enrichment returns sufficient data for the company researcher agent
- Review agent LLM responses for correct output structure and parsing consistency
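A fetch_prospects request reflecting the Node 3 filters and the page_size tuning above might look roughly like this; treat the body as a sketch and defer to Explorium's fetch_prospects documentation for the exact parameter names:

```json
{
  "business_id": "8adce3ca1cef0c986665a5e383d0b19d",
  "filters": {
    "department": ["Product", "R&D"],
    "seniority": ["owner", "cxo", "vp", "director", "senior", "manager", "partner"]
  },
  "page_size": 5
}
```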
by Lucas Walter
## Who's it for
Content creators, social media managers, and marketing teams who want to automatically extract the most engaging clips from long-form YouTube videos and identify content with high viral potential.

## What it does
This workflow analyzes any YouTube video using Vizard AI's clipping technology and automatically generates up to 8 short clips with viral score ratings. It then filters for the highest-scoring clips (9/10 or above) and posts them to a designated Slack channel for team review and distribution.

## How it works
1. **Video submission**: enter a YouTube URL through a user-friendly form
2. **AI analysis**: submits the video to Vizard AI for automated clipping and viral score analysis
3. **Smart polling**: waits for processing completion and retrieves results
4. **Quality filtering**: only surfaces clips with viral scores of 9/10 or higher
5. **Team notification**: posts results to Slack with clip titles, scores, and download links

## Requirements
- Vizard AI API credentials (sign up at vizard.ai)
- Slack workspace with an OAuth app configured

## How to set up
1. **Configure Vizard AI credentials**: add your Vizard AI API key to the HTTP Request nodes
2. **Set up Slack integration**: configure the Slack OAuth2 credentials and select your target channel
3. **Customize filtering**: adjust the viral score threshold in the filter node (currently set to 9/10)
4. **Test the workflow**: submit a test YouTube URL to ensure everything works properly

## How to customize the workflow
- **Adjust clip quantity**: modify the maxClipNumber parameter (currently 8) in the initial API request (a sample request body follows below)
- **Change viral score threshold**: update the filter condition to match your quality standards
- **Extend with automation**: connect to social media posting tools or caption generation workflows for full automation
- **Add scheduling**: integrate with webhook triggers, scheduled triggers, or RSS feeds for batch processing videos
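As a point of reference, the initial Vizard request body might look like the sketch below. maxClipNumber is the parameter named above; the videoUrl field name and the rest of the schema are assumptions to verify against Vizard AI's API reference:

```json
{
  "videoUrl": "https://www.youtube.com/watch?v=abc123xyz",
  "maxClipNumber": 8
}
```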
by WeblineIndia
Smart Document Parser for Invoices, Logs or Sensor Reports (PDF/Image to Google Sheets)

This n8n workflow automatically parses documents such as invoices, sensor logs or structured PDFs/images (including scanned docs or CSVs), extracts key fields like totals, dates and customer/vendor info using OCR and AI, and writes the structured output into Google Sheets.

## Who's it for
- Finance or Ops teams automating invoice processing
- SaaS platforms parsing uploaded reports or documents
- Anyone needing a no-code backend for PDF/image/CSV document parsing
- AI-powered data capture pipelines

## How it works
1. A Webhook Trigger receives file uploads (`/uploadDoc`)
2. A Switch node checks the file type:
   - If image → use Tesseract OCR
   - If PDF → use the PDF parser
   - If CSV → extract as-is
3. The extracted text is passed to a Google Gemini or Gemini Flash AI model
4. The prompt extracts fields like invoice_id, total, customer_name, etc. (an illustrative output appears at the end of this section)
5. The JSON string is parsed and cleaned
6. Data is appended to Google Sheets using appendOrUpdate

## How to set up
1. Create a Google Sheet with columns like: invoice_id, invoice_date, due_date, customer_name, vendor_name, subtotal, tax_total, total, currency
2. Connect:
   - Google Sheets OAuth
   - Google Gemini (PaLM API key) for LLM parsing
3. Deploy the webhook endpoint: `/uploadDoc`
4. Upload sample files (PDFs, images, CSVs) to test
5. Review and map sheet columns in the Invoice Data node

## Requirements
| Tool | Purpose |
| --- | --- |
| n8n | Automation framework |
| Google Sheets | To store structured output |
| Tesseract OCR | For scanned image text extraction |
| Google Gemini | For natural language parsing |

## How to customize
- Add extraction for line items using structured prompts
- Change the prompt to extract sensor readings, log types, or custom keys
- Add support for other file types (e.g., XLSX, DOCX)
- Add Slack/Email notifications on success/failure
- Swap Gemini with OpenAI or Hugging Face if preferred

## Add-ons
- Save uploaded files to Google Drive or S3
- Add auth for secure uploads
- Use charting/dashboard nodes to visualize extracted data
- Integrate with billing/accounting software

## Use Case Examples
| Scenario | What Happens |
| --- | --- |
| Invoice upload (PDF) | Extracts totals, customer, tax data into a Google Sheet |
| Scanned receipt (image) | OCR + LLM extracts structured data |
| Log file (CSV) | Parses and logs entries into Sheets |

## Common troubleshooting
| Issue | Possible Cause | Solution |
| --- | --- | --- |
| Webhook not triggered | URL or method mismatch | Use the correct POST URL `/uploadDoc` |
| Text is blank | OCR failed | Check image quality or Tesseract config |
| Gemini model not returning JSON | Prompt formatting issue | Ensure the prompt ends with a valid JSON schema |
| Sheet not updated | Invalid sheet ID or tab | Double-check sheet credentials and tab name |

## Need Help?
Need help fine-tuning the Gemini prompt for better field accuracy? Want to extract full tables, multi-page invoices or convert PDFs to JSON lines? Our automation team at WeblineIndia can help you extend this into a full-blown document automation pipeline.
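To make step 4 of the flow concrete, a Gemini response for an invoice would be a flat JSON object keyed by the sheet columns listed in setup; the keys below come from that list, while the values are purely illustrative:

```json
{
  "invoice_id": "INV-2024-0042",
  "invoice_date": "2024-03-15",
  "due_date": "2024-04-14",
  "customer_name": "Jane Doe",
  "vendor_name": "Acme Supplies Ltd",
  "subtotal": 120.00,
  "tax_total": 21.60,
  "total": 141.60,
  "currency": "USD"
}
```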
by VKAPS IT
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## 🎯 How it works
This workflow captures new lead information from a web form, enriches it with Apollo.io data, qualifies the lead using AI, and—if the lead is strong—automatically sends a personalized outreach email via Gmail and logs the result in Google Sheets.

## 🛠️ Key Features
- 📩 Lead form capture with validation
- 🔍 Enrichment via the Apollo API
- 🤖 Lead scoring using AI (LangChain + Groq)
- 📧 Dynamic email generation & sending via Gmail
- 📊 Logging leads with job title & org into Google Sheets
- ✅ Conditional email sending (score ≥ 6 only; a sample score output appears at the end of this section)

## 🧪 Set up steps
Estimated time: 15–20 minutes

1. Add your Apollo API key to the HTTP Header credential (never hardcode!)
2. Connect your Gmail account for sending emails
3. Connect your Google Sheets account and set up the correct spreadsheet & sheet name
4. Enable LangChain/Groq credentials for lead scoring and AI-generated emails
5. Update the form endpoint to your live webhook if needed

## 📌 Sticky Notes
Add the following mandatory sticky notes inside your workflow:

- **FormTrigger node**: "Collects lead info via form. Ensure your form is connected to this endpoint."
- **HTTP Request node**: "Enrich lead using Apollo.io API. Add your API key via header-based authentication."
- **AI Agent (Lead Score)**: "Scores lead from 1-10 based on job title and industry match. Only leads with score ≥ 6 proceed."
- **AI Agent (Email Composer)**: "Generates a concise, polite email using lead's job title & company. Modify tone if needed."
- **Google Sheets Append**: "Logs enriched lead with job title, org, and LinkedIn URL. Customize sheet structure if needed."
- **Gmail node**: "Sends personalized outreach email if lead passes score threshold. Uses AI-generated content."

## 💸 Free or Paid?
Free – no paid API services are required (Apollo has a free tier).
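For the conditional check to work, the scoring agent should return a machine-readable score. The exact structure depends on your LangChain prompt and output parser, so the shape below is only a sketch of one workable convention:

```json
{
  "score": 8,
  "reason": "VP of Engineering at a mid-size SaaS company in the target industry"
}
```

The If node would then compare `score` against the threshold of 6.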
by InfoGrab
This is a chatbot that responds in public channels through slash commands. I explain it in more detail in a YouTube video, but the video is only available in Korean.

## How it works
When you invoke the slash command you created in Slack, the request arrives at the webhook (an example of the payload Slack sends appears at the end of this section). The Switch node then branches according to each slash command's request. Here, a slash command called /ask is connected to the chatbot, and the chatbot generates an answer to the question asked. The final node responds to the channel.

## Set up steps
1. Create a Slack app.
2. Add the chat:write permission under Slack OAuth & Permissions > Scopes.
3. Create a Command in the Slack Slash Commands menu and enter the n8n Webhook node's URL.
4. Complete creating the Slash Command.
5. Enter the created command in the Switch node.
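When a user runs /ask, Slack sends a form-encoded POST to the webhook; the fields most relevant to this workflow are shown below as JSON for readability (values are illustrative):

```json
{
  "command": "/ask",
  "text": "What time is the team standup today?",
  "user_id": "U0123ABCD",
  "channel_id": "C0456EFGH",
  "response_url": "https://hooks.slack.com/commands/T0001/12345/abcde"
}
```

The Switch node routes on `command`, and the chatbot answers the question carried in `text`.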
by Paul
Gmail AI Email Manager - Setup Guide

## 🎯 Workflow Overview
This workflow creates an intelligent Gmail email manager that can:
- Monitor incoming emails via webhook
- Analyze email content using AI
- Categorize emails automatically
- Generate smart responses
- Take actions based on email content
- Send notifications for important emails

## 📋 Pre-Setup Checklist
Before building the workflow, gather the necessary information and validate the approach.

**Phase 1: Discovery & Planning**
- [ ] Search for Gmail nodes
- [ ] Find AI analysis nodes
- [ ] Identify webhook trigger options
- [ ] Check notification nodes

**Phase 2: Configuration Requirements**
- [ ] Gmail API credentials
- [ ] AI service (OpenAI/Claude) API key
- [ ] Webhook URL setup
- [ ] Email classification rules

## 🔧 Setup Instructions
**Step 1: Gmail API Setup**
1. Go to the Google Cloud Console
2. Create a new project or select an existing one
3. Enable the Gmail API
4. Create OAuth 2.0 credentials
5. Add the authorized redirect URI: `https://your-n8n-instance.com/rest/oauth2-credential/callback`

**Step 2: AI Service Setup**
Choose one of the following:
- **OpenAI**: get an API key from platform.openai.com
- **Claude**: get an API key from console.anthropic.com
- **Local AI**: set up Ollama or similar

**Step 3: n8n Credentials**
1. Gmail OAuth2: add client ID, secret, and scopes
2. AI service: add the API key
3. Webhook: configure the webhook URL

## 🔧 Quick Setup Checklist
**1. Google Cloud Console**
- [ ] Enable the Gmail API
- [ ] Create OAuth2 credentials
- [ ] Add redirect URI: `https://your-n8n.com/rest/oauth2-credential/callback`
- [ ] Set up Gmail push notifications with Pub/Sub (an example of the decoded notification appears at the end of this guide)

**2. API Keys**
- [ ] Get an OpenAI API key from platform.openai.com
- [ ] Create a Google Sheet for logging (optional)

**3. n8n Credentials**
- [ ] Gmail OAuth2: client ID, secret, scopes: `gmail.readonly,gmail.modify,gmail.compose`
- [ ] OpenAI API: your API key

**4. Gmail Labels (create these)**
- [ ] URGENT (red)
- [ ] IMPORTANT (orange)
- [ ] PROMOTIONAL (purple)
- [ ] PERSONAL (green)
- [ ] WORK (blue)
- [ ] SPAM (gray)

**5. Update Workflow Values**
- [ ] High Priority Alert: change the notification email
- [ ] Spreadsheet Log: update the sheet ID (if using)
- [ ] Webhook: copy the URL after saving the workflow

**6. Test**
- [ ] Save & activate the workflow
- [ ] Send a test email to Gmail
- [ ] Check the execution log
- [ ] Verify auto-categorization works

That's it! Your AI email manager is ready! 🚀
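Once Gmail push notifications are connected to Pub/Sub, each push delivers a wrapped message whose base64-encoded `data` field decodes to a small JSON object identifying the mailbox and history position; a representative decoded payload:

```json
{
  "emailAddress": "you@example.com",
  "historyId": 9876543
}
```

The workflow then uses `historyId` with Gmail's history and message endpoints to fetch the new email's content for analysis.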
by Oneclick AI Squad
This n8n workflow automates the process of scraping LinkedIn profiles using the Apify platform and organizing the extracted data into Google Sheets for easy analysis and follow-up.

## Use Cases
- **Lead Generation**: extract contact information and professional details from LinkedIn profiles
- **Recruitment**: gather candidate information for talent acquisition
- **Market Research**: analyze professional networks and industry connections
- **Sales Prospecting**: build targeted prospect lists with detailed professional information

## How It Works
**1. Workflow Initialization & Input**
- **Webhook Start Scraper**: triggers the entire scraping workflow
- **Read LinkedIn URLs**: retrieves LinkedIn profile URLs from Google Sheets
- **Schedule Scraper Trigger**: sets up automated scheduling for regular scraping

**2. Data Processing & Extraction**
- **Data Formatting**: prepares and structures the LinkedIn URLs for processing
- **Fetch Profile Data**: makes HTTP requests to the Apify API with profile URLs (a sample actor input appears at the end of this section)
- **Run Scraper Actor**: executes the Apify LinkedIn scraper actor
- **Get Scraped Results**: retrieves the extracted profile data from Apify

**3. Data Storage & Completion**
- **Save to Google Sheets**: stores the scraped profile data in an organized spreadsheet format
- **Update Progress Tracker**: updates workflow status and progress tracking
- **Process Complete Wait**: ensures all operations finish before the final steps
- **Send Success Notification**: alerts users when scraping completes successfully

## Requirements
**Apify Account**
- Active Apify account with sufficient credits
- API token for authentication
- Access to the LinkedIn Profile Scraper actor

**Google Sheets**
- Google account with Sheets access
- Properly formatted input sheet with LinkedIn URLs
- Credentials configured in n8n

**n8n Setup**
- HTTP Request node credentials for Apify
- Google Sheets node credentials
- Webhook endpoint configured

## How to Use
**Step 1: Prepare Your Data**
- Create a Google Sheet with LinkedIn profile URLs
- Ensure the sheet has a column named `linkedin_url`
- Add any additional columns for metadata (name, company, etc.)

**Step 2: Configure Credentials**
- Set up Apify API credentials in n8n
- Configure Google Sheets authentication
- Update the webhook endpoint URL

**Step 3: Customize Settings**
- Adjust scraping parameters in the Apify node
- Modify the data fields to extract based on your needs
- Set up notification preferences

**Step 4: Execute Workflow**
- Trigger via webhook or manual execution
- Monitor progress through the workflow
- Check Google Sheets for scraped data
- Review completion notifications

## Good to Know
- **Rate Limits**: LinkedIn scraping is subject to rate limits. The workflow includes delays to respect these limits.
- **Data Quality**: results depend on profile visibility and LinkedIn's anti-scraping measures.
- **Costs**: Apify charges based on compute units used. Monitor your usage to control costs.
- **Compliance**: ensure your scraping activities comply with LinkedIn's Terms of Service and applicable laws.
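When the workflow starts an actor run, it posts an input object to Apify listing the profile URLs read from the sheet. Input field names vary between scraper actors, so the body below is a sketch to check against your chosen actor's input schema:

```json
{
  "profileUrls": [
    "https://www.linkedin.com/in/example-person/",
    "https://www.linkedin.com/in/another-profile/"
  ]
}
```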
## Customizing This Workflow
**Enhanced Data Processing**
- Add data enrichment steps to append additional information
- Implement duplicate detection and merge logic
- Create data validation rules for quality control

**Advanced Notifications**
- Set up Slack or email alerts for different scenarios
- Create detailed reports with scraping statistics
- Implement error recovery mechanisms

**Integration Options**
- Connect to CRM systems for automatic lead creation
- Integrate with marketing automation platforms
- Export data to analytics tools for further analysis

## Troubleshooting
**Common Issues**
- **Apify Actor Failures**: check API limits and actor status
- **Google Sheets Errors**: verify permissions and sheet structure
- **Rate Limiting**: implement longer delays between requests
- **Data Quality Issues**: review scraping parameters and target profiles

## Best Practices
- Test with small batches before scaling up
- Monitor Apify credit usage regularly
- Keep backup copies of your data
- Regularly validate the accuracy of scraped information
by Don Jayamaha Jr
🕒 Evaluate Tesla (TSLA) price action and market structure on the 1-hour timeframe using 6 real-time indicators. This sub-agent is designed to feed mid-term technical insights into the Tesla Financial Market Data Analyst Tool. It uses GPT-4.1 to interpret Alpha Vantage indicator data delivered via secure webhooks.

⚠️ This workflow is not standalone and is executed via Execute Workflow.

🔌 Requires:
- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

## 🔧 Connected Indicators
This tool fetches and analyzes the latest 20 datapoints for:

- **RSI** (Relative Strength Index)
- **MACD** (Moving Average Convergence Divergence)
- **BBANDS** (Bollinger Bands)
- **SMA** (Simple Moving Average)
- **EMA** (Exponential Moving Average)
- **ADX** (Average Directional Index)

## 📋 Sample Output
```json
{
  "summary": "TSLA is gaining strength on the 1-hour chart. RSI is rising, MACD has crossed bullish, and BBANDS are widening.",
  "timeframe": "1h",
  "indicators": {
    "RSI": 62.1,
    "BBANDS": { "upper": 176.90, "lower": 169.70, "middle": 173.30, "close": 176.30 },
    "SMA": 174.20,
    "EMA": 175.60,
    "ADX": 27.5,
    "MACD": { "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  }
}
```

## 🧠 Agent Components
| Component | Role |
| --- | --- |
| 1hour Data | Pulls Alpha Vantage indicator data via webhook |
| Tesla 1hour Indicators Agent | Interprets signals using a structured GPT-4.1 prompt |
| OpenAI Chat Model | GPT-4.1 LLM performs the analysis |
| Simple Memory | Maintains session context |

## 🛠️ Setup Instructions
1. Import the workflow into n8n and name it `Tesla_1hour_Indicators_Tool`.
2. Install the webhook fetcher tool:
   - 👉 Required: Tesla_Quant_Technical_Indicators_Webhooks_Tool
   - This agent expects the `/1hourData` webhook to return pre-cleaned data (an illustrative response shape appears at the end of this section)
3. Add credentials:
   - Alpha Vantage Premium API Key (via HTTP Query Auth)
   - OpenAI GPT-4.1 credentials
4. Configure for sub-agent use. Triggered only via Execute Workflow from:
   - 👉 Tesla Financial Market Data Analyst Tool
   - Inputs: `message` (optional), `sessionId` (required for memory linkage)

## 📌 Sticky Notes Overview
- 🟢 Trigger Setup – activated only by the parent agent
- 📊 1h Webhook Fetcher – calls Alpha Vantage via a secured endpoint
- 🧠 AI Agent Summary – interprets trend/momentum from indicator data
- 🔗 GPT Model Notes – GPT-4.1 parses and explains technical alignment
- 📘 Documentation Sticky – embedded in the canvas with a full walkthrough

## 🔐 Licensing & Support
© 2025 Treasurium Capital Limited Company. This tool is part of a proprietary multi-agent AI architecture. No commercial reuse or redistribution permitted.
🔗 Author: Don Jayamaha
🔗 Templates: https://n8n.io/creators/don-the-gem-dealer/

🚀 Detect TSLA trend shifts and validate setups with 1-hour technical clarity—powered by Alpha Vantage + GPT-4.1. This tool is required by the Tesla Financial Market Data Analyst Tool.
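Since the `/1hourData` webhook returns trimmed series of the latest 20 datapoints per indicator, its response might be shaped roughly as below (truncated to two datapoints per series for brevity; the exact schema is defined by the Webhooks Tool, so treat this as an assumption):

```json
{
  "timeframe": "1h",
  "RSI": [
    { "timestamp": "2025-05-20T13:00:00Z", "value": 58.4 },
    { "timestamp": "2025-05-20T14:00:00Z", "value": 62.1 }
  ],
  "MACD": [
    { "timestamp": "2025-05-20T14:00:00Z", "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  ]
}
```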