by Rahul Joshi
**Description:** Stay ahead of payment disputes with this automated n8n workflow that integrates Stripe, Slack, and ClickUp. Perfect for finance teams, payment ops specialists, and SaaS businesses, this template fetches disputes directly from Stripe, analyzes urgency, and instantly notifies your team with rich, formatted alerts. High-priority disputes are flagged, pushed into Slack for immediate visibility, and tracked in ClickUp with due dates aligned to Stripe evidence deadlines—ensuring no dispute ever slips through the cracks. For lower-priority or resolved cases, the workflow provides concise updates and maintains an audit trail. No more manual Stripe checks, late responses, or missed deadlines—this workflow turns dispute management into a proactive, structured process.

**What This Template Does (Step-by-Step):**

🟢 **Trigger: Manual or Scheduled Execution.** Run the workflow on demand or schedule it (e.g., every 4 hours).

📥 **Fetch Stripe Disputes.** Calls the Stripe API to retrieve all active disputes in your account.

📊 **Validate & Format Data.** Ensures disputes exist, then enriches data with formatted amounts, deadlines, and customer info.

⚖️ **Priority Logic.** Determines urgency based on dispute status, evidence deadlines, and transaction amount.

🚨 **High Priority Path**
• Sends urgent Slack alert with full dispute details
• Creates a high-priority ClickUp task with due dates
• Flags immediate action required

📋 **Standard Path**
• Sends standard Slack alert for non-urgent cases
• Creates a ClickUp task with appropriate priority levels

ℹ️ **No Disputes Path.** Sends a Slack summary confirming no new disputes, maintaining a clear audit trail.

✅ **Workflow Completion.** Confirms all disputes are processed, logged, and assigned—ready for your team to take action.
**Required Integrations:**
✅ Stripe API (for dispute data)
✅ Slack API (for team alerts)
✅ ClickUp API (for task management)

**Perfect For:**
💳 FinOps and payment operations teams monitoring transactions
🏢 SaaS platforms or e-commerce handling large payment volumes
🛡️ Risk and compliance teams tracking disputes and deadlines
📈 Businesses scaling customer payment handling and case management

**Why Use This Template?**
✔️ Never miss a dispute deadline
✔️ Automated priority assessment saves hours of manual checks
✔️ Seamlessly integrates alerts + task tracking
✔️ Provides full visibility and accountability for dispute resolution
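The Priority Logic step can be sketched as an n8n Code node. This is a minimal sketch, not the template's exact rules: the 48-hour deadline window and $500 amount cut-off are illustrative assumptions, while the input shape follows Stripe's dispute object (`status`, `amount` in cents, `evidence_details.due_by` as a Unix timestamp).

```javascript
// Classify a Stripe dispute as high or standard priority.
// Assumption: the 48h deadline window and $500 threshold are illustrative.
function classifyDispute(dispute, now = Date.now()) {
  const hoursToDeadline =
    (dispute.evidence_details.due_by * 1000 - now) / 3600000;
  const amountUsd = dispute.amount / 100; // Stripe amounts are in cents

  const urgentStatus = ['needs_response', 'warning_needs_response']
    .includes(dispute.status);
  const deadlineSoon = hoursToDeadline <= 48;
  const largeAmount = amountUsd >= 500;

  const highPriority = urgentStatus && (deadlineSoon || largeAmount);
  return {
    priority: highPriority ? 'high' : 'standard',
    hoursToDeadline: Math.round(hoursToDeadline),
    amountUsd,
  };
}
```

The returned `priority` field would drive the workflow's IF node toward the High Priority or Standard path.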
by Cj Elijah Garay
📋 WORKFLOW OVERVIEW

**Automate reactions for Telegram channel posts.** An automated Telegram reaction system for specific posts.

**Flow:**
1. User sends a message to a receiver bot
2. AI parses the request (emoji type & quantity)
3. A Code node processes and validates the data
4. A loop sends reactions one by one
5. User receives a confirmation

**Key Features:**
- Natural-language control: send a message to the receiver bot to react to a post on a different channel
- Rotates through bot tokens. If you use 100 bots, you can add up to 100 reactions per post of your choice
- Rate limit protection
- Error handling with helpful messages

**Setup:** First add the bots you personally own (created via BotFather) to the channel you want them to react in, and allow them to manage messages.

**Required Bot Permissions:** The bot must be an administrator. The bot needs to be added as an admin to the channel (regular member status won't work for reactions). When adding the bot as admin, enable:
✅ "Post Messages" (this is the key permission needed)
✅ "Add Subscribers" (optional, but sometimes required depending on channel settings)

**Credentials needed:** Target Channel ID, bot tokens, receiver bot token, OpenAI API key

**Example usage:** "https://t.me/channel/123 needs 10 hearts and 10 fire reactions"

If you need help, contact me at: elijahmamuri@gmail.com
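The token-rotation loop can be sketched as an n8n Code node that prepares one Telegram Bot API `setMessageReaction` call per bot token; a downstream HTTP Request node would then execute each request. The helper name is illustrative, not part of the template.

```javascript
// Build one setMessageReaction request per bot token (token rotation).
// Each bot can add one reaction, so N tokens => up to N reactions per post.
function buildReactionRequests(botTokens, chatId, messageId, emoji) {
  return botTokens.map((token) => ({
    url: `https://api.telegram.org/bot${token}/setMessageReaction`,
    body: {
      chat_id: chatId,
      message_id: messageId,
      reaction: [{ type: 'emoji', emoji }],
    },
  }));
}
```

In the workflow, the loop node would iterate this array with a short pause between items for rate-limit protection.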
by Zeljislav Petrovic
**Who is this template for?**
This workflow is built for PPC managers, digital marketing agencies, and in-house marketing teams who run Google Ads search campaigns and want to stop wasting budget on irrelevant queries. Instead of manually reviewing hundreds or thousands of search terms in spreadsheets, this template automates the entire analysis pipeline — from raw Google Ads data to a fully structured, AI-analyzed search terms report with actionable keyword exclusion and inclusion recommendations, saved directly to Google Sheets.

**What this workflow does**
This n8n workflow connects the Google Ads API, an OpenAI chat model, and Google Sheets to deliver a production-ready search term intelligence pipeline that runs automatically every 30 days — or on demand:

- **Pulls search term data directly from Google Ads** — using the Google Ads API (GAQL searchStream), the workflow fetches every search term that triggered an ad in the last 30 days, including campaign and ad group context, matched keyword, match type, search term status (Added / Excluded / None), and full performance metrics: impressions, clicks, CTR, cost, avg CPC/CPM, conversions, conversion rate, and cost per conversion
- **Cleans and formats the dataset for AI** — a Code node normalizes and filters raw API data: formats currency and percentage values, applies a metrics threshold filter, and structures the data as a clean input for the AI model
- **Runs a comprehensive AI analysis on every search term** — an OpenAI chat model (GPT-4.1-mini or any LLM supported by n8n) acts as a senior PPC strategist.
For each term, it outputs:
- **Intent signal** — Transactional, Commercial, Informational, or Navigational
- **Relevancy** — on a 5-level scale from Directly Relevant to Directly Irrelevant
- **Relevancy reasoning** — a specific, business-context-aware explanation referencing campaign fit, intent, and performance signals
- **Exclusion suggestion** — exact negative keyword with correct match type syntax (broad, phrase, or exact)
- **Exclusion level** — Account, Campaign, or Ad Group
- **Priority for exclusion** — High, Medium, Low, or N/A
- **Inclusion suggestion** — keyword to add with recommended match type
- **Inclusion type** — New keyword, New ad group, or New campaign
- **Inclusion level** — Account, Campaign, or Ad Group

The workflow also:
- **Generates a Recommended Global Exclusions list** — the AI distills patterns across the full dataset into a grouped negative keyword list, organized by theme (e.g. Informational/Zero-Intent Modifiers, Price/Deal Seeking Terms, Job Seeker Terms, Wrong Audience Terms), ready for direct upload to Google Ads shared lists
- **Produces an Inference Summary** — the AI documents its assumptions from the business context and search terms dataset and how they influenced the analysis
- **Saves everything to Google Sheets automatically** — a new spreadsheet is created on every run with three sheets: Search Terms Analysis, Global Exclusions List, and Inference Summary

**What you get in the output**
A new Google Spreadsheet is automatically created on every workflow run, named after your company and timestamped: "Search Terms Analysis & Keyword Exclusion/Inclusion Suggestions (Company Name — DD/MM/YYYY HH:MM)"

**📊 Search Terms Analysis**
The complete, AI-enriched analysis table. Every search term from the report is included — no filtering, no omissions.
All original Google Ads columns are preserved, with AI-generated columns appended:
- search_term, status, campaign, ad_group, matched_keyword, match_type
- impressions, clicks, ctr, cost, avg_cpc, avg_cpm, conversion_rate, cost_per_conversion
- intent_signal — Transactional / Commercial / Informational / Navigational
- relevancy — 5-level scale
- relevancy_reasoning — specific AI explanation per term
- exclusion_suggestion — negative keyword with match type syntax
- exclusion_level — Account / Campaign / Ad Group
- priority_for_exclusion — High / Medium / Low
- inclusion_suggestion — keyword to add with recommended match type
- inclusion_type — New keyword / New ad group / New campaign
- inclusion_level — Account / Campaign / Ad Group

**🚫 Global Exclusions List**
Grouped negative keywords identified as universally or near-universally problematic across campaigns. Each entry includes the negative keyword with correct match type formatting, its thematic group, and the rationale for account-level exclusion.

**📋 Inference Summary**
Documents any assumptions the AI made when business context fields were incomplete or ambiguous — useful for auditing the AI's reasoning and improving the Set node configuration over time.

**How to set up**
- **Google Ads API** — ensure you have a Developer Token in your Google Ads MCC account and set up OAuth 2.0 credentials with the required Google Ads API scopes
- **OpenAI** — add your OpenAI API key to the AI model node credentials
- **Google Sheets** — connect your Google account via OAuth 2.0.
A new spreadsheet is created automatically on each run — no manual sheet setup required.
- **Configure the Set node** — open the "Set Google Ads IDs & business context" node and fill in your Google Ads MCC ID (leave blank if accessing directly), Customer ID, Developer Token, and all business context fields: company name, website, industry, company overview, target audience, campaign goals, key offerings, and known irrelevant topics
- **Activate the workflow** — the schedule trigger runs automatically every 30 days. Use the manual trigger for on-demand analysis

**Requirements**
- n8n (cloud or self-hosted)
- Google Ads account with API access and a Developer Token
- Google Ads OAuth 2.0 credentials
- OpenAI API key (GPT-4.1-mini or higher recommended)
- Google account with Sheets access

**How to customize this workflow**
- **Change the AI model** — swap GPT-4.1-mini for Claude, Gemini, or any LLM supported by n8n
- **Adjust the date range** — modify the DURING LAST_30_DAYS clause in the GAQL query to use a custom range (e.g. last 7 days, last quarter, current month)
- **Change the schedule trigger interval** — edit the trigger interval to run the workflow weekly, monthly, or on any custom schedule
- **Change the metrics threshold** — the Clean & Format node has a metrics filter that excludes terms with fewer than 5 impressions and 0 clicks; adjust the threshold to match your account's activity level
- **Add Slack or email notifications** — append a notification node at the end to alert your team when a new analysis is ready
- **Customize the AI business context** — the quality of AI analysis scales directly with the richness of the business context provided in the Set node; the more specific the company overview, target audience, and known irrelevant topics, the more precise the relevancy scoring
- **Connect to Google Ads for direct upload** — extend the workflow to push approved exclusions back to Google Ads shared negative keyword lists via the Google Ads API

**About the creator**
Built by a digital marketing and automation specialist. For questions, advanced customizations, or custom n8n workflow solutions, feel free to reach out on LinkedIn.
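The metrics threshold customization above maps to a small filter inside the Clean & Format Code node. A minimal sketch, assuming the raw API reports cost in micros (as the Google Ads API does) and that the default keep rule is "at least 5 impressions or at least 1 click":

```javascript
// Metrics threshold filter: drop terms with fewer than 5 impressions
// AND zero clicks; keep everything else.
function passesThreshold(term, minImpressions = 5) {
  return term.impressions >= minImpressions || term.clicks > 0;
}

// Google Ads API reports money as micros (1,000,000 micros = 1 unit);
// format it for the spreadsheet.
function microsToCurrency(micros) {
  return (micros / 1_000_000).toFixed(2);
}
```

Raising `minImpressions` tightens the filter for high-traffic accounts; lowering it keeps more long-tail terms in low-volume accounts.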
by Sridevi Edupuganti
🎙️ Audio-to-Insights Workflow (Form Upload + Google Drive Link)

**Description**
This workflow enables seamless speech-to-text transcription, AI-powered summarization, sentiment analysis, and automated email delivery. It supports two different input modes:
- **Form Upload (Local File)**
- **Form Submission (Google Drive Link)**

**How it Works**
1. **Input.** Form 1: Upload an audio file (e.g., .mp3, .wav, .mp4). Form 2: Submit a Google Drive link.
2. **File Handling.** Local uploads go directly to AssemblyAI. Drive links are parsed → File ID extracted → File fetched → Sent to AssemblyAI.
3. **Transcription.** AssemblyAI generates transcript text with punctuation and highlights. The workflow loops with Wait + If until transcript status = completed.
4. **AI Analysis.** The transcript is passed to OpenAI, which generates a structured output including: executive summary, sentiment label & score, key points, action items, notable quotes, and topics.
5. **Email Delivery.** A formatted email is sent via Gmail with the summary and insights.

**Features**
- Dual input support: Google Drive OR direct upload
- Handles long-running jobs with Wait + If polling
- AI-powered transcript analysis with structured JSON
- Automated sentiment scoring and summary generation
- Professional HTML email reports

**Requirements**
- AssemblyAI API Key – transcription
- Google Drive OAuth2 – file fetch
- OpenAI API Key – summarization & sentiment analysis
- Gmail OAuth2 – email delivery

**How to Use**
1. Import this workflow into your n8n instance.
2. Add and configure the required credentials.
3. Update placeholders for: AssemblyAI API Key, Google Drive Link, Gmail ID.
4. Trigger via either form (local file or Google Drive link).
5. For long recordings, split before uploading (10–20 min per chunk, 2–5 s overlap). Keep audio consistent (e.g., WAV/MP3, 16 kHz mono if possible). Process chunks sequentially and combine summaries/action items at the end.

**Customising this Workflow**
- Adjust the OpenAI prompt to fit your reporting style (executive summary, bullet points, etc.).
- Extend email formatting with logos or branding.
- Add Slack, CRM, or Notion integrations for distribution.

**Use Cases**
- Meeting or lecture transcription with summaries
- Podcast analysis with highlights and quotes
- Business call reviews with action item extraction
- Academic seminar notes emailed automatically
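The Wait + If polling loop around AssemblyAI can be sketched as a small decision helper in a Code node. This minimal sketch uses AssemblyAI's documented transcript statuses (`queued`, `processing`, `completed`, `error`); the helper name is illustrative, not part of the template.

```javascript
// Decide the next step in the Wait + If polling loop based on
// the AssemblyAI transcript's status field.
function nextAction(transcript) {
  switch (transcript.status) {
    case 'completed':
      return { action: 'continue', text: transcript.text };
    case 'error':
      return { action: 'fail', reason: transcript.error };
    default: // 'queued' or 'processing'
      return { action: 'wait' }; // loop back through the Wait node
  }
}
```

The If node branches on `action`: `wait` re-enters the Wait node, `continue` hands the transcript text to the OpenAI analysis step, and `fail` can route to an error email.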
by Yaron Been
Revenue Growth Strategy with CRO-led Multi-Agent Team using O3 & GPT-4.1-mini

🔥 Powered by OpenAI O3 & GPT-4.1-mini Multi-Agent System
#RevOps #n8nWorkflows #AIRevenue #OpenAI #GrowthHacking

**⚡ Section 1: Start & Orchestrator**
- 💬 **Chat Trigger** → Listens for revenue-related requests (e.g., "Optimize our sales funnel").
- 🤖 **CRO Agent (O3)** → Acts as the **Chief Revenue Officer**. Thinks strategically with the Think Node. Decides which specialist agents to call.
- 🧠 **OpenAI O3 Model** → Provides advanced reasoning for CRO decisions.

**Benefit:** Central orchestration ensures every request gets a strategic, executive-level response before delegation.

**🛠️ Section 2: Specialist Agents**
Each specialist agent uses GPT-4.1-mini for fast, cost-effective execution. They receive the CRO's instructions and return insights.

- 📈 **Sales Pipeline Analyst**: Funnel optimization, conversion tracking, bottleneck fixes. Outputs: Pipeline health, drop-off points, recommendations.
- 🎯 **Revenue Attribution Specialist**: Multi-touch attribution, ROI analysis, campaign efficiency. Outputs: Attribution models, marketing ROI.
- 📊 **Revenue Forecasting Analyst**: Predictive modeling, scenario planning, growth projections. Outputs: Forecast reports, "what-if" scenarios.
- ⚙️ **Revenue Operations Manager**: CRM optimization, territory planning, sales automation. Outputs: Process improvements, efficiency boosts.
- 💰 **Pricing & Packaging Strategist**: Competitive pricing analysis, packaging strategy, revenue optimization. Outputs: Price models, package recommendations.
- 🧠 **Revenue Intelligence Analyst**: BI dashboards, performance tracking, KPI insights. Outputs: Reports with actionable intelligence.

**Benefit:** Breaks complex revenue problems into specialized tasks handled by domain experts.

**🔄 Section 3: Feedback & Integration**
- Each agent → sends results back to the CRO Agent.
- CRO Agent → compiles a comprehensive revenue strategy.
- Can integrate with CRM, BI dashboards, or Slack/Email for delivery.
**Benefit:** Clear, actionable insights delivered in one place — like having a virtual RevOps team on demand.

**📊 Workflow Overview**

| Section | Key Nodes | Purpose | Benefit |
| --- | --- | --- | --- |
| ⚡ Start & Orchestration | Chat Trigger, CRO Agent, O3 Model | Capture request & assign to CRO | Centralized leadership |
| 🛠️ Specialists | 6 Agent Nodes + GPT-4.1-mini models | Analyze pipeline, pricing, ops, attribution, etc. | Specialized, cost-efficient insights |
| 🔄 Feedback Loop | CRO Agent aggregation | Compiles strategy from multiple agents | Unified, data-driven revenue plan |

**💡 Use Cases**
- **Pipeline Optimization** → Identify bottlenecks, improve conversions.
- **Attribution Modeling** → Know exactly where revenue comes from.
- **Revenue Forecasting** → Plan growth scenarios and projections.
- **Ops Excellence** → Automate CRM, streamline sales ops.
- **Pricing Strategy** → Compete smarter with optimized pricing models.
- **Revenue Intelligence** → Ongoing tracking and performance monitoring.

**💸 Cost Optimization**
- **O3 only for CRO decisions** → Strategic layer.
- **GPT-4.1-mini for specialists** → Low-cost execution (~90% cheaper).
- **Parallel processing** → All agents can run simultaneously.

✅ **Final Result:** A virtual AI-powered RevOps team that turns any revenue-related question into a comprehensive growth strategy — instantly.
by Daniel
Adaptive LLM Router for Optimized AI Chat Responses

Elevate your AI chatbots with intelligent model selection: automatically route simple queries to cost-effective LLMs and complex ones to powerful ones, balancing performance and expenses seamlessly.

**What It Does**
This workflow listens for chat messages, uses a lightweight Gemini model to classify query complexity, then selects and routes to the optimal LLM (Gemini 2.5 Pro for complex, OpenAI GPT-4.1 Nano for simple) to generate responses—ensuring efficient resource use.

**Key Features**
- **Complexity Classifier** - Quick assessment using Gemini 2.0 Flash
- **Dynamic Model Switching** - Routes to premium or budget models based on needs
- **Chat Trigger** - Webhook-based for real-time conversations
- **Current Date Awareness** - Injects $now into the system prompt
- **Modular Design** - Easy to add more models or adjust rules
- **Cost Optimization** - Reserves heavy models for demanding tasks only

**Perfect For**
- **Chatbot Developers**: Build responsive, cost-aware AI assistants
- **Customer Support**: Handle routine vs. technical queries efficiently
- **Educational Tools**: Simple facts vs. in-depth explanations
- **Content Creators**: Quick ideas vs. detailed writing assistance
- **Researchers**: Basic lookups vs. complex analysis
- **Business Apps**: Optimize API costs in production environments

**Technical Highlights**
Harnessing n8n's LangChain nodes, this workflow demonstrates:
- Webhook triggers for instant chat handling
- Agent-based classification with strict output rules
- Conditional model selection for AI chains
- Integration of multiple LLM providers (Google Gemini, OpenAI)
- Scalable architecture for expanding model options

Ideal for minimizing AI costs while maximizing response quality. No coding required—import, configure credentials, and deploy!
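The conditional model selection can be sketched as a small routing helper. The classifier label names (`simple` / `complex`) are illustrative assumptions; the model choices mirror those named in the description.

```javascript
// Route a classified query to the right model.
// The lightweight Gemini pass returns a complexity label; anything
// not explicitly 'complex' falls back to the budget model.
function selectModel(complexity) {
  return complexity === 'complex'
    ? { provider: 'google', model: 'gemini-2.5-pro' }   // premium path
    : { provider: 'openai', model: 'gpt-4.1-nano' };    // budget path
}
```

Defaulting the fallback to the cheap model is a deliberate design choice: a misclassification costs a weaker answer, never an unnecessarily expensive call.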
by Țugui Dragoș
This workflow automatically scores and categorizes new GoHighLevel contacts using AI (GPT-4), then tags and assigns them to the appropriate team member based on their score. Hot leads also trigger a Slack notification for immediate follow-up.

**What does it do?**
- Triggers when a new contact is added in GoHighLevel.
- Fetches full contact details and recent engagement data.
- Uses AI (GPT-4) to analyze and score the lead (1-100), categorize it (Hot, Warm, Cold), and provide an explanation.
- Tags the contact in GoHighLevel based on the score.
- Assigns the lead to the correct sales or nurturing team member.
- Sends a Slack alert for Hot leads to ensure fast response.

**Use case**
Use this workflow to automate lead qualification and assignment in sales teams using GoHighLevel. It helps prioritize high-quality leads, ensures fast follow-up, and reduces manual work.

**How to configure**
- **GoHighLevel API:** Set your GoHighLevel API URL and API key in the Workflow Configuration node. Update user IDs for assignment as needed.
- **Slack Integration:** Add your Slack webhook URL or credentials in the Slack Notify Hot Lead node.
- **AI Provider:** Configure your OpenAI (or compatible) credentials in the AI Lead Scoring (GPT-4) node.
- **Adjust thresholds:** If needed, change the score thresholds in the IF nodes to match your business logic.
- **Activate the workflow:** Once configured, activate the workflow to start processing new leads automatically.

**Tip:** You can further customize the workflow to fit your sales process, add more notifications, or integrate with other tools as needed.
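The threshold logic in the IF nodes can be sketched as a single helper. The 80/50 cut-offs are illustrative assumptions, not the template's fixed values; adjust them to match your business logic.

```javascript
// Map an AI lead score (1-100) to a category, a GoHighLevel tag,
// and a flag for the Slack Notify Hot Lead node.
// Assumption: the 80 and 50 cut-offs are illustrative defaults.
function categorizeLead(score) {
  if (score >= 80) return { category: 'Hot', tag: 'hot-lead', notifySlack: true };
  if (score >= 50) return { category: 'Warm', tag: 'warm-lead', notifySlack: false };
  return { category: 'Cold', tag: 'cold-lead', notifySlack: false };
}
```

The `notifySlack` flag corresponds to the Hot-lead branch; only that branch routes through the Slack alert before assignment.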
by Guillaume Duvernay
Build a powerful AI chatbot that provides precise answers from your own company's knowledge base. This template provides a smart AI agent that connects to Lookio, a platform where you can easily upload your documents (from Notion, Jira, Slack, etc.) to create a dedicated knowledge source.

What makes this agent "smart" is its efficiency. It's configured to handle simple greetings and small talk on its own, only using its powerful (and paid) knowledge retrieval tool when a user asks a genuine question. This cost-saving logic makes it perfect for building production-ready internal helpdesks, customer support bots, or any application where you need accurate, source-based answers.

**Who is this for?**
- **Customer support teams:** Build internal bots that help agents find answers instantly from your support documentation and knowledge bases.
- **Product & engineering teams:** Create a chatbot that can answer technical questions based on your product documentation or internal wikis.
- **HR departments:** Deploy an internal assistant that can answer employee questions based on company handbooks, policies, and procedures.
- **Any business with a knowledge base:** Provide an interactive, conversational way for employees or customers to access information locked away in your documents.

**What problem does this solve?**
- **Provides accurate, grounded answers:** Ensures the AI agent's responses are based on your trusted, private documents, not the open internet, which prevents factual errors and "hallucinations."
- **Makes your knowledge accessible:** Transforms your static documents and knowledge bases into an interactive, 24/7 conversational resource.
- **Optimizes for cost and efficiency:** The agent is intelligent enough to handle simple small talk without making unnecessary API calls to your knowledge base, saving you credits and money.
- **Simplifies RAG setup:** Provides a ready-to-use template for a common RAG (Retrieval-Augmented Generation) pattern, with the complexities of document management and retrieval handled by the Lookio platform.

**How it works**
1. **First, build your knowledge base in Lookio:** The process starts on the Lookio platform. You upload your documents (from Notion, Jira, PDFs, etc.) and create an "assistant" which becomes your secure, queryable knowledge base.
2. **A user asks a question:** The n8n workflow begins when a user sends a message via the Chat Trigger.
3. **The agent makes a decision:** The AI Knowledge Agent, guided by its system prompt, analyzes the user's message. If it's a simple greeting like "hi," it will respond directly. If it's a substantive question that requires specific knowledge, it decides to use its "Query knowledge base" tool.
4. **Query the Lookio knowledge base:** The agent passes the user's question to the HTTP Request Tool. This tool securely calls the Lookio API with your specific Assistant ID and API key.
5. **Deliver the fact-based answer:** Lookio searches your documents, synthesizes a precise answer, and sends it back to the workflow. The n8n agent then presents this answer to the user in the chat interface.

**Architectural Approaches to RAG in n8n with Lookio**
From a workflow perspective, integrating RAG natively in n8n involves orchestrating multiple nodes for data handling, embedding, and vector searches. This method provides high visibility and control over each step. An alternative architectural pattern is to use an external RAG service like Lookio, which consolidates these steps into a single HTTP Request node. This simplifies the workflow's structure by abstracting the multi-stage RAG process into one API endpoint.

**Setup**
- **Set up your Lookio assistant (prerequisite):** First, go to Lookio, sign up (you get 50 free credits), create an assistant with your documents, and from your settings, copy your API Key and Assistant ID.
- **Configure the Lookio tool:** In the Query knowledge base (HTTP Request Tool) node, replace the <your-assistant-id> placeholder with your actual Assistant ID, and replace the <your-lookio-api-key> placeholder with your actual API Key.
- **Connect your AI model:** In the OpenAI Chat Model node, connect your AI provider credentials.
- **Activate the workflow.** Your smart knowledge base agent is now live and ready to chat!

**Taking it further**
- **Adjust retrieval quality:** In the **Query knowledge base** node, you can change the query_mode from flash (fastest) to deep for higher quality but slightly slower answers, depending on your needs.
- **Add more tools:** Enhance your agent by giving it other tools, like a web search for when the internal knowledge base doesn't have an answer, or a calculator for performing computations.
- **Deploy it anywhere:** Swap the **Chat Trigger** for a **Slack** or **Discord** trigger to deploy your agent right where your team works.
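The body the HTTP Request Tool sends to Lookio can be sketched as a small builder. The field names below are illustrative placeholders, not Lookio's confirmed schema (check the Lookio API docs for the exact format); only the `flash` / `deep` query_mode values come from the template notes above.

```javascript
// Build an illustrative request body for the "Query knowledge base" tool.
// Field names (assistant_id, query, query_mode) are hypothetical placeholders;
// verify them against the Lookio API documentation.
function buildLookioQuery(assistantId, question, queryMode = 'flash') {
  if (!assistantId || assistantId === '<your-assistant-id>') {
    throw new Error('Replace the Assistant ID placeholder first');
  }
  return {
    assistant_id: assistantId,
    query: question,
    query_mode: queryMode, // 'flash' (fastest) or 'deep' (higher quality)
  };
}
```

The placeholder guard mirrors a common failure mode: activating the workflow before swapping in real credentials.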
by Max
**About**
This flow is ideal for online schools that use Zoom to teach classes and Google Classroom for storing materials and homework.

It listens for Zoom webhooks that fire after each recorded call is uploaded to Zoom Cloud (you'll need a Zoom paid plan). When a new meeting arrives, it filters out calls that last less than 30 minutes. After the duration check, it checks whether there is a Google Classroom class that matches the call name. Your call must be named exactly the same as the Google Classroom class you want the recording uploaded to. If the class is found, the workflow extracts the Class ID. This flow assumes you have a specific topic used for storing class recordings and materials, so it looks for this topic and uploads the material there. If the topic is not found, you'll get an email.

**Requirements**
You'll need:
- A Zoom paid plan that supports Zoom Cloud
- Google Cloud Console access to set up the Classroom API and Gmail API
- An OpenAI API key or any other provider
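The duration filter and class-name match can be sketched as an n8n Code node. The helper name and the input shapes (a Zoom payload with `topic` and `duration`, Classroom courses with `id` and `name`) are simplified assumptions about the real webhook and API payloads.

```javascript
// Skip recordings shorter than 30 minutes, then find the Google
// Classroom course whose name exactly matches the Zoom meeting topic.
function findTargetCourse(meeting, courses, minMinutes = 30) {
  if (meeting.duration < minMinutes) return null; // too short, ignore
  const course = courses.find((c) => c.name === meeting.topic);
  return course ? course.id : null; // null => no matching class found
}
```

A `null` result would branch to the notification email instead of the upload step.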
by Abdul Mir
**Overview**
Create hyper-personalized cold outreach messages at scale by combining Google Sheets, web scraping, and AI. This workflow is perfect for sales teams, SDRs, and agency owners looking to boost reply rates with icebreakers that actually feel personal.

It takes lead info from a Google Sheet—including name, email, company, and website—then visits each site, pulls meaningful text, and crafts a tailored message using AI. The personalized message is then written back into your lead sheet, ready for use in cold email, LinkedIn DMs, or CRM enrichment.

**Who's it for**
- Cold email outreach specialists
- B2B sales and SDR teams
- Lead generation agencies
- Founders doing outbound manually

**How it works**
1. Pull lead data from Google Sheets
2. Loop through each lead and scrape their website using an HTTP node
3. Clean and format the website content
4. Use OpenAI to generate a custom-written icebreaker for each lead
5. Write the final icebreaker back into the spreadsheet

**How to set up**
1. Connect your Google Sheets account
2. Replace the spreadsheet ID and column names with your own
3. Set up your OpenAI credentials (or whichever LLM you prefer)
4. Tweak the prompt for tone or style
5. Hit "Execute Workflow" and watch the sheet populate

**Requirements**
- Google Sheets credentials
- OpenAI (or any compatible LLM node)
- The websites listed must be publicly accessible and static

**How to customize**
- Modify the scraping logic to focus on specific sections (e.g. About page, Case Studies)
- Adjust the AI prompt to match your brand's tone
- Add filtering logic to skip low-value leads
- Integrate with your CRM to send the data downstream
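The "clean and format the website content" step can be sketched as a Code node that strips scraped HTML down to prompt-ready text. The 4,000-character cap is an illustrative assumption to keep token usage predictable, not a value from the template.

```javascript
// Reduce scraped HTML to plain text for the AI prompt:
// drop scripts and styles, strip remaining tags, collapse
// whitespace, and cap the length to control token cost.
function htmlToPromptText(html, maxChars = 4000) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim()
    .slice(0, maxChars);
}
```

Regex stripping is a pragmatic choice for mostly-static sites; for messier markup, swapping in an HTML parser would be more robust.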
by Ruth Olatunji
This n8n workflow is a daily analytics automation that calculates which lead sources generate actual revenue, not just leads. It provides ROI data, conversion rates, and budget allocation recommendations.

**Use Case:** Automates marketing ROI tracking by linking closed deals to their lead sources in Airtable, calculating revenue and ROI per channel, and sending daily insights to Slack.

**What It Does**
- Runs nightly to analyze closed deals from the last 30 days
- Matches deals to their original lead sources
- Calculates total revenue per source
- Computes ROI (revenue vs. cost per lead)
- Determines conversion rates by source
- Updates the Lead Sources table with metrics
- Sends daily reports to the team

**How It Works**
- **Step 1: Schedule Trigger.** Runs daily at midnight.
- **Step 2: Fetch Closed Won Deals.** Gets all deals where Stage = "Closed Won" and Actual Close Date is in the last 30 days.
- **Step 3: Fetch Lead Sources.** Gets cost and lead count data from the Lead Sources table.
- **Step 4: Calculate ROI (JavaScript).** For each source: Total revenue = sum of all deals from that source. Total cost = cost per lead × total leads. ROI = ((Revenue − Cost) / Cost) × 100. Conversion rate = deals closed / total leads × 100. Average deal size = revenue / deal count.
- **Step 5: Update Lead Sources.** Writes calculated metrics back to Airtable.
- **Step 6: Send Report.** Slack message with the top 3 performing sources.

**Business Impact**
- **Marketing ROI:** Know exactly which channels generate revenue
- **Budget optimization:** Allocate spend to highest-ROI sources
- **Data-driven decisions:** Stop guessing, start knowing
- **Cost reduction:** Cut low-performing channels
- **Revenue growth:** Double down on what works

**Technical Requirements**
- n8n (self-hosted or cloud)
- Airtable (uses existing tables)
- Slack (for reports)
- Gmail, for a reminder in case the CEO missed the report in the Slack channel (optional)
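The Step 4 formulas translate directly into a Code-node helper. Field names like `costPerLead` and `totalLeads` are illustrative; map them to your actual Airtable column names.

```javascript
// Per-source metrics, exactly as described in Step 4:
//   total cost      = cost per lead × total leads
//   ROI             = ((revenue - cost) / cost) × 100
//   conversion rate = deals closed / total leads × 100
//   avg deal size   = revenue / deal count
function sourceMetrics(deals, source) {
  const revenue = deals.reduce((sum, d) => sum + d.amount, 0);
  const cost = source.costPerLead * source.totalLeads;
  return {
    revenue,
    roi: ((revenue - cost) / cost) * 100,
    conversionRate: (deals.length / source.totalLeads) * 100,
    avgDealSize: deals.length ? revenue / deals.length : 0,
  };
}
```

Running this per lead source yields the rows written back to the Lead Sources table, and sorting by `roi` gives the top-3 list for the Slack report.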
by Kevin Meneses
**What this workflow does**
This workflow automates end-to-end stock analysis using real market data and AI:
- Reads a list of stock tickers from Google Sheets
- Fetches fundamental data (valuation, growth, profitability) and OHLCV price data from EODHD APIs
- Computes key technical indicators (RSI, SMA 20/50/200, volatility, support & resistance)
- Uses an AI model to generate: a Buy / Watch / Sell recommendation; entry price, stop-loss, and take-profit levels; an investment thesis with pros & cons; and a fundamental quality score (1–10)
- Stores the final structured analysis back into Google Sheets

This creates a repeatable, no-code stock analysis pipeline ready for decision-making or dashboards.

**Data source**
Market data is powered by EODHD APIs.

**How to configure this workflow**

**1. Google Sheets (Input)**
Create a sheet with a column called `ticker` (e.g. MSFT, AAPL, AMZN). Each row represents one stock to analyze.

**2. EODHD APIs**
Create an EODHD account, get your API token, and add it to the HTTP Request nodes as: api_token=YOUR_API_KEY

**3. AI Model**
Configure your AI provider (OpenAI / compatible model). The AI receives the fundamentals, technical indicators, and growth potential score, and returns structured JSON with recommendations and trade levels.

**4. Google Sheets (Output)**
Results are appended to a Signals tab with:
- Signal (BUY / WATCH / SELL)
- Entry, Stop Loss, Take Profit
- Fundamental score (1–10)
- Investment thesis and risk notes
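Two of the technical indicators can be sketched in a Code node over the OHLCV closes. This is a simplified, illustrative version (the RSI uses simple averages rather than Wilder smoothing) and is not necessarily the workflow's exact implementation.

```javascript
// Simple moving average over the last `period` closing prices.
function sma(closes, period) {
  const window = closes.slice(-period);
  return window.reduce((a, b) => a + b, 0) / window.length;
}

// Basic RSI over the last `period` price changes
// (simple-average variant, not Wilder's smoothed RSI).
function rsi(closes, period = 14) {
  let gains = 0;
  let losses = 0;
  for (let i = closes.length - period; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    if (change > 0) gains += change;
    else losses -= change;
  }
  if (losses === 0) return 100; // all gains => maximum RSI
  const rs = gains / period / (losses / period);
  return 100 - 100 / (1 + rs);
}
```

Calling `sma(closes, 20)`, `sma(closes, 50)`, and `sma(closes, 200)` yields the SMA 20/50/200 values the AI prompt consumes alongside the RSI.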