by Anan
⚡ Auto Rename n8n Workflow Nodes with AI ✨

This workflow uses AI to automatically generate clear and descriptive names for every node in your n8n workflows. It analyzes each node's type, parameters, and connections to create meaningful names, making your workflows instantly readable.

**Who is it for?**
This workflow is for n8n users who manage complex workflows with dozens of nodes. If you've ever:
- Built workflows full of generic names like HTTP Request 2 or Edit Fields 1
- Struggled to understand your own work after a few weeks
- Copied workflows from others with unclear node names
- Spent hours manually renaming nodes one by one

...then this workflow will save you significant time and effort.

**Requirements**
- **n8n API Credentials**: Must be configured to allow listing and updating workflows
- **AI Provider Credentials**: An API key for your preferred AI provider (OpenRouter is used currently)

**How it works**
1. Trigger: Launch via form (select from dropdown) or manual trigger (quick testing with a pre-selected workflow)
2. Fetch: Retrieve the target workflow's JSON and extract nodes and connections
3. Generate: Send the workflow JSON to the AI, which creates a unique, descriptive name for every node
4. Validate: Verify the AI mapping covers all original node names (see the sketch below)
5. Apply: If valid, update all node names, parameter references, and connections throughout the workflow
6. Save: Save/update the workflow with the renamed nodes and provide links to both the new and previous versions

If validation fails (e.g., the AI missed nodes), the workflow stops with an error. You can modify the error handling to retry or loop back to the AI node.

**Setup**
1. Connect n8n API credentials
   - Open any n8n node in the workflow and make sure your n8n API credentials are connected
2. Configure AI provider credentials
   - Open the "OpenRouter" node (or replace it with your preferred AI)
   - Add your API credentials
   - Adjust the model if needed (current: openai/gpt-5.1-codex-mini)
3. Test the workflow
   - Use the Manual Trigger for quick testing with a pre-selected workflow
   - Use the Form Trigger for a user-friendly interface with workflow selection

**Important notice**
If you're renaming the currently opened workflow, you must reload the page after execution to see the latest version; n8n doesn't automatically refresh the canvas when workflow versions are updated via the API.

**Need help?**
If you're facing any issues using this workflow, join the community discussion on the n8n forum.
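A minimal sketch of the validation step, assuming the AI returns its rename mapping as a JSON object (old name → new name) and the fetched workflow JSON is available on the incoming item; the property paths and error wording are illustrative, not the template's actual node configuration.

```javascript
// n8n Code node: verify the AI's rename mapping covers every original node.
// Assumes `workflow` (the fetched workflow JSON) and `mapping` (AI output)
// arrive on the incoming item; adjust the property paths to your workflow.
const workflow = $json.workflow;   // original workflow JSON
const mapping = $json.mapping;     // e.g. { "HTTP Request 2": "Fetch Invoice PDF", ... }

const originalNames = workflow.nodes.map(n => n.name);
const missing = originalNames.filter(name => !(name in mapping));
const newNames = Object.values(mapping);
const duplicates = newNames.filter((n, i) => newNames.indexOf(n) !== i);

if (missing.length > 0 || duplicates.length > 0) {
  throw new Error(
    `Rename mapping invalid. Missing: ${missing.join(', ') || 'none'}; ` +
    `duplicate new names: ${duplicates.join(', ') || 'none'}`
  );
}

return [{ json: { workflow, mapping, valid: true } }];
```

Throwing an error here is what lets the stop-on-failure (or retry) behavior described above take over.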
by Yves Junqueira
**Who's it for**
Digital marketing agencies and Meta Ads managers who need to generate comprehensive performance reports across multiple client accounts automatically. Perfect for agencies handling 5+ Meta Ads accounts who want to save hours on manual reporting while delivering AI-powered insights to their teams.

**What it does**
- Pulls performance data from multiple Meta Ads accounts for a specified time period (last 7, 14, or 30 days)
- Uses Claude AI with Pipeboard's Meta Ads MCP to analyze campaign performance, identify trends, and generate actionable insights
- Generates professional reports with AI-driven recommendations for optimization
- Automatically delivers formatted reports to your Slack channels
- Runs on a schedule (weekly/daily) or triggered manually

**How to set up**
1. Set up Claude AI integration (requires Anthropic API key)
2. Configure Pipeboard's Meta Ads MCP connection
3. Connect Slack to n8n via OAuth2
4. Create a list of client account IDs in the workflow configuration (see the sketch below)
5. Customize your reporting template and Slack delivery settings

**Requirements**
- n8n version 1.109.2 or newer
- Claude AI API access (Anthropic)
- Pipeboard account
- Slack workspace access

**How to customize the workflow**
- Adjust the date range and metrics to track
- Modify the AI prompts for different types of insights
- Configure multiple Slack channels for different clients
- Set up custom scheduling intervals
- Add email delivery as an additional output channel
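A minimal sketch of the client account list, assuming it lives in an n8n Code (or Set) node that fans out one item per client account; the field names and example IDs are placeholders, not the template's actual configuration.

```javascript
// n8n Code node: define the Meta Ads accounts to report on.
// Replace the placeholder IDs, names and channels with your real clients.
const accounts = [
  { accountId: 'act_1234567890', clientName: 'Client A', slackChannel: '#client-a-reports' },
  { accountId: 'act_0987654321', clientName: 'Client B', slackChannel: '#client-b-reports' },
];

// Emit one item per account so downstream nodes run once per client.
return accounts.map(a => ({ json: a }));
```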
by Sergey Filippov
**Who's it for**
Developers building AI-powered workflows who want to ensure their agents work reliably. If you need to validate AI outputs, test agent behavior systematically, or build maintainable automation, this template shows you how.

**What it does**
This subworkflow extracts structured meeting details (title, date, time, location, links, attendees) from natural language messages using an AI agent. It demonstrates production-ready patterns:
- **Structured output validation**: JSON schema enforcement prevents malformed responses (see the sketch below)
- **Error handling**: Graceful failures with full execution traceability
- **Automated evaluation**: Test agent accuracy against expected outputs using Google Sheets
- **Dual execution modes**: Normal extraction + evaluation/testing mode

The AI resolves relative time ("tomorrow", "next Friday") using timezone context and handles incomplete data gracefully.

**How to set it up**
1. Connect OpenAI API credential to the AI agent node
2. Copy the test data sheet: https://docs.google.com/spreadsheets/d/1U89nPsasM2WNv1D7gEYINhDwylyxYw7BOd_i8ipFC0M/edit?usp=sharing
3. Update Google Sheet IDs in load_eval_data and record_eval_output nodes
4. Test normal mode: Execute workflow "from trigger"
5. Test evaluation mode: Execute workflow "from load_eval_data"

**Requirements**
- OpenAI API key
- Google Sheets OAuth credential

**Why subworkflow architecture?**
- Reusability: Wrap AI agents in subworkflows to call them from multiple parent workflows. Extract meetings from Slack, email, or webhooks—same agent, consistent results.
- Testability: This pattern enables isolated testing for each AI component. Set up evaluation datasets, run automated tests, and validate accuracy before deploying to production. You can't do this easily with inline agents.
- Maintainability: Update the agent logic once, and all parent workflows benefit. Error handling and validation are built-in, so failures are traceable with execution IDs.

This framework includes:
- Dual-trigger pattern (normal + evaluation modes)
- Output validation that catches silent AI failures
- Error bubbling with execution metadata for debugging
- Evaluation framework with semantic/exact matching
- Proper routing that returns output to parent workflows

**Following this pattern for other agents**
To adapt this for any AI task (contact extraction, invoice processing, sentiment analysis, etc.):
1. Replace extract_meeting_details with your AI agent (add tools, memory, etc. as needed)
2. Update the Structured Output Parser schema to match your data structure
3. Modify the evaluate_match prompt for your validation criteria
4. Create test cases in Google Sheets with your inputs/expected outputs
5. Adjust normalize_eval_data timezone/reference time if needed

The validation, error handling, and evaluation infrastructure stays the same regardless of what your agent does.
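A minimal sketch of what a Structured Output Parser schema for the meeting fields listed above could look like; the exact property names, formats, and required/optional split in the actual template may differ.

```json
{
  "type": "object",
  "properties": {
    "title":     { "type": "string" },
    "date":      { "type": "string", "description": "ISO date, e.g. 2024-05-17" },
    "time":      { "type": "string", "description": "24h start time, e.g. 14:30" },
    "location":  { "type": ["string", "null"] },
    "links":     { "type": "array", "items": { "type": "string" } },
    "attendees": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["title", "date"]
}
```

Enforcing a schema like this is what lets the validation step catch malformed or partial agent responses instead of passing them silently downstream.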
by Muhammad Ahmad
📝 Automation: Instantly Onboard New Clients from Tally Form to Notion, Google Drive & Slack

This automation streamlines the client onboarding process by integrating Tally, Notion, Google Drive, and Slack. When a potential client submits a Tally form, the automation is triggered via a webhook, automatically handling all onboarding steps without manual intervention.

⚙️ How It Works – Step-by-Step
1. Form Submission Triggered: A new Tally form submission is received via a webhook.
2. Client Data Extraction: The automation extracts essential client details from the form, including:
   - Name
   - Email
   - Project Type
   - Budget
3. Google Drive Folder Creation: A dedicated Google Drive folder is generated using the client's name and project type for storing onboarding assets.
4. Notion Database Entry Creation: A new item is added to a specified Notion database, storing:
   - Client information
   - Project scope
   - Folder link
5. Slack Team Notification: A Slack message is sent to your designated team channel containing all onboarding details, ensuring the team is informed instantly.

✅ Pre-Conditions / Requirements
- A published Tally form collecting client data.
- A connected Google Drive account with folder creation permissions.
- An existing Notion database with columns for name, email, budget, etc.
- A Slack workspace with an active bot/token integrated with the automation tool.

🛠️ Notion Database Structure
Your Notion database should include at least the following fields:
- Name (Text)
- Email (Email)
- Project Type (Select)
- Budget (Select)
- Onboarding Folder Link (URL)

🧩 Customization Guidance
- You can modify the Google Drive folder naming convention to include a timestamp or custom ID (see the sketch below).
- Adjust Slack message formatting to include project-specific tags or mention specific team members.
- Extend the Notion entry to include more fields like project deadline or contact notes.
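A minimal sketch of a custom folder-naming step, assuming the client fields extracted from the Tally webhook are available on the incoming item; the property paths are placeholders and need to match your actual webhook payload.

```javascript
// n8n Code node: build a Google Drive folder name with a date stamp suffix.
// Adjust the property paths to match your Tally webhook payload.
const name = $json.name;                // e.g. "Acme Corp"
const projectType = $json.projectType;  // e.g. "Website Redesign"
const stamp = new Date().toISOString().slice(0, 10); // YYYY-MM-DD

return [{
  json: {
    ...$json,
    folderName: `${name} - ${projectType} - ${stamp}`,
  },
}];
```

The downstream Google Drive node can then reference `{{ $json.folderName }}` instead of concatenating fields inline.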
by AppUnits AI
Generate Invoices and Send Reminders for Customers with Jotform, QuickBooks and Outlook

This workflow automates the entire process of receiving a product/service order, checking or creating a customer in QuickBooks Online (QBO), generating an invoice, and emailing it — all triggered by a form submission (via Jotform) — and then sending invoice reminders.

**How It Works**
1. Receive Submission: Triggered when a user submits a form. Collects data like customer details, the selected product/service, etc.
2. Check If Customer Exists: Searches QBO to determine if the customer already exists.
   - If the customer exists: **update** customer details (e.g., billing address).
   - If the customer doesn't exist: **create** a new customer in QBO.
3. Get The Item: Retrieves the selected product or service from QBO.
4. Create The Invoice: Generates a new invoice for the customer using the selected item.
5. Send The Invoice: Automatically sends the invoice via email to the customer.
6. Store The Invoice In DB: Stores the needed invoice details in the data table.
7. Send Reminders: Every day at 8 AM, the automation checks each invoice to decide whether to send a reminder email, skip and send it later, or delete the invoice from the DB (if it's paid or all reminders have been sent). A sketch of this decision logic is shown below.

**Who Can Benefit from This Workflow?**
- Freelancers
- Service Providers
- Consultants & Coaches
- Small Businesses
- E-commerce or Custom Product Sellers

**Requirements**
- Jotform webhook setup, more info here
- QuickBooks Online credentials, more info here
- Email setup: update the email nodes (Send reminder email & Send reminders sent summary), more info about Outlook setup here
- Create a data table with the following columns:
  - invoiceId (string)
  - remainingAmount (number)
  - currency (string)
  - remindersSent (number)
  - lastSentAt (date time)
- Update the Add reminders config node to set the data table ID and the reminder intervals in days (the default is after 2 days, then after 3 days, and finally after 5 days)
- LLM model credentials
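A minimal sketch of the daily reminder decision, assuming one item per invoice carrying the data-table columns listed above plus a paid flag supplied upstream, and the default 2/3/5-day intervals; the exact node names and paid-status check in the template may differ.

```javascript
// n8n Code node: decide what to do with a stored invoice at the daily 8 AM run.
// Assumes the incoming item carries remindersSent, lastSentAt and isPaid;
// `intervals` mirrors the "after 2, 3, 5 days" defaults from the config node.
const intervals = [2, 3, 5];                 // days to wait before reminders 1, 2 and 3
const { remindersSent, lastSentAt, isPaid } = $json;

let action;
if (isPaid || remindersSent >= intervals.length) {
  action = 'delete';                         // paid, or all reminders already sent
} else {
  const daysSinceLast = (Date.now() - new Date(lastSentAt).getTime()) / 86_400_000;
  action = daysSinceLast >= intervals[remindersSent] ? 'send' : 'skip';
}

return [{ json: { ...$json, action } }];
```

A Switch node on `action` can then route each invoice to the reminder email, a no-op branch, or the data-table delete.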
by Sridevi Edupuganti
**Description**
Ask any question and get five different answers instantly. Each answer is written for a different audience—from kids to business executives. Your Telegram bot delivers all five explanations in under 10 seconds and saves them to Google Docs automatically. Perfect for teachers, writers, and anyone who needs to explain things to different people.

**Who's It For**
• Educators creating multi-grade curriculum content
• Content creators generating material for diverse audiences
• Technical writers producing documentation at different expertise levels
• Parents explaining complex topics to children
• Anyone who needs to explain things to different people

**How It Works**
• Transforms any question into five distinct explanations: kid-friendly stories (5-year-olds), relatable content (teenagers), professional explanations (graduates), academic analysis (PhD researchers), and strategic insights (business executives)
• Five AI agents process simultaneously for 3-8 second response times
• Delivers six formatted Telegram messages (header + five explanations)
• Automatically archives complete conversations to Google Docs
• Uses a binary tree merge architecture for reliable data handling

**How to Set Up**
• Create a Telegram bot via @BotFather and add the token to n8n credentials
• Obtain an OpenAI API key and add it to n8n credentials
• Connect your Google account to grant Docs access
• Create a blank Google Doc and paste its URL in the workflow's Google Docs node
• Activate the workflow and test with any question

**Requirements**
• Telegram Bot API token (free)
• OpenAI API key (pay-per-use)
• Google account with Docs access (free)
• n8n instance (cloud or self-hosted)

**How to Customize**
• Modify the AI prompts in the 'Create 5 Items' node for different tones and styles
• Adjust character limits in the formatting nodes to control message length
• Change output destinations from Telegram to Slack, email, or other platforms
• Switch AI providers from OpenAI to alternatives
• Add additional comprehension levels by duplicating AI agent nodes

**Need Help?**
For detailed notes and implementation, please see the README document at: https://drive.google.com/file/d/19Fx-FoihL70qpOi4CnEwQ6Sud2dbUnE_/view?usp=sharing
Join the Discord (https://discord.com/invite/XPKeKXeB7d) or the n8n community forum (https://community.n8n.io/) for support.
by Nishant
**Overview**
Confused which credit card to actually get or swipe? With 100+ cards in the market, hidden caps, and milestone rules, most people end up leaving rewards, perks, and cashback on the table. This workflow uses n8n + GPT + Google Sheets + Telegram to recommend the best credit card for each user's lifestyle in under 3 seconds, while keeping the logic transparent with a ₹-value breakdown.

**What does this workflow do?**
This workflow:
1. Captures User Inputs – Users answer a 7-question lifestyle quiz via Telegram.
2. Stores Responses – Google Sheets logs all answers for resumption & deduplication.
3. Scores Answers – n8n Function nodes map single & multi-select inputs into scores (see the sketch below).
4. Generates Recommendations – GPT analyses profile vs. 30+ card dataset.
5. Breaks Down Value – Outputs a transparent table of rewards, milestones, lounge value.
6. Delivers Results – Top 3 card picks returned instantly on Telegram.

**Why is this useful?**
Most card comparison tools only list features — they don't personalise or calculate actual value. This workflow builds a decision engine:
🔍 Personalised → matches lifestyle to best-fit cards
💸 Transparent → shows value in real currency (rewards, milestones, lounges)
⏱ Fast → answers in under 3 seconds
🗂 Organised → Google Sheets keeps audit trail of every user + dedupe

**Tools used**
- n8n (Orchestrator): Orchestration + logic branching
- Telegram: User-facing quiz bot
- Google Sheets: Database of credit cards + logs of user answers
- OpenAI (GPT): Analyses user profile & generates recommendations

**Who is this for?**
🧑💻 Fintech product builders → see how AI can power recommendation engines
💳 Cardholders → understand which card fits their lifestyle best
⚙️ n8n makers → learn how to combine Sheets + GPT + chat interface into one workflow

**🌍 How to adapt it for your country/location**
This workflow uses a credit card dataset stored in Google Sheets. To make it work for your country:
1. Build your dataset → scrape or collect card details from banks, comparison sites, or official portals.
   - Fields to include: Fees, Reward rate, Lounge access, Forex markup, Reward caps, Milestones, Eligibility.
   - You can use web crawlers (e.g., Apify, PhantomBuster) to automate data collection.
2. Update the Google Sheet → replace the India dataset with your country's cards.
3. Adjust scoring logic → modify Function nodes if your cards use different reward structures (e.g., cashback %, miles, points value).
4. Run the workflow → GPT will analyse against the new dataset and generate recommendations specific to your country.

This makes the workflow flexible for any geography.

**Workflow Highlights**
✅ End-to-end credit card recommendation pipeline (quiz → scoring → GPT → result)
✅ Handles single + multi-select inputs fairly with % match scoring
✅ Transparent value breakdown in local currency (rewards, milestones, lounge access)
✅ Google Sheets for persistence, dedupe & audit trail
✅ Delivers top 3 cards in <3 seconds on Telegram
✅ Fully customisable for any country by swapping the dataset
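A minimal sketch of how the Function-node scoring could map quiz answers to per-card % match scores; the category names, weights, and card fields below are placeholders, not the template's actual scoring model.

```javascript
// n8n Code/Function node: score each card against the user's quiz answers.
// `answers` holds the quiz responses; `cards` is the dataset loaded from Google Sheets.
// All field names and weights below are illustrative.
const answers = $json.answers;   // e.g. { spendsOn: ['travel', 'dining'], loungeAccess: true }
const cards = $json.cards;       // e.g. [{ name: 'Card X', categories: ['travel'], lounge: true, rewardRate: 3 }, ...]

const scored = cards.map(card => {
  let score = 0;
  // Multi-select: % of the user's spend categories the card actually rewards.
  const matches = answers.spendsOn.filter(c => card.categories.includes(c)).length;
  score += (matches / answers.spendsOn.length) * 50;
  // Single-select: lounge access preference.
  if (answers.loungeAccess && card.lounge) score += 20;
  // Reward rate contributes the remainder, capped at 30 points.
  score += Math.min(card.rewardRate * 10, 30);
  return { ...card, score: Math.round(score) };
});

scored.sort((a, b) => b.score - a.score);
return [{ json: { top3: scored.slice(0, 3) } }];
```

The top-scoring cards can then be passed to GPT along with the user profile for the ₹-value breakdown and final recommendation text.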
by Nishant
**Overview**
Tired of cookie-cutter "AI LinkedIn post generators"? This workflow goes beyond just text generation — it orchestrates the entire lifecycle of a LinkedIn post. From idea capture to deduplication, from GPT-powered drafting to automatic image generation and link storage, it creates ready-to-publish posts while keeping your content unique and audit-friendly.

**What does this workflow do?**
This workflow:
1. Captures Ideas & Briefs – Inputs are logged in Google Sheets with audience, goals, and angles.
2. Deduplicates Smartly – Avoids repeating hooks or ideas with fuzzy GPT-based dedupe + GSheet logs (see the sketch below).
3. Generates Posts – GPT (OpenAI) drafts sharp LinkedIn-ready posts based on your brief.
4. Creates Images – Post hook + body is sent to an Image Gen model (DALL·E / SDXL) → PNG asset.
5. Stores & Links – Final text + image uploaded to Google Drive with shareable links.
6. Audit Trail – GSheets keeps full history: raw idea, draft, final post, assets, notes.

**Why is this useful?**
Most "AI post generators" just spit out text. This workflow builds a real publishing pipeline:
🔄 No duplicates → keeps posts fresh & original.
🖼 Images included → auto-generated visuals increase engagement on LinkedIn.
📊 Audit-ready → every post has a traceable log in Sheets.
⚡ Fast iteration → from half-baked thought → polished post in minutes.

**Tools used**
- n8n (Orchestrator): Automates triggers, merges, retries, and Google connectors.
- OpenAI (LLM): Idea generation, drafting, fuzzy dedupe, and voice conformity.
- Google Sheets: Source of truth — stores ideas, dedupe logs, audit trail.
- Google Drive: Stores rendered images and shares links for publishing.
- Image Generation (DALL·E / SDXL): Creates header graphics from hook + body.

**Who is this for?**
🧑💻 Product Managers / Founders who want to post consistently but don't have time.
🎨 Creators who want to add unique visuals without hiring a designer.
⚙️ n8n Builders who want to see how AI + automation + storage can be stitched into one pipeline.

**Workflow Highlights**
✅ Full content pipeline (ideas → images → final copy).
✅ GPT-based fuzzy dedupe to avoid repetition.
✅ Auto-generated images for higher engagement.
✅ Clean logs in Google Sheets for future reuse & audits.
✅ Ready-to-publish LinkedIn post in minutes.
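A minimal sketch of the fuzzy-dedupe step, assuming previously used hooks are read from the Google Sheets log and the similarity judgment itself is delegated to the LLM; the prompt wording and field names are illustrative, not the template's actual nodes.

```javascript
// n8n Code node: build the dedupe prompt for the LLM from the GSheet log.
// `previousHooks` would come from a Google Sheets read node; `draftHook` from the new brief.
const previousHooks = $json.previousHooks; // e.g. ['Why PMs ship slower in Q4', ...]
const draftHook = $json.draftHook;

const prompt = [
  'You are checking a new LinkedIn post hook against previously published hooks.',
  'Reply with JSON only: {"duplicate": true|false, "closestMatch": "<hook or empty>"}.',
  `New hook: ${draftHook}`,
  'Previous hooks:',
  ...previousHooks.map((h, i) => `${i + 1}. ${h}`),
].join('\n');

return [{ json: { prompt } }];
```

An IF node on the parsed `duplicate` flag can then either send the idea back for a fresh angle or let it continue to drafting.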
by Charles
🚀 Market Research Analytics System

> Transform Google Maps data into actionable business insights with AI-powered analysis

📋 Overview
This n8n workflow automatically collects business data from Google Maps, analyzes customer reviews using AI, and generates comprehensive market research reports delivered straight to your inbox.

🎯 Use Cases
| User Type | Example Usage |
|---------------|-------------------|
| 🏢 Business Owners | Analyze competition before opening new location |
| 💼 Entrepreneurs | Research market gaps and opportunities |
| 📊 Marketing Teams | Understand customer sentiment and preferences |
| 💰 Investors | Evaluate market potential in target areas |
| 🔍 Consultants | Create detailed market reports for clients |

🛠️ Customization Examples

Different Business Types
```
// Restaurants
{"search_query": "restaurants downtown", "analysis_focus": "restaurant"}
// Hotels
{"search_query": "hotels near airport", "analysis_focus": "hospitality"}
// Fitness Centers
{"search_query": "gyms and fitness centers", "analysis_focus": "fitness"}
```

Multiple Cities
```
// New York
{"search_location": "@40.7589,-73.9851,12z", "city_name": "New York City"}
// London
{"search_location": "@51.5074,-0.1278,12z", "city_name": "London"}
// Tokyo
{"search_location": "@35.6762,139.6503,12z", "city_name": "Tokyo"}
```

📊 What You'll Get
Your automated report includes:

🎯 Executive Summary
- Key market insights in 3-4 sentences
- Biggest business opportunities identified
- Immediate action recommendations

📈 Market Analysis
- Competition density and market gaps
- Price segments and quality distribution
- Geographic hotspots and trends

💬 Customer Intelligence
- Top 5 factors customers value most
- Common complaints and pain points
- Overall sentiment analysis

🏆 Competitive Landscape
- Strongest competitors identified
- Their strengths and weaknesses
- Positioning opportunities

💡 Strategic Recommendations
- Optimal business model suggestions
- Pricing and marketing strategies
- 30/90/180-day action plans

⚙️ Technical Requirements
| Service | Cost | Purpose |
|-------------|----------|-------------|
| SerpAPI | Free tier (100 searches/month) | Google Maps data extraction |
| Google Gemini | Free tier available | AI-powered analysis |
| Gmail | Free | Report delivery |
| n8n | Cloud or self-hosted | Workflow automation |

🚨 Important Notes
- **API Limits:** Free tiers have monthly limits - monitor usage
- **Data Accuracy:** Results depend on Google Maps data availability
- **Processing Time:** Analysis may take 2-5 minutes depending on data volume
- **Language Support:** Works with multiple languages (update language_code)

🔧 Troubleshooting
| Issue | Solution |
|-----------|--------------|
| No results found | Check coordinates format and search query |
| API errors | Verify API keys are correctly configured |
| Email not received | Check Gmail credentials and recipient address |
| Slow processing | Normal for large datasets (20+ businesses) |

🚀 Pro Tips
- 🎯 **Be Specific:** Use targeted search queries like "vegan restaurants" vs "restaurants"
- 📅 **Schedule Runs:** Set up recurring analysis to track market changes
- 🌍 **Multi-Location:** Run for multiple cities to compare markets
- 📱 **Mobile-First:** Reports are mobile-responsive for on-the-go reading
- 🔄 **Iterate:** Refine search parameters based on initial results

Ready to unlock your market insights? Configure your parameters and execute the workflow!
by Robert Breen
This n8n workflow automates bulk AI image generation using Freepik's Text-to-Image API. It reads prompts from a Google Sheet, generates multiple variations of each image using Freepik's AI, and automatically uploads the results to Google Drive with organized file names. This is perfect for content creators, marketers, or designers who need to generate multiple AI images in bulk and store them systematically.

Key Features:
- Bulk image generation from Google Sheets prompts
- Multiple variations per prompt (configurable duplicates)
- Automatic file naming and organization
- Direct upload to Google Drive
- Batch processing for efficient API usage
- Freepik AI-powered image generation

Step-by-Step Implementation Guide

Prerequisites
Before setting up this workflow, you'll need:
- n8n instance (cloud or self-hosted)
- Freepik API account with Text-to-Image access
- Google account with access to Sheets and Drive
- Google Sheet with your prompts

Step 1: Set Up Freepik API Credentials
1. Go to Freepik API Developer Portal
2. Create an account or sign in
3. Navigate to your API dashboard
4. Generate an API key for Text-to-Image service
5. Copy the API key and save it securely
6. In n8n, go to Credentials → Add Credential → HTTP Header Auth
7. Configure as follows:
   - Name: "Header Auth account"
   - Header Name: x-freepik-api-key
   - Header Value: Your Freepik API key

Step 2: Set Up Google Credentials

Google Sheets Access:
1. Go to Google Cloud Console
2. Create a new project or select existing one
3. Enable Google Sheets API
4. Create OAuth2 credentials
5. In n8n, go to Credentials → Add Credential → Google Sheets OAuth2 API
6. Enter your OAuth2 credentials and authorize with spreadsheets.readonly scope

Google Drive Access:
1. In Google Cloud Console, enable Google Drive API
2. In n8n, go to Credentials → Add Credential → Google Drive OAuth2 API
3. Enter your OAuth2 credentials and authorize

Step 3: Create Your Google Sheet
1. Create a new Google Sheet: sheets.google.com
2. Set up your sheet with these columns:
   - Column A: Prompt (your image generation prompts)
   - Column B: Name (identifier for file naming)

Example data:
| Prompt | Name |
|-------------------------------------------|-------------|
| A serene mountain landscape at sunrise | mountain-01 |
| Modern office space with natural lighting | office-02 |
| Cozy coffee shop interior | cafe-03 |

3. Copy the Sheet ID from the URL (the long string between /d/ and /edit)

Step 4: Set Up Google Drive Folder
1. Create a folder in Google Drive for your generated images
2. Copy the Folder ID from the URL when viewing the folder
Note: The workflow is configured to use a folder called "n8n workflows"

Step 5: Import and Configure the Workflow
1. Copy the provided workflow JSON
2. In n8n, click Import from File or Import from Clipboard
3. Paste the workflow JSON
4. Configure each node as detailed below.

Node Configuration Details:

Start Workflow (Manual Trigger)
- No configuration needed
- Used to manually start the workflow

Get Prompt from Google Sheet (Google Sheets)
- **Document ID**: Your Google Sheet ID (from Step 3)
- **Sheet Name**: Sheet1 (or your sheet name)
- **Operation**: Read
- **Credentials**: Select your "Google Sheets account"

Double Output (Code Node)
- **Purpose**: Creates multiple variations of each prompt
- **JavaScript Code**:
```javascript
const original = items[0].json;
return [
  { json: { ...original, run: 1 } },
  { json: { ...original, run: 2 } },
];
```
- **Customization**: Add more runs for additional variations

Loop (Split in Batches)
- Processes items in batches to manage API rate limits
- **Options**: Keep default settings
- **Reset**: false

Create Image (HTTP Request)
- **Method**: POST
- **URL**: https://api.freepik.com/v1/ai/text-to-image
- **Authentication**: Generic → HTTP Header Auth
- **Credentials**: Select your "Header Auth account"
- **Send Body**: true
- **Body Parameters**:
  - Name: prompt
  - Value: ={{ $json.Prompt }}

Split Responses (Split Out)
- **Field to Split Out**: data
- **Purpose**: Separates multiple images from API response

Convert to File (Convert to File)
- **Operation**: toBinary
- **Source Property**: base64
- **Purpose**: Converts base64 image data to file format

Upload Image to Google Drive (Google Drive)
- **Operation**: Upload
- **Name**: =Image - {{ $('Get Prompt from Google Sheet').item.json.Name }} - {{ $('Double Output').item.json.run }}
- **Drive ID**: My Drive
- **Folder ID**: Your Google Drive folder ID (from Step 4)
- **Credentials**: Select your "Google Drive account"

Step 6: Customize for Your Use Case
- Modify Duplicate Count: Edit the "Double Output" code to create more variations
- Update File Naming: Change the naming pattern in the Google Drive upload node
- Adjust Batch Size: Modify the Loop node settings for your API limits
- Add Image Parameters: Enhance the HTTP request with additional Freepik parameters (size, style, etc.)

Step 7: Test the Workflow
1. Ensure your Google Sheet has test data
2. Click Execute Workflow on the manual trigger
3. Monitor the execution flow
4. Check that images are generated and uploaded to Google Drive
5. Verify file names match your expected pattern

Step 8: Production Deployment
- Set up error handling for API failures
- Configure appropriate batch sizes based on your Freepik API limits
- Add logging for successful uploads
- Consider webhook triggers for automated execution
- Set up monitoring for failed executions

Freepik API Parameters

Basic Parameters:
- prompt: Your text description (required)
- negative_prompt: What to avoid in the image
- guidance_scale: How closely to follow the prompt (1-20)
- num_inference_steps: Quality vs speed trade-off (20-100)
- seed: For reproducible results

Example Enhanced Body:
```json
{
  "prompt": "{{ $json.Prompt }}",
  "negative_prompt": "blurry, low quality",
  "guidance_scale": 7.5,
  "num_inference_steps": 50,
  "num_images": 1
}
```

Workflow Flow Summary
1. Start → Manual trigger initiates the workflow
2. Read Sheet → Gets prompts and names from Google Sheets
3. Duplicate → Creates multiple runs for variations
4. Loop → Processes items in batches
5. Generate → Freepik API creates images from prompts
6. Split → Separates multiple images from response
7. Convert → Transforms base64 to binary file format
8. Upload → Saves images to Google Drive with organized names
9. Complete → Returns to loop for next batch

Contact Information
Robert A Ynteractive
For support, customization, or questions about this workflow:
📧 Email: rbreen@ynteractive.com
🌐 Website: https://ynteractive.com/
💼 LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
Need help implementing this workflow or want custom automation solutions? Get in touch for professional n8n consulting and workflow development services.
by Kevin Meneses
**What this workflow does**
This workflow automates end-to-end stock analysis using real market data and AI:
1. Reads a list of stock tickers from Google Sheets
2. Fetches fundamental data (valuation, growth, profitability) and OHLCV price data from EODHD APIs
3. Computes key technical indicators (RSI, SMA 20/50/200, volatility, support & resistance); see the sketch below
4. Uses an AI model to generate:
   - Buy / Watch / Sell recommendation
   - Entry price, stop-loss, and take-profit levels
   - Investment thesis, pros & cons
   - Fundamental quality score (1–10)
5. Stores the final structured analysis back into Google Sheets

This creates a repeatable, no-code stock analysis pipeline ready for decision-making or dashboards.

**Data source**
Market data is powered by EODHD APIs.

**How to configure this workflow**

1. Google Sheets (Input)
Create a sheet with a column called ticker (e.g. MSFT, AAPL, AMZN). Each row represents one stock to analyze.

2. EODHD APIs
- Create an EODHD account
- Get your API token
- Add it to the HTTP Request nodes as: api_token=YOUR_API_KEY

3. AI Model
- Configure your AI provider (OpenAI / compatible model)
- The AI receives: fundamentals, technical indicators, growth potential score
- It returns structured JSON with recommendations and trade levels

4. Google Sheets (Output)
Results are appended to a Signals tab with:
- Signal (BUY / WATCH / SELL)
- Entry, Stop Loss, Take Profit
- Fundamental score (1–10)
- Investment thesis and risk notes
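A minimal sketch of the indicator math, assuming daily closing prices (oldest to newest) have already been extracted from the EODHD price response into an array; the periods and the simplified, non-smoothed RSI averaging are illustrative and may differ from the template's exact implementation.

```javascript
// n8n Code node: compute SMA 20/50/200 and a simple 14-period RSI from closing prices.
// `closes` is oldest-to-newest; adjust the property path to your EODHD response shape.
const closes = $json.closes; // e.g. [412.3, 415.1, ...]

const sma = (period) => {
  const slice = closes.slice(-period);
  return slice.reduce((sum, c) => sum + c, 0) / slice.length;
};

// Simple (non-smoothed) RSI over the most recent window.
const rsi = (period = 14) => {
  const window = closes.slice(-(period + 1));
  let gains = 0, losses = 0;
  for (let i = 1; i < window.length; i++) {
    const diff = window[i] - window[i - 1];
    if (diff >= 0) gains += diff; else losses -= diff;
  }
  if (losses === 0) return 100;
  const rs = (gains / period) / (losses / period);
  return 100 - 100 / (1 + rs);
};

return [{
  json: {
    sma20: sma(20),
    sma50: sma(50),
    sma200: sma(200),
    rsi14: rsi(14),
  },
}];
```

These computed fields are what get merged with the fundamentals before the AI node builds its recommendation.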
by Rahul Joshi
📊 Description
This workflow automates the daily HR standup by continuously monitoring active hiring and HR tasks, identifying risks and blockers, and generating an intelligent, action-oriented summary using AI. Every morning, HR and leadership teams receive a clear overview of priorities, overdue items, and potential risks — without any manual preparation.
By combining structured task data with AI-driven analysis, the workflow ensures teams start the day aligned, informed, and ready to act. The automation is designed for real-world HR operations, scaling seamlessly as task volume grows while maintaining concise and consistent reporting.

🔁 What this automation does
- Automatically triggers every morning using a scheduled Cron trigger.
- Retrieves all HR and hiring-related tasks from a Monday.com board.
- Filters out completed items to focus only on active and relevant work (see the sketch below).
- Aggregates all remaining tasks into a single structured dataset.
- Analyzes task status, ownership, and due dates to identify blockers and risks.
- Uses AI (GPT-4o-mini) to generate a concise, structured daily standup report.
- Delivers the final standup update to HR via email for immediate visibility.

🧠 Key design decisions
- Uses a scheduled trigger to ensure consistent, hands-free execution
- Applies task filtering via a Code node to overcome API limitations
- Aggregates all tasks to produce a single, consolidated standup report
- Leverages AI for insight generation, not raw data repetition
- Excludes completed tasks to reduce noise and improve signal quality
- Prioritizes concise, actionable output suitable for leadership review

⭐ Key benefits
- Eliminates manual standup preparation
- Ensures overdue tasks and blockers are surfaced early
- Improves visibility and accountability across HR operations
- Saves daily operational time for HR and managers
- Produces consistent, professional summaries every day
- Scales efficiently as teams and task volume increase

🛠️ Tools & services used
- n8n – Workflow orchestration and automation
- Monday.com – HR and hiring task management
- OpenAI (GPT-4o-mini) – Intelligent analysis and summarization
- Gmail – Delivery of daily standup reports
- Cron – Scheduled execution

🔐 Requirements
- Monday.com OAuth credentials
- OpenAI API key
- Gmail OAuth credentials
- n8n (cloud or self-hosted)
- HR board with status and due date columns

🎯 Target audience
- HR and Talent Acquisition teams
- Hiring managers
- Operations and RevOps teams
- Startups and scaling organizations
- Automation teams building internal HR tooling
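A minimal sketch of the filtering Code node, assuming each Monday.com item arrives as one n8n item with its status text already extracted; the `status` property and the "Done" / "Completed" labels are placeholders for whatever your board actually uses.

```javascript
// n8n Code node: drop completed items so only active HR/hiring tasks reach the AI step.
// Assumes each incoming item carries the Monday.com status text on `json.status`.
const DONE_LABELS = ['Done', 'Completed'];   // adjust to your board's status labels

const active = $input.all().filter(item => {
  const status = (item.json.status || '').trim();
  return !DONE_LABELS.includes(status);
});

return active;
```

The surviving items are then aggregated into the single dataset that the GPT-4o-mini node summarizes.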