by Yaron Been
This workflow automatically analyzes website conversion funnels to identify optimization opportunities and track user journey performance. It saves you time by eliminating the need to manually analyze funnel metrics and provides detailed insights into conversion bottlenecks and improvement areas.

Overview
This workflow automatically scrapes website pages to analyze funnel elements including CTAs, tracking scripts, page structure, and conversion paths. It uses Bright Data to access websites without restrictions and AI to intelligently extract funnel data, identify conversion elements, and provide optimization recommendations.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping website pages without being blocked
- **OpenAI**: AI agent for intelligent funnel analysis and optimization insights
- **Google Sheets**: For storing funnel analysis data and recommendations

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your funnel analysis spreadsheet
5. Customize: Define target website URLs and funnel analysis parameters

Use Cases
- **Conversion Optimization**: Identify and fix conversion funnel bottlenecks
- **UX Analysis**: Analyze user experience and journey optimization opportunities
- **Competitor Research**: Study competitor funnel strategies and implementations
- **A/B Testing**: Monitor funnel performance changes over time

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #funnelanalysis #conversionoptimization #brightdata #webscraping #uxanalysis #n8nworkflow #workflow #nocode #websiteanalysis #funneloptimization #conversiontracking #userjourney #websiteoptimization #cro #digitalmarketing #funnelalyzer #websiteperformance #conversionanalytics #uxresearch #websitemetrics #funnelmonitoring #performanceanalysis #websiteinsights #conversionfunnel #userexperience #websiteaudit #funneltracking #optimizationanalysis
by Yaron Been
Description
This workflow automatically monitors sneaker prices across multiple retailers and sends you alerts when prices drop on your favorite models. It helps sneaker enthusiasts and collectors find the best deals without constantly checking multiple websites.

Overview
This workflow automatically monitors sneaker prices across multiple retailers and sends you alerts when prices drop. It uses Bright Data to scrape sneaker websites and can notify you through various channels when your desired models go on sale.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping sneaker retailer websites without getting blocked.
- **Notification Services:** Email, SMS, or other messaging platforms.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: Configure your preferred notification method.
4. Customize: Add the sneaker models you want to track and your price thresholds.

Use Cases
- **Sneaker Collectors:** Get notified when rare models drop in price.
- **Resellers:** Find profitable buying opportunities.
- **Budget Shoppers:** Wait for the best deals on your favorite sneakers.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #sneakers #pricealerts #brightdata #webscraping #sneakerdeals #sneakermonitor #pricedrop #sneakerhead #n8nworkflow #workflow #nocode #sneakersales #dealfinder #sneakermarket #pricetracking #sneakerprices #shoealerts #sneakercollector #sneakershopping #reselling #sneakerreseller #dealnotifications #sneakerautomation #shoedeals
by Yaron Been
Description
This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It helps crypto traders and investors identify buying opportunities without constantly watching the markets.

Overview
This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It uses Bright Data to scrape real-time price data and can be configured to notify you through various channels.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping cryptocurrency exchange data without getting blocked.
- **Notification Services:** Email, SMS, Telegram, or other messaging platforms.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: Configure your preferred notification method.
4. Customize: Set your price thresholds, monitoring frequency, and which exchanges to track (a sketch of the threshold logic follows this listing).

Use Cases
- **Crypto Traders:** Get notified of buying opportunities during price dips.
- **Investors:** Monitor your crypto investments and make informed decisions.
- **Financial Analysts:** Track Bitcoin price movements for market analysis.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #bitcoin #cryptocurrency #brightdata #pricealerts #cryptotrading #bitcoinalerts #cryptoalerts #cryptomonitoring #n8nworkflow #workflow #nocode #cryptoinvesting #bitcoinprice #cryptomarket #tradingalerts #cryptotools #bitcointrading #pricemonitoring #cryptoautomation #bitcoininvestment #cryptotracker #marketalerts #tradingopportunities #cryptoprices
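For the customization step, here is a minimal sketch of the price-drop check you might put in an n8n Code node between the scraper and the notification step. The field names (`price`, `previousPrice`) and the 5% threshold are illustrative assumptions, not values from the template.

```javascript
// n8n Code node (Run Once for All Items): keep only items whose price
// dropped by at least the threshold, and annotate them for the alert.
// Field names and the threshold are assumptions -- adapt them to your data.
const DROP_THRESHOLD = 0.05; // alert on a drop of 5% or more

return $input.all().filter((item) => {
  const { price, previousPrice } = item.json;
  if (!price || !previousPrice) return false;
  const drop = (previousPrice - price) / previousPrice;
  item.json.dropPercent = +(drop * 100).toFixed(2); // e.g. 6.25
  return drop >= DROP_THRESHOLD; // only significant drops reach the alert
});
```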
by Yaron Been
Description
This workflow automatically collects and organizes research papers from academic databases and journals into Google Sheets. It helps researchers and students save time by eliminating manual searches across multiple academic sources and centralizing research materials.

Overview
This workflow automatically scrapes research papers from academic databases and journals, then organizes them in Google Sheets. It uses Bright Data to access academic sources and extracts key information like titles, authors, abstracts, and citations.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping academic websites and research databases without getting blocked.
- **Google Sheets:** For organizing and storing research paper data.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Connect Google Sheets: Authenticate your Google account.
4. Customize: Specify research topics, journals, or authors to track.

Use Cases
- **Academic Researchers:** Stay updated on new papers in your field.
- **Students:** Collect research for literature reviews and dissertations.
- **Research Teams:** Collaborate on literature databases.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #research #academicpapers #brightdata #googlesheets #researchpapers #academicresearch #literaturesearch #scholarlyarticles #n8nworkflow #workflow #nocode #researchautomation #academicscraping #researchtools #papertracking #academicjournals #researchdatabase #literaturereview #academicwriting #datascraping #researchorganization #scholarlyresearch #citationmanagement #academicproductivity
by victor de coster
This template makes Dropcontact batch requests of up to 250 requests every 10 minutes (1,500/hour) — valuable when high-volume email enrichment is expected. Dropcontact will look up the email and provide basic email qualification if first_name, last_name, and company_name are provided.

Step 1: Node "Profiles Query"
Connect your own source (Airtable, Google Sheets, Supabase, ...); the template uses Postgres by default.
Note I: Make sure your source returns a maximum of 250 items.
Note II: The next node uses the following variables, so make sure you can map them from your source: first_name, last_name, website (company_name would work too), full_name (see note below).
Note III: This template uses the Dropcontact Batch API, which works in a POST & GET setup — not a GET request only — because Dropcontact needs time to process the batch data load properly.

Step 2: Node "Data Transformation"
Transforms the input variables into the JSON format the Dropcontact API expects for a batch request (see the sketch after these steps). "full_name" is used as a custom identifier to match the returned email back to the proper contact in your source database. To make things easy, use a unique identifier in the full_name variable.

Step 3: Node "Bulk Dropcontact Requests"
Enter your Dropcontact credentials in the node.

Step 4: Connect your output source by mapping the data you want to use.

Step 5: Node "Slack" (OPTIONAL)
Connect your Slack account; if an error occurs, you will be notified.

TIP: Try to run the workflow with a batch of 10 (not 250) first, as it may need an initial run before you can map the data to your final destination. Once the data fields are properly mapped, adjust back to 250.
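The "Data Transformation" step can be pictured as the following n8n Code node sketch, which shapes the source rows into a batch payload. The exact body Dropcontact expects should be verified against its Batch API documentation; the field names below follow the variables listed in Step 1, and carrying full_name in custom_fields is the identifier trick described above.

```javascript
// n8n Code node sketch: build the JSON body for a Dropcontact batch
// request from up to 250 source rows. Verify the payload shape against
// the Dropcontact Batch API docs before relying on it.
const data = $input.all().slice(0, 250).map((item) => ({
  first_name: item.json.first_name,
  last_name: item.json.last_name,
  website: item.json.website, // company_name would work too
  custom_fields: { full_name: item.json.full_name }, // unique row identifier
}));

return [{ json: { data } }];
```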
by Akhil Varma Gadiraju
Workflow: HubSpot Contact Email Validation with Hunter.io

Overall Goal
This workflow retrieves contacts from HubSpot that have an email address but haven't yet had their email validated by Hunter. It then iterates through each of these contacts, uses Hunter.io to verify their email, updates the contact record in HubSpot with the validation status and date, and finally sends a summary email notification upon completion.

How it Works (Step-by-Step Breakdown)

1. Node: "When clicking ‘Test workflow’" (Manual Trigger)
- **Type:** n8n-nodes-base.manualTrigger
- **Purpose:** Start the workflow manually via the n8n interface.
- **Output:** Triggers workflow execution.

2. Node: "HubSpot" (HubSpot)
- **Type:** n8n-nodes-base.hubspot
- **Purpose:** Fetch contacts from HubSpot.
- **Configuration:** Authentication: App Token; Operation: Search for contacts; Return All: true; Filter Groups: Contact HAS_PROPERTY email, Contact NOT_HAS_PROPERTY hunter_email_validation_status.
- **Output:** List of contact objects.

3. Node: "Loop Over Items" (SplitInBatches)
- **Type:** n8n-nodes-base.splitInBatches
- **Purpose:** Process each contact one by one.
- **Configuration:** Options > Reset: false.
- **Output:** Output 1 to "Hunter"; Output 2 to "Send Email".

4. Node: "Hunter" (inside the loop)
- **Type:** n8n-nodes-base.hunter
- **Purpose:** Verify the email with Hunter.io.
- **Configuration:** Operation: Email Verifier; Email: {{ $json.properties.email }}

5. Node: "Add Hunter Details (Contact)" (HTTP Request, inside the loop)
- **Type:** n8n-nodes-base.httpRequest
- **Purpose:** Update the HubSpot contact.
- **Configuration:** Method: PATCH; URL: https://api.hubapi.com/crm/v3/objects/contacts/{{ $('Loop Over Items').item.json.id }}; Headers: Content-Type: application/json; Body (JSON): { "properties": { "hunter_email_validation_status": "{{ $json.status }}", "hunter_verification_date": "{{ $now.format('yyyy-MM-dd') }}" } }

6. Node: "Wait" (inside the loop)
- **Type:** n8n-nodes-base.wait
- **Purpose:** Avoid API rate limits.
- **Configuration:** Wait for 1 second.

7. Node: "Replace Me" (NoOp, inside the loop)
- **Type:** n8n-nodes-base.noOp
- **Purpose:** Junction node to complete the loop.

8. Node: "Send Email" (after the loop completes)
- **Type:** n8n-nodes-base.emailSend
- **Purpose:** Send a summary notification.
- **Configuration:** From Email: test@gmail.com; To Email: akhilgadiraju@gmail.com; Subject: "Email Verification Completed for Your HubSpot Contacts"; HTML: Formatted confirmation message.

Sticky Notes
- "HubSpot": Create the custom properties (hunter_email_validation_status, hunter_verification_date).
- "Add Hunter Details": Ensure field names match the HubSpot properties.
- "Wait": Prevent API rate limits.

How to Customize It
- **Trigger:** Replace the Manual Trigger with a Schedule Trigger (Cron) for automation; optionally use a HubSpot Trigger for new-contact events.
- **HubSpot node:** Create matching custom properties; adjust filters and returned properties as needed.
- **Hunter node:** Minimal customization needed.
- **HTTP Request node:** Update the JSON property names if you rename them in HubSpot; customize the date format as needed.
- **Wait node:** Adjust the wait time to balance speed and API safety.
- **Email node:** Customize the email addresses, subject, and body; add a dynamic contact count with a Set or Code node (a sketch follows this section).

Error Handling
- Add Error Trigger nodes.
- Use If nodes inside the loop to act on certain statuses.

Use Cases
- Clean your email list.
- Enrich CRM data.
- Prep verified lists for campaigns.
- Automate contact hygiene on a schedule.

Required Credentials
- **HubSpot App Token** — used by the HubSpot node and the HTTP Request node. Create a Private App in HubSpot with the required scopes.
- **Hunter API** — used by the Hunter node.
- **SMTP** — used by the Email Send node. Configure host, port, username, and password.

Made with ❤️ using n8n by Akhil.
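For the dynamic contact count mentioned above, something like this Code node could sit between the loop's "done" output and "Send Email". The node name and output shape are illustrative assumptions, not taken from the template; note that in a loop context, `$('Node').all()` may only return the final batch, so verify against your execution data.

```javascript
// n8n Code node sketch: derive a contact count for the summary email.
// Node name and fields are assumptions -- adapt to your workflow.
const processed = $('Loop Over Items').all(); // items that went through the loop

return [{
  json: {
    contactCount: processed.length,
    finishedAt: new Date().toISOString(),
  },
}];
```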
by Akhil Varma Gadiraju
AI-Powered GitHub Commit Reviewer

Overview
Workflow Name: AI-Powered GitHub Commit Reviewer
Author: Akhil
Purpose: This n8n workflow triggers on a GitHub push event, fetches commit diffs, formats them into HTML, runs an AI-powered code review using a Groq LLM, and sends a detailed review via email.

How It Works (Step-by-Step)

1. GitHub Trigger Node
- **Type:** n8n-nodes-base.githubTrigger
- **Purpose:** Initiates the workflow on GitHub push events.
- **Repo:** akhilv77/relevance
- **Output:** JSON with commit and repo details.

2. Parser Node
- **Type:** n8n-nodes-base.set
- **Purpose:** Extracts key info (repo ID, name, commit SHA, file changes).

3. HTTP Request Node
- **Type:** n8n-nodes-base.httpRequest
- **Purpose:** Fetches commit diff details using the GitHub API.
- **Auth:** GitHub OAuth2 API.

4. Code (HTML Formatter) Node
- **Type:** n8n-nodes-base.code
- **Purpose:** Formats commit info and diffs into styled HTML (a simplified sketch follows this section).
- **Output:** HTML report of commit details.

5. Groq Chat Model Node
- **Type:** @n8n/n8n-nodes-langchain.lmChatGroq
- **Purpose:** Provides the AI model (llama-3.1-8b-instant).

6. Simple Memory Node
- **Type:** @n8n/n8n-nodes-langchain.memoryBufferWindow
- **Purpose:** Maintains memory context for the AI agent.

7. AI Agent Node
- **Type:** @n8n/n8n-nodes-langchain.agent
- **Purpose:** Executes the AI-based code review.
- **Prompt:** Reviews for bugs, style, grammar, and security; outputs styled HTML.

8. Output Parser Node
- **Type:** n8n-nodes-base.code
- **Purpose:** Combines the commit HTML with the AI review into one HTML block.

9. Gmail Node
- **Type:** n8n-nodes-base.gmail
- **Purpose:** Sends the review report via email.
- **Recipient:** akhilgadiraju@gmail.com

10. End Workflow Node
- **Type:** n8n-nodes-base.noOp
- **Purpose:** Marks the end.

Customization Tips
- **GitHub Trigger:** Change the repo/owner or trigger events.
- **HTTP Request:** Modify the endpoint to get specific data.
- **AI Agent:** Update the prompt to focus on different review aspects.
- **Groq Model:** Swap for other supported LLMs if needed.
- **Memory:** Use a dynamic session key for per-commit reviews.
- **Email:** Change the recipient or email styling.

Error Handling
Use Error Trigger nodes to handle failures in:
- GitHub API requests
- LLM generation
- Email delivery

Use Cases
- Instant AI-powered feedback on code pushes.
- Pre-human review suggestions.
- Security and standards enforcement.
- Developer onboarding assistance.

Required Credentials

| Credential | Used By | Notes |
|-----------|---------|-------|
| GitHub API (ID PSygiwMjdjFDImYb) | GitHub Trigger | PAT with repo and admin:repo_hook |
| GitHub OAuth2 API | HTTP Request | OAuth2 token with repo scope |
| Groq - Akhil (ID HJl5cdJzjhf727zW) | Groq Chat Model | API Key from GroqCloud |
| Gmail OAuth2 - Akhil (ID wqFUFuFpF5eRAp4E) | Gmail | Gmail OAuth2 for sending email |

Final Note
Made with ❤️ using n8n by Akhil.
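As a rough illustration of step 4, here is a minimal Code node sketch that turns GitHub's "get a commit" response into an HTML block. The response fields (`sha`, `commit.message`, `files[].filename`, `files[].patch`) follow GitHub's REST API; the markup is a bare placeholder, not the workflow's actual styled template.

```javascript
// n8n Code node sketch: format a GitHub commit response as simple HTML.
// Field names follow GitHub's "get a commit" REST endpoint; the markup
// below is a placeholder, not the template's real styling.
const commit = $input.first().json;
const esc = (s) => String(s).replace(/[&<>]/g, (c) => ({ '&': '&amp;', '<': '&lt;', '>': '&gt;' }[c]));

const fileSections = (commit.files || [])
  .map((f) => `<h4>${esc(f.filename)}</h4><pre>${esc(f.patch || '(no text diff available)')}</pre>`)
  .join('');

const html = `
  <h2>Commit ${esc((commit.sha || '').slice(0, 7))}</h2>
  <p>${esc(commit.commit?.message || '')}</p>
  ${fileSections}`;

return [{ json: { html } }];
```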
by Thibaud
Title: Automatic Strava Titles & Descriptions Generation with AI

Description:
This n8n workflow connects your Strava account to an AI to automatically generate personalized titles and descriptions for every new cycling activity. It leverages the native Strava trigger to detect new activities, extracts and formats ride data, then queries an AI agent (OpenRouter, ChatGPT, etc.) with an optimized prompt to get a catchy title and inspiring description. The workflow then updates the Strava activity in real time, with zero manual intervention.

Key Features:
- Secure connection to the Strava API (OAuth2)
- Automatic triggering for every new activity
- Intelligent data preparation and formatting
- AI-powered generation of personalized content (title + description)
- Instant update of the activity on Strava (the update call is sketched below)

Use Cases:
- Cyclists wanting to automatically enhance their Strava rides
- Sports content creators
- Community management automation for sports groups

Prerequisites:
- Strava account
- Strava OAuth2 credentials set up in n8n
- Access to a compatible AI agent (OpenRouter, ChatGPT, etc.)

Benefits:
- Saves time
- Advanced personalization
- Boosts the appeal of every ride to your community
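For reference, the final step is a call to Strava's "update activity" endpoint, which accepts the new name and description. In n8n this is typically a Strava or HTTP Request node using the OAuth2 credential; the plain-JavaScript sketch below only illustrates the request shape, and the token constant and example values are assumptions.

```javascript
// Illustration of the Strava "update activity" request the workflow makes.
// ACCESS_TOKEN and the example title/description are placeholders.
const ACCESS_TOKEN = 'YOUR_STRAVA_OAUTH_TOKEN';
const activityId = 1234567890; // taken from the Strava trigger payload

const res = await fetch(`https://www.strava.com/api/v3/activities/${activityId}`, {
  method: 'PUT',
  headers: {
    Authorization: `Bearer ${ACCESS_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    name: 'Sunrise Sprint Along the Seine',                                // AI-generated title
    description: 'An early ride rewarded by golden light over the river.', // AI-generated description
  }),
});
console.log(res.status); // 200 on success
```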
by JustinLee
This workflow demonstrates a simple Retrieval-Augmented Generation (RAG) pipeline in n8n, split into two main sections:

🔹 Part 1: Load Data into Vector Store
- Reads files from disk (or Google Drive).
- Splits content into manageable chunks using a recursive text splitter (a simplified sketch follows below).
- Generates embeddings using the Cohere Embedding API.
- Stores the vectors into an In-Memory Vector Store (for simplicity; can be replaced with Pinecone, Qdrant, etc.).

🔹 Part 2: Chat with the Vector Store
- Takes user input from a chat UI or trigger node.
- Embeds the query using the same Cohere embedding model.
- Retrieves similar chunks from the vector store via similarity search.
- Uses a Groq-hosted LLM to generate a final answer based on the context.

🛠️ Technologies Used:
- 📦 Cohere Embedding API
- ⚡ Groq LLM for fast inference
- 🧠 n8n for orchestrating and visualizing the flow
- 🧲 In-Memory Vector Store (for prototyping)

🧪 Usage:
1. Upload or point to your source documents.
2. Embed them and populate the vector store.
3. Ask questions through the chat trigger node.
4. Receive context-aware responses based on retrieved content.
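To make the chunking step concrete, here is a simplified JavaScript sketch of what a recursive text splitter does: try the coarsest separator first and recurse with finer ones until every chunk fits. Real splitters (such as the LangChain node n8n uses) also re-merge small pieces and add chunk overlap; those refinements are omitted here.

```javascript
// Simplified recursive text splitter: paragraph -> line -> sentence -> word.
// Production splitters also re-merge small pieces and add chunk overlap.
function recursiveSplit(text, maxLen = 500, seps = ['\n\n', '\n', '. ', ' ']) {
  if (text.length <= maxLen) return [text];
  const [sep, ...rest] = seps;
  if (sep === undefined) {
    // No separators left: hard-cut into fixed-size slices.
    const chunks = [];
    for (let i = 0; i < text.length; i += maxLen) chunks.push(text.slice(i, i + maxLen));
    return chunks;
  }
  // Split on the current separator and recurse with the finer ones.
  return text.split(sep).flatMap((part) => recursiveSplit(part, maxLen, rest));
}

console.log(recursiveSplit('A long document...\n\nwith several paragraphs.', 20));
```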
by Friedemann Schuetz
Welcome to my VEO3 Video Generator Workflow!

This automated workflow transforms simple text descriptions into professional 8-second videos using Google's cutting-edge VEO3 AI model. Users submit video ideas through a web form, and the system automatically generates optimized prompts, creates high-quality videos with native audio, and delivers them via Google Drive, all powered by Claude 4 Sonnet for intelligent prompt optimization.

This workflow has the following sequence:
1. VEO3 Generator Form: Web form interface for users to input video content, format, and duration.
2. Video Prompt Generator: AI agent powered by Claude 4 Sonnet that:
   - Analyzes user input for video content requirements
   - Creates factual, professional video titles
   - Generates detailed VEO3 prompts with subject, context, action, style, camera motion, composition, ambiance, and audio elements
   - Optimizes prompts for 16:9 landscape format and 8-second duration
3. Create VEO3 Video: Submits the optimized prompt to the fal.ai VEO3 API for video generation.
4. Wait 30 seconds: Initial waiting period for video processing to begin.
5. Check VEO3 Status: Monitors the video generation status via the fal.ai API.
6. Video completed?: Decision node that checks whether video generation is finished. If not completed, it returns to the wait cycle; if completed, it proceeds to video retrieval. (The polling pattern is sketched at the end of this listing.)
7. Get VEO3 Video URL: Retrieves the final video download URL from fal.ai.
8. Download VEO3 Video: Downloads the generated MP4 video file.
9. Merge: Combines video data with metadata for final processing.
10. Save Video to Google Drive: Uploads the video to the specified Google Drive folder.
11. Video Output: Displays a completion message with the Google Drive link to the user.

The following accesses are required for the workflow:
- **Anthropic API** (Claude 4 Sonnet): Documentation
- **Fal.ai API** (VEO3 Model): Create an API key at https://fal.ai/dashboard/keys
- **Google Drive API**: Documentation

Workflow Features:
- **User-friendly web form**: Simple interface for video content input
- **AI-powered prompt optimization**: Claude 4 Sonnet creates professional VEO3 prompts
- **Automatic video generation**: Leverages Google's VEO3 model via fal.ai
- **Status monitoring**: Real-time tracking of video generation progress
- **Google Drive integration**: Automatic upload and sharing of generated videos
- **Structured output**: Consistent video titles and professional prompt formatting
- **Audio optimization**: VEO3's native audio generation with ambient sounds and music

Current Limitations:
- **Format**: Only 16:9 landscape videos supported
- **Duration**: Only 8-second videos supported
- **Processing time**: Videos typically take 60-120 seconds to generate

Use Cases:
- **Content creation**: Generate videos for social media, websites, and presentations
- **Marketing materials**: Create promotional videos and advertisements
- **Educational content**: Produce instructional and explanatory videos
- **Prototyping**: Rapid video concept development and testing
- **Creative projects**: Artistic and experimental video generation
- **Business presentations**: Professional video content for meetings and pitches

Feel free to contact me via LinkedIn, if you have any questions!
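The wait/check/branch cycle (steps 4-6) is the standard pattern for long-running generation jobs. It is shown below as plain JavaScript for clarity; in the workflow it is built from Wait, HTTP Request, and IF nodes. The status URL field, the `Key` auth scheme, and the status values are assumptions drawn from fal.ai's queue API conventions — verify them against the current VEO3 docs.

```javascript
// Illustration of the polling loop built from the Wait / Check VEO3 Status /
// "Video completed?" nodes. Endpoint and status names are assumptions --
// check fal.ai's queue API docs for the exact VEO3 routes.
const FAL_KEY = 'YOUR_FAL_API_KEY'; // placeholder
const statusUrl = 'https://queue.fal.run/MODEL/requests/REQUEST_ID/status'; // from the submit response

let status = 'IN_QUEUE';
while (status !== 'COMPLETED') {
  await new Promise((r) => setTimeout(r, 30_000)); // the 30-second Wait node
  const res = await fetch(statusUrl, { headers: { Authorization: `Key ${FAL_KEY}` } });
  status = (await res.json()).status; // the "Video completed?" check
  // A production loop should also bail out on a FAILED status or a timeout.
}
// Once COMPLETED, fetch the result URL and download the MP4.
```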
by Samir Saci
Context
Hey! I'm Samir, a Supply Chain Data Scientist from Paris who spent six years studying and working in China while struggling to learn Mandarin. I know the challenges of mastering a complex language like Chinese, and flashcards were my greatest support. Therefore, I designed this workflow to support fellow Mandarin learners by automating flashcard creation using n8n, so they can focus more on learning and less on manual data entry.

📬 For business inquiries, you can add me on Here

Who is this template for?
This workflow template is designed for language learners and educators who want to automate the creation of flashcards for Mandarin (or any other language) using the Google Translate API, an AI agent for phonetic transcription and illustrative sentences, and a free image retrieval API.

Why?
If you use the open-source application Anki, this workflow will help you automatically generate personalized study materials.

How?
Let us imagine you want to learn how to say the word "Contract" in Mandarin. The workflow will automatically:
1. Translate the word into Simplified Mandarin (Mandarin: 合同).
2. Provide the phonetic transcription (Pinyin: Hétóng).
3. Generate an example sentence (Example: 我们签订了一份合同.)
4. Download an illustrative picture, for example a picture of a contract signature (the image lookup is sketched at the end of this listing).

All these fields are automatically recorded in a Google Sheet, making it easy to import into Anki and generate flashcards instantly.

What do I need to start?
This workflow can be used with the free-tier plans of the services involved. It does not require any advanced programming skills.

Prerequisites
- A Google Drive account with a folder including a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Google Translate API activated with OAuth2 credentials
- A free API key from pexels.com
- A Google Sheet with the required columns

Next
Follow the sticky notes to set up the parameters inside each node and get ready to boost your learning. I have detailed the steps in a short tutorial 👇
🎥 Check My Tutorial

Notes
- This workflow can be used for any language. In the AI Agent prompt, you just need to replace the word "pinyin" with "phonetic transcription".
- You can adapt the trigger to operate the workflow however you want: these operations can be performed in batch or triggered by Telegram, email, or webhook.
- If you want to learn more about how I used Anki flashcards to learn Mandarin: 🈷️ Blog Article about Anki Flash Cards

This workflow has been created with n8n 1.82.1
Submitted: March 17th, 2025
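The image step can be pictured with the Pexels search API (the service the free API key is for). The request shape below is standard Pexels usage; running it as inline JavaScript and the hard-coded example word are illustrative assumptions — in the template this is typically an HTTP Request node.

```javascript
// Sketch of the illustrative-image lookup via the Pexels search API.
// The Authorization header carries your Pexels API key; the example
// word stands in for the vocabulary term being processed.
const word = 'contract';
const res = await fetch(
  `https://api.pexels.com/v1/search?query=${encodeURIComponent(word)}&per_page=1`,
  { headers: { Authorization: 'YOUR_PEXELS_API_KEY' } }
);
const { photos } = await res.json();
const imageUrl = photos?.[0]?.src?.medium ?? null; // first matching photo, if any
console.log(imageUrl);
```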
by Rahul Joshi
Description
Automate your weekly social media analytics with this end-to-end AI reporting workflow. 📊🤖 This system collects real-time Twitter (X) and Facebook metrics, merges and validates data, formats it with JavaScript, generates an AI-powered HTML report via GPT-4o, saves structured insights in Notion, and shares visual summaries via Slack and Gmail. Perfect for marketing teams tracking engagement trends and performance growth. 🚀💬

What This Template Does
1️⃣ Starts manually or on-demand to fetch the latest analytics data. 🕹️
2️⃣ Retrieves follower, engagement, and post metrics from both X (Twitter) and Facebook APIs. 🐦📘
3️⃣ Merges and validates responses to ensure clean, complete datasets. 🔍
4️⃣ Runs custom JavaScript to normalize and format metrics into a unified JSON structure (a sketch of this step appears at the end of this listing). 🧩
5️⃣ Uses Azure OpenAI GPT-4o to generate a visually rich HTML performance report with tables, emojis, and insights. 🧠📈
6️⃣ Saves the processed analytics into a Notion “Growth Chart” database for centralized trend tracking. 🗂️
7️⃣ Sends an email summary report to the marketing team, complete with formatted HTML insights. 📧
8️⃣ Posts a concise Slack update comparing platform performance and engagement deltas. 💬
9️⃣ Logs any validation or API errors automatically into Google Sheets for debugging and traceability. 🧾

Key Benefits
✅ Centralizes all social metrics into a single automated flow.
✅ Delivers AI-generated HTML reports ready for email and dashboard embedding.
✅ Reduces manual tracking with Notion and Slack syncs.
✅ Ensures data reliability with built-in validation and error logging.
✅ Gives instant, visual insights for weekly marketing reviews.

Features
- Multi-platform analytics integration (Twitter X + Facebook Graph API).
- JavaScript node for dynamic data normalization.
- Azure OpenAI GPT-4o for HTML report generation.
- Notion database update for long-term trend storage.
- Slack and Gmail nodes for instant sharing and communication.
- Automated error capture to Google Sheets for workflow reliability.
- Visual, emoji-enhanced reporting with HTML formatting and insights.

Requirements
- Twitter OAuth2 API credentials for access to public metrics.
- Facebook Graph API access token for page insights.
- Azure OpenAI API key for GPT-4o report generation.
- Notion API credentials with write access to the “Growth Chart” database.
- Gmail OAuth2 credentials for report dispatch.
- Slack Bot Token with chat:write permission for posting analytics summaries.
- Google Sheets OAuth2 credentials for maintaining the error log.

Environment Variables
- TWITTER_API_KEY
- FACEBOOK_ACCESS_TOKEN
- AZURE_OPENAI_API_KEY
- NOTION_GROWTH_DB_ID
- GMAIL_REPORT_RECIPIENTS
- SLACK_REPORT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID

Target Audience
📈 Marketing and growth teams tracking cross-platform performance
💡 Social media managers needing automated reporting
🧠 Data analysts compiling weekly engagement metrics
💬 Digital agencies managing multiple brand accounts
🧾 Operations and analytics teams monitoring performance KPIs

Step-by-Step Setup Instructions
1️⃣ Connect all API credentials (Twitter, Facebook, Notion, Gmail, Slack, and Sheets).
2️⃣ Paste your Facebook Page ID and Twitter handle in the respective API nodes.
3️⃣ Verify your Azure OpenAI GPT-4o connection and the prompt text for HTML report generation.
4️⃣ Update your Notion database structure to match the “Growth Chart” property names.
5️⃣ Add your marketing email in the Gmail node and test delivery.
6️⃣ Specify the Slack channel ID where summaries will be posted.
7️⃣ Optionally, connect a Google Sheet tab for error tracking (error_id, message).
8️⃣ Execute the workflow once manually to validate the data flow.
9️⃣ Activate or schedule it for weekly or daily analytics automation. ✅
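As a rough picture of the normalization step (step 4), the Code node might merge the two API responses like this. The upstream node names and most input field names are assumptions about the Twitter/X and Facebook payloads, not values from the template.

```javascript
// n8n Code node sketch: merge Twitter/X and Facebook responses into one
// unified JSON structure for the GPT-4o report. Node names ('Twitter
// Metrics', 'Facebook Metrics') and input fields are assumptions.
const twitter = $('Twitter Metrics').first().json;
const facebook = $('Facebook Metrics').first().json;

return [{
  json: {
    generatedAt: new Date().toISOString(),
    platforms: [
      {
        name: 'X (Twitter)',
        followers: twitter.public_metrics?.followers_count ?? 0,
        posts: twitter.public_metrics?.tweet_count ?? 0,
      },
      {
        name: 'Facebook',
        followers: facebook.followers_count ?? 0,
        engagement: facebook.engagement?.count ?? 0,
      },
    ],
  },
}];
```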