by Rahul Joshi
Description

Automate your weekly social media analytics with this end-to-end AI reporting workflow. 📊🤖 This system collects real-time Twitter (X) and Facebook metrics, merges and validates the data, formats it with JavaScript, generates an AI-powered HTML report via GPT-4o, saves structured insights in Notion, and shares visual summaries via Slack and Gmail. Perfect for marketing teams tracking engagement trends and performance growth. 🚀💬

What This Template Does

1️⃣ Starts manually or on demand to fetch the latest analytics data. 🕹️
2️⃣ Retrieves follower, engagement, and post metrics from both the X (Twitter) and Facebook APIs. 🐦📘
3️⃣ Merges and validates responses to ensure clean, complete datasets. 🔍
4️⃣ Runs custom JavaScript to normalize and format metrics into a unified JSON structure. 🧩
5️⃣ Uses Azure OpenAI GPT-4o to generate a visually rich HTML performance report with tables, emojis, and insights. 🧠📈
6️⃣ Saves the processed analytics into a Notion "Growth Chart" database for centralized trend tracking. 🗂️
7️⃣ Sends an email summary report to the marketing team, complete with formatted HTML insights. 📧
8️⃣ Posts a concise Slack update comparing platform performance and engagement deltas. 💬
9️⃣ Logs any validation or API errors automatically into Google Sheets for debugging and traceability. 🧾

Key Benefits

✅ Centralizes all social metrics into a single automated flow.
✅ Delivers AI-generated HTML reports ready for email and dashboard embedding.
✅ Reduces manual tracking with Notion and Slack syncs.
✅ Ensures data reliability with built-in validation and error logging.
✅ Gives instant, visual insights for weekly marketing reviews.

Features

- Multi-platform analytics integration (Twitter/X + Facebook Graph API).
- JavaScript node for dynamic data normalization.
- Azure OpenAI GPT-4o for HTML report generation.
- Notion database update for long-term trend storage.
- Slack and Gmail nodes for instant sharing and communication.
- Automated error capture to Google Sheets for workflow reliability.
- Visual, emoji-enhanced reporting with HTML formatting and insights.

Requirements

- Twitter OAuth2 API credentials for access to public metrics.
- Facebook Graph API access token for page insights.
- Azure OpenAI API key for GPT-4o report generation.
- Notion API credentials with write access to the "Growth Chart" database.
- Gmail OAuth2 credentials for report dispatch.
- Slack bot token with the chat:write permission for posting analytics summaries.
- Google Sheets OAuth2 credentials for maintaining the error log.

Environment Variables

- TWITTER_API_KEY
- FACEBOOK_ACCESS_TOKEN
- AZURE_OPENAI_API_KEY
- NOTION_GROWTH_DB_ID
- GMAIL_REPORT_RECIPIENTS
- SLACK_REPORT_CHANNEL_ID
- GOOGLE_SHEET_ERROR_LOG_ID

Target Audience

📈 Marketing and growth teams tracking cross-platform performance
💡 Social media managers needing automated reporting
🧠 Data analysts compiling weekly engagement metrics
💬 Digital agencies managing multiple brand accounts
🧾 Operations and analytics teams monitoring performance KPIs

Step-by-Step Setup Instructions

1️⃣ Connect all API credentials (Twitter, Facebook, Notion, Gmail, Slack, and Sheets).
2️⃣ Paste your Facebook Page ID and Twitter handle into the respective API nodes.
3️⃣ Verify your Azure OpenAI GPT-4o connection and the prompt text for HTML report generation.
4️⃣ Update your Notion database structure to match the "Growth Chart" property names.
5️⃣ Add your marketing email in the Gmail node and test delivery.
6️⃣ Specify the Slack channel ID where summaries will be posted.
7️⃣ Optionally, connect a Google Sheet tab for error tracking (error_id, message).
8️⃣ Execute the workflow once manually to validate the data flow.
9️⃣ Activate or schedule it for weekly or daily analytics automation. ✅
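Step 4️⃣ above runs a JavaScript node to normalize the two platforms' metrics into one JSON structure. A minimal sketch of what that Code-node logic could look like — the field names (`followers_count`, `page_fans`, etc.) are assumptions for illustration, not the template's actual schema:

```javascript
// Hypothetical sketch of the metric-normalization step. Input field names
// are assumed; adjust them to match your actual Twitter/Facebook responses.
function normalizeMetrics(twitter, facebook) {
  const toInt = (v) => Number.parseInt(v, 10) || 0; // APIs often return strings
  return {
    generatedAt: new Date().toISOString(),
    platforms: {
      twitter: {
        followers: toInt(twitter.followers_count),
        engagements: toInt(twitter.engagement_total),
        posts: toInt(twitter.tweet_count),
      },
      facebook: {
        followers: toInt(facebook.page_fans),
        engagements: toInt(facebook.page_post_engagements),
        posts: toInt(facebook.post_count),
      },
    },
  };
}

// Example input shaped like the merged API responses
const report = normalizeMetrics(
  { followers_count: "1280", engagement_total: "340", tweet_count: "12" },
  { page_fans: "5400", page_post_engagements: "910", post_count: "8" }
);
```

In an n8n Code node, the two inputs would come from the merge node's items and the returned object would be passed on to the GPT-4o report-generation step.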
by Maximiliano Rojas-Delgado
Turn Your Ideas into Videos, Right from Google Sheets!

This workflow helps you make cool 5-second videos using Fal.AI and Kling 2.1, just by typing your idea into a Google Sheet. You can even choose whether you want your video to have sound. It's super easy, no tech skills needed! And the best part? It's 4x cheaper than the Veo3 model with similar quality!

Why use this?
- Just type your idea in a sheet, no fancy tools or uploads.
- Get a video link back in the same sheet.
- Works with or without sound, your choice!

How does it work?
- You write your idea, pick the video shape, and say whether you want sound (true or false) in the Google Sheet.
- n8n reads your idea and asks Fal.AI to make your video.
- When your video is ready, the link shows up in your sheet.

What do you need?
- A Google account and Google Sheets connected via a service account (check this link for reference)
- A copy of the following Google Spreadsheet: Spreadsheet to copy
- An OpenAI API key
- A Fal.AI account with some credit in it

That's it! Just add your ideas and let the workflow make the videos for you. Have fun creating! If you have any questions, just contact me at max@nervoai.com
by Yulia
Free template for voice & text messages with short-term memory

This n8n workflow template is a blueprint for an AI Telegram bot that processes both voice and text messages. Ready to use with minimal setup. The bot remembers the last several messages (10 by default), understands commands, and provides responses in HTML. You can easily swap GPT-4 and Whisper for other language and speech-to-text models to suit your needs.

Core Features
- Text: send or forward messages
- Voice: transcription via Whisper
- Extend this template by adding LangChain tools.

Requirements
- Telegram Bot API
- OpenAI API (for GPT-4 and Whisper)

💡 New to Telegram bots? Check our step-by-step guide on creating your first bot and setting up OpenAI access.

Use Cases
- Personal AI assistant
- Customer support automation
- Knowledge base interface
- Integration hub for the services you use: connect to any API via the HTTP Request Tool, or trigger other n8n workflows with the Workflow Tool
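The "last 10 messages" behavior is a sliding-window memory. The template itself would use a LangChain memory node for this, but the idea can be sketched in a few lines of plain JavaScript:

```javascript
// Illustrative sketch of a 10-message sliding-window memory, the same idea
// as the template's short-term memory (which uses a LangChain memory node,
// not this hand-rolled version).
const WINDOW_SIZE = 10;

function remember(history, role, content) {
  const next = [...history, { role, content }];
  return next.slice(-WINDOW_SIZE); // keep only the most recent messages
}

// Simulate 12 incoming messages; only the last 10 survive
let history = [];
for (let i = 1; i <= 12; i++) {
  history = remember(history, "user", `message ${i}`);
}
```

The trimmed `history` is what gets prepended to each GPT-4 call so the bot keeps conversational context without unbounded prompt growth.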
by Samir Saci
Context

Hey! I'm Samir, a Supply Chain Data Scientist from Paris who spent six years in China studying and working while struggling to learn Mandarin. I know the challenges of mastering a complex language like Chinese, and my greatest support was flashcards. That's why I designed this workflow: it supports fellow Mandarin learners by automating flashcard creation with n8n, so they can focus more on learning and less on manual data entry.

📬 For business inquiries, you can reach me Here

Who is this template for?

This workflow template is designed for language learners and educators who want to automate the creation of flashcards for Mandarin (or any other language) using the Google Translate API, an AI agent for phonetic transcription and example-sentence generation, and a free image-retrieval API.

Why?

If you use the open-source application Anki, this workflow will help you automatically generate personalized study materials.

How?

Imagine you want to learn how to say the word "Contract" in Mandarin. The workflow will automatically:
- Translate the word into Simplified Mandarin (Mandarin: 合同).
- Provide the phonetic transcription (Pinyin: Hétóng).
- Generate an example sentence (Example: 我们签订了一份合同.)
- Download an illustrative picture (for example, a picture of a contract signature).

All these fields are automatically recorded in a Google Sheet, making it easy to import into Anki and generate flashcards instantly.

What do I need to start?

This workflow works with the free-tier plans of the services used and does not require any advanced programming skills.

Prerequisites
- A Google Drive account with a folder containing a Google Sheet
- API credentials: Google Drive API, Google Sheets API, and Google Translate API activated with OAuth2 credentials
- A free API key from pexels.com
- A Google Sheet with the required columns

Next

Follow the sticky notes to set up the parameters inside each node and get ready to boost your learning.
I have detailed the steps in a short tutorial 👇
🎥 Check My Tutorial

Notes

This workflow can be used for any language. In the AI Agent prompt, you just need to replace the word "pinyin" with "phonetic transcription". You can also adapt the trigger to run the workflow however you want: these operations can be performed in batch or triggered by Telegram, email, or webhook.

If you want to learn more about how I used Anki flashcards to learn Mandarin:
🈷️ Blog Article about Anki Flash Cards

This workflow was created with n8n 1.82.1.
Submitted: March 17th, 2025
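Putting the example above together, the row written to the Google Sheet could be assembled like this — a hypothetical sketch, with column names mirroring the "Contract" example rather than the template's actual sheet schema:

```javascript
// Hypothetical sketch of assembling one flashcard row before it is appended
// to the Google Sheet. Column names are illustrative assumptions.
function buildFlashcardRow(word, translation, pinyin, example, imageUrl) {
  return {
    Word: word,          // source-language word
    Mandarin: translation, // from Google Translate API
    Pinyin: pinyin,        // from the AI agent
    Example: example,      // AI-generated sentence
    Image: imageUrl,       // from the Pexels API
  };
}

const row = buildFlashcardRow(
  "Contract",
  "合同",
  "Hétóng",
  "我们签订了一份合同.",
  "https://example.com/contract.jpg" // placeholder URL
);
```

Each such row maps one-to-one onto an Anki note when the sheet is imported.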
by Harshil Agrawal
This workflow appends, looks up, updates, and reads data from a Google Sheets spreadsheet.

Set node: The Set node is used to generate the data that we want to add to Google Sheets. Depending on your use case, your data might come from a different source. For example, you might be fetching data from a webhook call. Add the node that fetches the data you want to add to the Google Sheet, and then use the Set node to shape that data.

Google Sheets node: This node adds the data from the Set node as a new row in the Google Sheet. You will have to enter the Spreadsheet ID and the Range to specify which sheet you want to add the data to.

Google Sheets1 node: This node looks for a specific value in the Google Sheet and returns all the rows that contain that value. In this example, we are looking for the value Berlin in our Google Sheet. If you want to look for a different value, enter that value in the Lookup Value field, and specify the column in the Lookup Column field.

Set1 node: This Set node increases the rent by $100 for the houses in Berlin. We pass this new data to the next nodes in the workflow.

Google Sheets2 node: This node updates the rent for the houses in Berlin with the new rent set in the previous node. We are mapping the rows by their ID. Depending on your use case, you might want to map the values with a different column. To set this, enter the column name in the Key field.

Google Sheets3 node: This node returns the information from the Google Sheet. You can specify the columns that should get returned in the Range field. Currently, the node fetches the data for columns A to D. To fetch the data only for columns A to C, set the range to A:C.

This workflow can be broken down into separate workflows, each with its own use case. For example, one workflow could append new data to a Google Sheet, and another could look up a certain value and return it.
You can learn to build this workflow on the documentation page of the Google Sheets node.
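The Set1 step — raising the rent by $100 for every row the Berlin lookup returned — can be sketched in JavaScript. The column names (`id`, `city`, `rent`) are assumed from the example, not taken from an actual sheet:

```javascript
// Sketch of the Set1 logic: add $100 rent to each looked-up row.
// Column names are illustrative assumptions.
function increaseRent(rows, amount = 100) {
  // Return new objects so the original lookup results stay untouched
  return rows.map((row) => ({ ...row, rent: Number(row.rent) + amount }));
}

const berlinRows = [
  { id: 1, city: "Berlin", rent: 1200 },
  { id: 2, city: "Berlin", rent: 950 },
];
const updated = increaseRent(berlinRows);
```

The `updated` rows, keyed by `id`, are what the Google Sheets2 node would write back.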
by Yaron Been
This workflow provides automated access to the Alitas126 Alitas2 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides seamless integration of generation tasks into your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Alitas126 Alitas2 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Replicate API: Access to the Alitas126/alitas2 AI model
- Alitas126 Alitas2: The core AI model for generation tasks
- Built-in error handling: Automatic retry logic and comprehensive error management

How to Install
1. Import the workflow: Download the .json file and import it into your n8n instance
2. Configure the Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- Specialized processing: Handle specific AI tasks and workflows
- Custom automation: Implement unique business logic and processing
- Data processing: Transform and analyze various types of data
- AI integration: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API access: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Anthony
Disclaimer: This template only works on self-hosted n8n for now, as it uses a community node.

Use Case

Web scrapers often break due to web page layout changes. This workflow attempts to mitigate this problem by auto-generating web scraping extractor code via an LLM.

How It Works

This workflow leverages the ScrapeNinja n8n community node to:
1. scrape the webpage HTML,
2. feed it into an LLM (Google Gemini) and ask it to write a JS extractor function,
3. execute the generated JS extractor against the scraped HTML to extract useful data from the webpage (the code is safely executed in a sandbox).

Installation

To install the ScrapeNinja n8n node in your self-hosted instance, go to Settings -> Community Nodes, enter "n8n-nodes-scrapeninja", and install. Make sure you are using at least v0.3.0.

See this in action: https://www.linkedin.com/feed/update/urn:li:activity:7289659870935490560/
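To make the idea concrete, here is the kind of extractor function the LLM might generate: a plain JS function that takes raw HTML and returns structured records. Real LLM-generated extractors typically use an HTML parser such as cheerio; this regex-based stand-in just keeps the example dependency-free, and the selectors and field names are illustrative, not from any real page:

```javascript
// Illustrative stand-in for an LLM-generated extractor function of the
// shape that gets run in a sandbox against the scraped HTML.
// Markup pattern and field names are hypothetical.
function extract(html) {
  const items = [];
  const re = /<h2 class="title">([^<]+)<\/h2>\s*<span class="price">([^<]+)<\/span>/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    items.push({ title: m[1].trim(), price: m[2].trim() });
  }
  return items;
}

const sampleHtml = `
  <h2 class="title">Widget A</h2><span class="price">$9.99</span>
  <h2 class="title">Widget B</h2><span class="price">$14.50</span>`;
const data = extract(sampleHtml);
```

The point of the workflow is that when the page layout changes, the LLM regenerates this function from the new HTML instead of a developer rewriting it by hand.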
by Muhammad Zeeshan Ahmad
Platform: n8n (Telegram Bot Integration)
Purpose: Let users fetch top meme coin prices in real time using a simple /memecoin Telegram command

How It Works (Logic Breakdown)

This flow listens for a Telegram command and fetches data from the CoinGecko API to respond with live meme coin prices.

🔹 1. Telegram Trigger Node
Listens for incoming Telegram messages from users. Activated when a message is sent in a Telegram chat connected to the bot. Passes the raw message (e.g., /memecoin) to the next node.

🔹 2. IF Node – Check if Message is /memecoin
Condition: {{ $json.message.text }} === "/memecoin"
If true ➝ continue to fetch data from CoinGecko.
If false ➝ nothing happens.

🔹 3. HTTP Request – Fetch Meme Coins from CoinGecko
API: https://api.coingecko.com/api/v3/coins/markets?...category=meme-token
Fetches the top 5 meme tokens by market cap. The data includes:
- Name
- Symbol
- Current price (USD)
- Coin ID (for URL linking)

🔹 4. Function Node – Format the Message
Parses the JSON response from CoinGecko and builds a clean message like:

🚀 Dogecoin (DOGE)
💰 Price: $0.123
🔗 More: https://www.coingecko.com/en/coins/dogecoin

It loops through the top 5 meme coins and adds line breaks.

🔹 5. Telegram Send Node – Reply to User
Sends the formatted message to the original chat. Uses the chat_id from the trigger to ensure the correct user receives it.

🖼 Sample User Flow
👤 User types /memecoin in the Telegram bot
🤖 Bot fetches meme coin prices
📬 Bot replies with live prices + links
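A sketch of the Function node from step 4. The field names (`id`, `name`, `symbol`, `current_price`) match the shape of CoinGecko's /coins/markets response; the exact message layout is the author's, reproduced from the example above:

```javascript
// Sketch of the message-formatting Function node. Input objects follow
// CoinGecko's /coins/markets response shape.
function formatMemecoinMessage(coins) {
  return coins
    .map(
      (c) =>
        `🚀 ${c.name} (${c.symbol.toUpperCase()})\n` +
        `💰 Price: $${c.current_price}\n` +
        `🔗 More: https://www.coingecko.com/en/coins/${c.id}`
    )
    .join("\n\n"); // blank line between coins
}

// Sample data shaped like the API response
const sample = [
  { id: "dogecoin", name: "Dogecoin", symbol: "doge", current_price: 0.123 },
  { id: "shiba-inu", name: "Shiba Inu", symbol: "shib", current_price: 0.000018 },
];
const message = formatMemecoinMessage(sample);
```

The resulting string is passed straight to the Telegram Send node as the reply text.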
by Charles
Modern AI systems are powerful but pose privacy risks when handling sensitive data. Organizations need AI capabilities while ensuring:
✅ Sensitive data never leaves secure environments
✅ Compliance with regulations (GDPR, HIPAA, PCI, SOX)
✅ Real-time decision making about data sensitivity
✅ Comprehensive audit trails for regulatory review

The Concept: Intelligent Data Classification + Smart Routing

The goal of this concept is to lay the foundations for safe, compliant use of LLMs in agentic workflows by automatically detecting sensitive data, applying sanitization rules, and intelligently routing requests through secure processing channels. The workflow analyzes the user's chat or webhook input and attempts to detect PII using the Enhanced PII Pattern Detector. If PII is detected, the workflow processes the input through a series of compliance, auditing, and security steps that log and sanitize the request before any LLM is called.

Why Multi-Tier Routing?
Traditional systems use binary decisions (sensitive/not sensitive). Our 3-tier approach provides:
✅ Granular Security: Critical PII gets maximum protection
✅ Performance Optimization: Clean data gets full cloud capabilities
✅ Cost Efficiency: Expensive local processing only when needed
✅ User Experience: Maintains conversational flow across security levels

Why Context-Aware Detection?
Regex patterns alone miss contextual sensitivity. Our approach:
✅ Catches Intent: A "bank account" discussion is sensitive even without account numbers
✅ Reduces False Negatives: Medical discussions stay secure even without explicit medical IDs
✅ Proactive Protection: Identifies sensitive contexts before PII is shared
✅ Compliance Alignment: Matches how regulations actually define sensitive data

Why Risk Scoring vs. Binary Classification?
Binary PII detection creates artificial boundaries. Risk scoring provides:
✅ Nuanced Decisions: Multiple low-risk patterns might aggregate to high risk
✅ Adaptive Thresholds: Organizations can adjust sensitivity based on their needs
✅ Better UX: Users aren't unnecessarily restricted in low-risk scenarios
✅ Audit Transparency: Clear reasoning for every routing decision

Why Comprehensive Monitoring?
Privacy systems require trust and verification:
✅ Compliance Proof: Audit trails demonstrate regulatory compliance
✅ Performance Optimization: Identify bottlenecks and improve efficiency
✅ Security Validation: Ensure no sensitive data leakage occurs
✅ Operational Insights: Understand usage patterns and system health

How to Install

All you need for this workflow are credentials for your LLM providers, such as Ollama, OpenRouter, OpenAI, or Anthropic. The workflow is customizable and lets you choose the best LLM and storage/memory solutions for your specific use case.
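The risk-scoring-plus-routing idea can be sketched as weighted pattern hits aggregating into a score that maps onto the three tiers. The patterns, weights, tier names, and thresholds below are illustrative assumptions, not the template's actual configuration:

```javascript
// Minimal sketch of risk scoring with 3-tier routing.
// All patterns, weights, and thresholds are illustrative assumptions.
const PATTERNS = [
  { name: "ssn", re: /\b\d{3}-\d{2}-\d{4}\b/, weight: 0.9 },
  { name: "email", re: /[\w.+-]+@[\w-]+\.[\w.]+/, weight: 0.3 },
  { name: "medical-context", re: /\b(diagnosis|prescription)\b/i, weight: 0.4 },
];

function scoreRisk(text) {
  const hits = PATTERNS.filter((p) => p.re.test(text));
  // Several low-risk hits can aggregate to high risk; cap the score at 1.0
  const score = Math.min(1, hits.reduce((s, p) => s + p.weight, 0));
  const tier =
    score >= 0.8 ? "local-only"   // critical PII: never leaves the environment
    : score >= 0.3 ? "sanitize"   // redact before any cloud LLM call
    : "cloud";                    // clean input: full cloud capabilities
  return { score, tier, matched: hits.map((p) => p.name) };
}

const result = scoreRisk("My SSN is 123-45-6789, email me at a@b.com");
```

Note how the aggregation differs from a binary classifier: the email pattern alone would only trigger the sanitize tier, but combined with the SSN hit the request is routed to local-only processing, and the `matched` list gives the audit trail its reasoning.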
by Jonathan | NEX
Effortlessly integrate the NixGuard API into your n8n workflows for real-time security insights using your API key. This connector enables seamless interaction with Nix, providing rapid Retrieval-Augmented Generation (RAG) event knowledge with Wazuh integration, completely free and set up in under 5 minutes!

🚀 Features:
✅ Query NixGuard's AI-driven security insights via API authentication
✅ Real-time security event knowledge integration
✅ Plug-and-play workflow trigger for effortless automation
✅ Wazuh compatibility for full security visibility

🛠 How to Use:
1️⃣ Add your API key to authenticate with NixGuard.
2️⃣ Integrate with your existing n8n workflows using the workflow trigger (enabled by default).
3️⃣ (Optional) Activate the chat trigger to streamline security queries via chat-based inputs.
4️⃣ Run the workflow and get instant security intelligence!

📢 Perfect for: Startup CTOs, SOC teams, security engineers, and developers needing real-time security automation within their infrastructure.

🔗 Learn more about NixGuard: thenex.world
🔗 Get started with a free security subscription: thenex.world/security/subscribe
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically analyzes purchase trends and consumer behavior patterns to identify market opportunities and optimize business strategies. It saves you time by eliminating the need to manually analyze sales data and provides insights into buying patterns, seasonal trends, and customer preferences.

Overview

This workflow automatically scrapes e-commerce platforms, marketplace data, and sales analytics to extract purchase trends, product popularity, and consumer behavior insights. It uses Bright Data to access sales data and AI to intelligently analyze purchasing patterns, seasonal trends, and market opportunities.

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Bright Data: For scraping e-commerce and marketplace platforms without being blocked
- OpenAI: AI agent for intelligent purchase trend analysis and forecasting
- Google Sheets: For storing purchase trend data and analysis results

How to Install
1. Import the workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your trend analysis spreadsheet
5. Customize: Define target marketplaces and trend analysis parameters

Use Cases
- E-commerce strategy: Identify trending products and market opportunities
- Product development: Understand consumer preferences and demand patterns
- Marketing planning: Optimize campaigns based on seasonal purchase trends
- Business intelligence: Make data-driven decisions using market trend insights

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #purchasetrends #marketanalysis #brightdata #webscraping #ecommerce #n8nworkflow #workflow #nocode #trendanalysis #consumerinsights #marketresearch #salesanalytics #businessintelligence #markettrends #customerinsights #ecommerceanalysis #salesdata #marketforecasting #consumerdata #purchaseanalysis #retailanalytics #marketinsights #demandforecasting #salestrends #consumertrends #marketintelligence #buyingpatterns #marketdemand
by Oneclick AI Squad
This automated n8n workflow qualifies B2B leads via voice calls using the VAPI API and stores the collected data in Google Sheets. It triggers when a new lead's phone number is added, streamlining lead qualification and data capture.

What is VAPI?

VAPI is an API service for voice call automation, used here to qualify leads by capturing structured data through interactive calls.

Good to Know
- VAPI API calls may incur costs based on usage; check VAPI pricing for details.
- Ensure Google Sheets access is properly authorized to avoid data issues.
- Use credential fields for the HTTP Request node's Bearer token instead of hardcoding it.
- Use a placeholder Google Sheet document ID (e.g., "your-sheet-id-placeholder") to avoid leaking private data.

How It Works
1. Detect when a new phone number is added for a lead using the New Lead Captured node.
2. Use the Receive Lead Details from VAPI node to capture structured data (name, company, challenges) via a POST request.
3. Trigger an outbound VAPI call to qualify the lead with the Initiate Voice Call (VAPI) node.
4. Store the collected data in a Google Sheet using the Save Qualified Lead to CRM Sheet node.
5. Send a success response back to VAPI with the Send Call Data Acknowledgement node.

How to Use
1. Import the workflow into n8n.
2. Configure VAPI API credentials in the HTTP Request node using credential fields.
3. Set up Google Sheets API access and authorize the app.
4. Create a Google Sheet with the following columns: Name (text), Company (text), Challenges (text), Date (date).
5. Test with a sample lead phone number to verify call initiation and data storage.
6. Adjust the workflow as needed and retest.

Requirements
- VAPI API credentials
- Google Sheets API access

Customizing This Workflow

Modify the Receive Lead Details from VAPI node to capture additional lead fields, or adjust the call scripts for specific industries.
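The step that turns the VAPI POST payload into a row for the CRM sheet could look like the sketch below. The payload shape (`message.analysis.structuredData`) is an assumption for illustration; check your VAPI assistant's actual output schema before relying on it:

```javascript
// Hypothetical sketch of mapping a VAPI webhook payload to the sheet row.
// The payload path message.analysis.structuredData is an assumption.
function toLeadRow(payload) {
  const data =
    (payload.message &&
      payload.message.analysis &&
      payload.message.analysis.structuredData) ||
    {};
  return {
    Name: data.name || "",
    Company: data.company || "",
    Challenges: data.challenges || "",
    Date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD
  };
}

const row = toLeadRow({
  message: {
    analysis: {
      structuredData: { name: "Ada", company: "Acme", challenges: "Lead volume" },
    },
  },
});
```

Defaulting every missing field to an empty string keeps the Save Qualified Lead to CRM Sheet step from failing when a call ends before all questions are answered.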