by Encoresky
This workflow automates the handling of conversation transcriptions and distributes key information across your organization.

**What it does**
- **Trigger:** The workflow is initiated via a webhook that receives a transcription (e.g., from a call or meeting).
- **Summarization & Extraction:** Using AI, the transcription is summarized and key information is extracted, such as action items, departments involved, and client details.
- **Department Notifications:** The relevant summarized information is automatically routed to specific departments via email based on content classification.
- **CRM Sync:** The summarized version is saved to the associated contact or deal in HubSpot for future reference and visibility.
- **Multi-Channel Alerts:** The summary is also sent via WhatsApp and Slack to keep internal teams instantly informed, regardless of platform.

**Use case:** Ideal for sales, customer service, or operations teams who manage client conversations and want seamless cross-departmental communication, documentation, and follow-up.

**Apps used:** Webhook (trigger), OpenAI (or another AI/NLP service for summarization), HubSpot, Email, Slack, WhatsApp (via Twilio or a third-party provider).
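The content-classification step above can be sketched as a small n8n Code-node snippet. The department keywords and email addresses below are illustrative assumptions, not part of the template:

```javascript
// Minimal sketch of routing a summary to department inboxes by keyword match.
// Keyword lists and addresses are hypothetical placeholders.
const ROUTES = [
  { dept: "sales",   email: "sales@example.com",   keywords: ["pricing", "quote", "deal"] },
  { dept: "support", email: "support@example.com", keywords: ["bug", "issue", "error"] },
  { dept: "finance", email: "finance@example.com", keywords: ["invoice", "billing", "payment"] },
];

function routeSummary(summary) {
  const text = summary.toLowerCase();
  // Pick every department whose keywords appear in the summary;
  // fall back to a catch-all inbox when nothing matches.
  const matches = ROUTES.filter(r => r.keywords.some(k => text.includes(k)));
  return matches.length ? matches.map(m => m.email) : ["office@example.com"];
}
```

In the real template this decision could instead be made by the AI extraction step; a deterministic keyword pass like this is a cheap first filter.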
by Don Jayamaha Jr
Get deep insights into NFT market trends, sales data, and collection statistics—all powered by AI and OpenSea! This workflow connects GPT-4o-mini, the OpenSea API, and n8n automation to provide real-time analytics on NFT collections, wallet transactions, and market trends. It is ideal for NFT traders, collectors, and investors looking to make informed decisions based on structured data.

**How it works**
- Receives user queries via Telegram, webhooks, or another connected interface.
- Determines the correct API tool based on the request (e.g., collection stats, wallet transactions, event tracking).
- Retrieves data from the OpenSea API (requires an API key).
- Processes the information using an AI-powered analytics agent.
- Returns structured insights in an easy-to-read format for quick decision-making.

**What you can do with this agent**
🔹 Retrieve NFT collection stats → get floor price, volume, sales data, and market cap.
🔹 Track wallet activity → analyze transactions for a given wallet address.
🔹 Monitor NFT market trends → track historical sales, listings, bids, and transfers.
🔹 Compare collection performance → view side-by-side market data for different NFT projects.
🔹 Analyze NFT transaction history → check real-time ownership changes for any NFT.
🔹 Identify market shifts → detect sudden spikes in demand, price changes, and whale movements.

**Example queries you can use**
✅ "Get stats for the Bored Ape Yacht Club collection."
✅ "Show me all NFT sales from the last 24 hours."
✅ "Fetch all NFT transfers for wallet 0x123...abc on Ethereum."
✅ "Compare the last 3 months of sales volume for Azuki and CloneX."
✅ "Track the top 10 wallets making the most NFT purchases this week."
**Available API tools & endpoints**
1️⃣ Get Collection Stats → `/api/v2/collections/{collection_slug}/stats` (retrieve collection-wide market data)
2️⃣ Get Events → `/api/v2/events` (fetch global NFT sales, transfers, listings, bids, redemptions)
3️⃣ Get Events by Account → `/api/v2/events/accounts/{address}` (track transactions by wallet)
4️⃣ Get Events by Collection → `/api/v2/events/collection/{collection_slug}` (get sales activity for a collection)
5️⃣ Get Events by NFT → `/api/v2/events/chain/{chain}/contract/{address}/nfts/{identifier}` (retrieve historical transactions for a specific NFT)

**Set-up steps**
1. **Get an OpenSea API key:** sign up at OpenSea API and request an API key.
2. **Configure API credentials in n8n:** add your OpenSea API key under HTTP Header Authentication.
3. **Connect the workflow to Telegram, Slack, or a database (optional):** use n8n integrations to send alerts to Telegram or Slack, or save results to Google Sheets, Notion, etc.
4. **Deploy and test:** send a query (e.g., "Azuki latest sales") and receive instant NFT market insights!

Stay ahead in the NFT market—get real-time analytics with OpenSea's AI-powered analytics agent!
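The "Get Collection Stats" call above can be sketched as follows. The base URL and `x-api-key` header follow OpenSea's v2 API conventions, but verify rate limits and response fields against their current documentation:

```javascript
// Sketch of the request the n8n HTTP Request node sends for collection stats.
const BASE_URL = "https://api.opensea.io";

function buildStatsRequest(collectionSlug, apiKey) {
  return {
    method: "GET",
    url: `${BASE_URL}/api/v2/collections/${encodeURIComponent(collectionSlug)}/stats`,
    headers: { accept: "application/json", "x-api-key": apiKey },
  };
}

// In n8n this is configured via HTTP Header Authentication; outside n8n
// the same request object could be passed to fetch():
//   const res = await fetch(req.url, { headers: req.headers });
```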
by Rajneesh Gupta
Malicious File Detection & Threat Summary Automation using Wazuh + VirusTotal + n8n

This workflow helps SOC teams automate the detection and reporting of potentially malicious files using Wazuh alerts, VirusTotal hash validation, and integrated summary/report generation. It's ideal for analysts who want instant context and communication for file-based threats — without writing a single line of code.

**What it does when Wazuh detects a suspicious file**
- **Ingests Wazuh alert:** a webhook node captures incoming alerts containing file hashes (SHA256/MD5).
- **Parses IOCs:** extracts relevant indicators (file hash, filename, etc.).
- **Validates with VirusTotal:** automatically checks the file hash reputation using VirusTotal's threat intelligence API.
- **Generates a human-readable summary:** outputs a structured file report.
- **Routes alerts based on threat level:** sends a formatted email with the file summary using Gmail. If the file is deemed malicious/suspicious, it creates a file-related incident ticket and sends an instant Slack alert to notify the team.

**Tech stack used**
- **Wazuh:** endpoint alerting
- **VirusTotal API:** real-time hash validation
- **n8n:** orchestration, parsing, enrichment, and communication
- **Slack, Gmail, incident tool:** notification and action

**Ideal use case:** security teams looking to automate file threat triage, IOC validation, and alert-to-ticket escalation with zero human delay.

**Included nodes:** Webhook (Wazuh), Function (IOC extraction and summary), HTTP Request (VirusTotal), If/Switch (threat-level check), Gmail, Slack, incident creation.

**Tips**
- Add your VirusTotal API key in the HTTP node.
- Customize the incident creation node to fit your ticketing platform (Jira, ServiceNow, etc.).
- Add logic to enrich the file alert further using WHOIS or sandbox reports if needed.
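The IOC-parsing Function node can be sketched like this. The alert field paths (`data.file`, `agent.name`) are assumptions about a typical Wazuh alert payload; adapt them to the rules you actually run:

```javascript
// Sketch of extracting IOCs (file hash, filename, agent) from a Wazuh alert.
function extractIocs(alert) {
  const data = alert?.data || {};
  // Match a SHA256 (64 hex chars) or MD5 (32 hex chars) anywhere in the payload.
  const hashRegex = /\b([a-f0-9]{64}|[a-f0-9]{32})\b/i;
  const raw = JSON.stringify(data);
  const match = raw.match(hashRegex);
  return {
    fileName: data.file || data.path || "unknown",
    fileHash: match ? match[1].toLowerCase() : null,
    agent: alert?.agent?.name || "unknown",
  };
}
```

The extracted `fileHash` is then what the HTTP Request node submits to VirusTotal for reputation lookup.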
by Michael Gullo
A Customizable n8n Automation That Turns Your Inbox Into a Daily Digest

The goal of this workflow is to offer a highly customizable foundation that users can tailor to their specific platform and setup. While the current version uses Gmail, it can easily be adapted to other providers by replacing the email node with alternatives such as an IMAP Email Trigger, Microsoft Outlook, or any compatible email node. The workflow can also be extended to platforms like Telegram, WhatsApp, or any service that supports bots and n8n integration. The core objective is to generate scheduled email summaries, whether of the most recent email, emails from a specific sender, or all emails received within a day.

I built this workflow as a flexible building block for anyone looking to develop a more advanced email agent. It's designed to reduce the mental load of reviewing emails each day by automatically delivering a summarized version of your inbox. Currently, the summary is saved to Google Docs, chosen for its simplicity and accessibility, but users can easily modify this to integrate with other document management systems or destinations. I plan to continue updating and expanding this workflow to better serve users' needs. If you have suggestions, ideas, or feedback, I'd love to hear them; your input helps make this tool even more useful.

**Workflow components**
- **Schedule node:** triggers the workflow daily at a specified time.
- **Gmail: Get Messages node:** retrieves the latest email; can be configured for any number of emails.
- **Limit node:** ensures only the configured number of emails is processed at a time.
- **If node:** checks whether any emails were retrieved.
- **Code node:** cleans and formats the email content.
- **Code node:** provides a fallback message if no emails are found.
- **OpenAI Summary node:** summarizes the email using ChatGPT.
- **Create Google Doc node:** creates a new Google Document for the summary.
- **Update Google Doc node:** inserts the summarized content into the document.
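The cleaning Code node above can be sketched as a small text-normalization pass. This is an illustrative version, not the template's exact code:

```javascript
// Sketch of cleaning a raw email body before it is sent for summarization:
// strip HTML, collapse whitespace, and truncate very long messages.
function cleanEmailBody(html, maxChars = 4000) {
  const text = html
    .replace(/<style[\s\S]*?<\/style>/gi, "") // drop inline stylesheets
    .replace(/<[^>]+>/g, " ")                 // strip remaining tags
    .replace(/&nbsp;/g, " ")
    .replace(/\s+/g, " ")
    .trim();
  return text.length > maxChars ? text.slice(0, maxChars) + "…" : text;
}
```

Truncation keeps token costs predictable when a newsletter-style email is thousands of characters long.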
Expanding the Workflow
This workflow is fully modular and easy to extend. To send summaries via Telegram, Slack, or any other channel, simply add the respective node after the summary is generated and connect your bot or webhook credentials. To use Outlook instead of Gmail, just swap the email input node for the Microsoft Outlook node or an IMAP Email Trigger, depending on your preferred setup. Need Help? Have Questions? For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.
by Udit Rawat
This workflow automates and centralizes your bookmarking process using AI-powered tagging and seamless integration between your Android device and a self-hosted Readeck platform (https://readeck.org/en/). It eliminates manual entry, organizes links with smart AI-generated tags, and ensures your bookmarks are always accessible, searchable, and secure.

**How it works**
- 📱 **Android shortcut integration:** use the HTTP Shortcuts app to create a one-tap trigger that sends URLs and titles from your Android phone directly to n8n.
- 🤖 **AI-powered tagging & processing:** leverage ChatGPT-4 to analyze content context and auto-generate relevant tags (e.g., "Tech Tutorials", "Productivity Tools"), and extract clean titles and URLs from messy shared data (even from apps like Twitter or Reddit).
- 🔗 **Readeck integration:** automatically save processed bookmarks to your self-hosted Readeck instance with structured metadata (title, URL, tags).
- ⚡ **Silent automation:** runs in the background with no pop-ups or interruptions.
- 🔒 **Security:** optional authentication (API tokens, headers) to protect your data.

**Use case:** perfect for researchers, content creators, or anyone drowning in tabs who wants to save articles, videos, or social posts in one click, organize bookmarks with AI-generated tags, and build a personal knowledge base that's always accessible.

**Tutorial**
1️⃣ **Set up the Android shortcut:** install HTTP Shortcuts and configure it to send data to your n8n webhook. Enable "Share Menu" to trigger bookmarks from any app.
2️⃣ **Configure the n8n workflow:** import the template and add your Readeck API token (or that of a similar service).
3️⃣ **Test & scale:** share a link from your phone and watch it appear in Readeck instantly! Add error handling or notifications for advanced use.

Note: for self-hosted platforms, ensure your instance is publicly accessible (or use a VPN).

**Why choose this workflow?**
- Zero manual entry: save hours of copying and pasting.
- AI organization: say goodbye to chaotic bookmark folders.
- Privacy first: host your data on your terms.

Transform your bookmarking chaos into a streamlined system—try "Save: Bookmark" today! 🚀
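The shared-data cleanup and Readeck save can be sketched together. The `/api/bookmarks` path and request fields below are assumptions about the Readeck API; check your instance's API documentation before relying on them:

```javascript
// Sketch: turn a messy Android share payload into a bookmark request.
function buildBookmarkRequest(baseUrl, token, shared, tags) {
  // Shared text from apps often mixes a title and URL; pull the first URL out.
  const urlMatch = (shared.text || "").match(/https?:\/\/\S+/);
  return {
    method: "POST",
    url: `${baseUrl}/api/bookmarks`, // assumed endpoint; verify against Readeck docs
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: {
      url: urlMatch ? urlMatch[0] : shared.url,
      title: shared.title || "",
      labels: tags,
    },
  };
}
```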
by Muhammad Farooq Iqbal
Google NanoBanana Model Image Editor for Consistent AI Influencer Creation with Kie.ai — Image Generation & Enhancement Workflow

This n8n template demonstrates how to use Kie.ai's image generation models to create and enhance images with AI, with automated story creation, image upscaling, and organized file management through Google Drive and Sheets. Use cases include AI-powered content creation for social media, automated story visualization with consistent characters, marketing material generation, and high-quality image enhancement workflows.

**Good to know**
- The workflow uses Kie.ai's google/nano-banana-edit model for image generation and nano-banana-upscale for 4x image enhancement.
- Images are automatically organized in Google Drive with timestamped folders.
- Progress is tracked in Google Sheets with status updates throughout the process.
- The workflow includes face enhancement during upscaling for better portrait results.
- All generated content is automatically saved and organized for easy access.

**How it works**
1. **Project setup:** creates a timestamped folder structure in Google Drive and initializes a Google Sheet for tracking.
2. **Story generation:** uses OpenAI GPT-4 to create detailed prompts for image generation based on predefined templates.
3. **Image creation:** sends the AI-generated prompt along with 5 reference images to Kie.ai's nano-banana-edit model.
4. **Status monitoring:** polls the Kie.ai API to monitor task completion, with automatic retry logic.
5. **Image enhancement:** upscales the generated image 4x using nano-banana-upscale with face enhancement.
6. **File management:** downloads, uploads, and organizes all generated content in the appropriate Google Drive folders.
7. **Progress tracking:** updates Google Sheets with status information and image URLs throughout the entire process.

**Key features**
- **Automated story creation:** AI-powered prompt generation for consistent, cinematic image creation.
- **Multi-stage processing:** image generation followed by intelligent upscaling.
- **Smart organization:** automatic folder creation with timestamps and file management.
- **Progress tracking:** real-time status updates in Google Sheets.
- **Error handling:** built-in retry logic and failure-state management.
- **Face enhancement:** specialized enhancement for portrait images during upscaling.

**How to use**
- **Manual trigger:** the workflow starts with a manual trigger (easily replaceable with webhooks, forms, or scheduled triggers).
- **Automatic processing:** once triggered, the entire pipeline runs automatically.
- **Monitor progress:** check the Google Sheet for real-time status updates.
- **Access results:** find your generated and enhanced images in the organized Google Drive folders.

**Requirements**
- **Kie.ai account:** for AI image generation and upscaling services.
- **OpenAI API:** for intelligent prompt generation (GPT-4 mini).
- **Google Drive:** for file storage and organization.
- **Google Sheets:** for progress tracking and status monitoring.

**Customizing this workflow**
This workflow is highly adaptable for various use cases:
- **Content creation:** modify prompts for different styles (fashion, product photography, architectural visualization).
- **Batch processing:** add loops to process multiple prompts or reference images.
- **Social media:** integrate with social platforms for automatic posting.
- **E-commerce:** adapt for product visualization and marketing materials.
- **Storytelling:** create sequential images for visual narratives or storyboards.

The modular design makes it easy to add additional processing steps, change AI models, or integrate with other services as needed.

**Workflow components**
- **Folder management:** dynamic folder creation with timestamp naming.
- **AI integration:** OpenAI for prompts, Kie.ai for image processing.
- **File processing:** binary handling, URL management, and format conversion.
- **Status tracking:** multi-stage progress monitoring with Google Sheets.
- **Error handling:** comprehensive retry and failure-management systems.
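The status-monitoring step (poll until the Kie.ai task completes, with bounded retries) can be sketched generically. The `getStatus` function and the `state`/`resultUrl` response shape are assumptions standing in for the actual Kie.ai status endpoint:

```javascript
// Sketch of polling a generation task until it succeeds, fails, or
// exhausts its retry budget.
async function pollUntilDone(getStatus, taskId, { maxAttempts = 10, delayMs = 5000 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const { state, resultUrl } = await getStatus(taskId);
    if (state === "success") return resultUrl;
    if (state === "failed") throw new Error(`Task ${taskId} failed`);
    await new Promise(r => setTimeout(r, delayMs)); // wait before the next check
  }
  throw new Error(`Task ${taskId} still pending after ${maxAttempts} attempts`);
}
```

In n8n the same loop is usually built with a Wait node plus an IF node feeding back on itself; this shows the equivalent logic in one place.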
by Anthony
How It Works

This workflow transforms any webpage into an AI-narrated audio summary delivered via WhatsApp:
1. **Receive URL:** a WhatsApp Trigger captures incoming messages and passes them to URL extraction.
2. **Extract & validate:** a Code node extracts URLs using a regex and validates the format; an IF node checks for errors.
3. **User feedback:** sends either an error message ("Please send valid URL") or a processing status ("Fetching and analyzing... 10-30 seconds").
4. **Fetch webpage:** a sub-workflow calls Jina AI Reader (https://r.jina.ai/) to fetch JavaScript-rendered content, bypassing bot blocks.
5. **Summarize content:** GPT-4o-mini processes the webpage text in 6000-character chunks, extracts the title, and generates a concise summary.
6. **Generate audio:** OpenAI TTS-1 converts the summary text to natural-sounding audio (Opus format for WhatsApp compatibility).
7. **Deliver result:** a WhatsApp node sends the audio message back to the user with the summary.

**Why Jina AI?** Many modern websites (like digibyte.io) require JavaScript to load content. Standard HTTP requests only fetch the initial HTML skeleton ("JavaScript must be enabled"). Jina AI executes JavaScript and returns clean, readable text.

**Setup Steps** (time estimate: ~20-25 minutes)

1. **WhatsApp Business API setup (10-15 minutes)**
- Create a Meta Developer App: go to https://developers.facebook.com/, create a Business app, and add the WhatsApp product.
- Get a Phone Number ID: use Meta's test number or register your own business phone.
- Generate a System User token: create it at https://business.facebook.com/latest/settings/system_users (permanent token, no 4-hour expiry).
- Configure the webhook: point it to your n8n instance URL and subscribe to "messages" events.
- Verify your business: Meta requires 3-5 verification steps (business, app, phone, system user).

2. **Configure n8n credentials (5 minutes)**
- OpenAI: add your API key in Credentials → OpenAI (used in two places: "Convert Summary to Audio" and "OpenAI Chat Model" in the sub-workflow).
- WhatsApp OAuth: add it in the WhatsApp Trigger node using the System User token from step 1.
- WhatsApp API: add it in all WhatsApp action nodes (Send Error, Send Processing, Send Audio) using the same token.

3. **Link the sub-workflow (3 minutes)**
- Ensure the "[SUB] Get Webpage Summary" workflow is activated.
- In the "Get Webpage Summary" node, select the sub-workflow from the dropdown.
- Verify the workflow ID matches: QglZjvjdZ16BisPN.

4. **Update Phone Number IDs (2 minutes)**
- Copy your Phone Number ID from the Meta console.
- Update it in all WhatsApp nodes: Send Error Message, Send Processing Message, Send Audio Summary.

5. **Test the flow (2 minutes)**
- Activate both workflows (sub-workflow first, then main).
- Send a test message to WhatsApp: https://example.com
- Verify: the processing message arrives, then the audio summary is delivered within 30 seconds.

**Important Notes**

WhatsApp caveats:
- 24-hour window: you can't auto-message users after 24 hours unless they message first (send "Hi" each morning to reset).
- Verification fatigue: Meta requires multiple business verifications; budget 30-60 minutes if it's your first time.
- Test vs. production: test numbers work for single users; production requires business verification.

Audio format:
- Uses the Opus codec (optimal for WhatsApp, smaller file size than MP3).
- Speed is set to 1.0 (normal pace); adjust it in the "Convert Summary to Audio" node if needed.
- Cost: ~$0.015 per minute of audio generated.

Jina AI integration:
- The free tier works for basic use (no API key required).
- Handles JavaScript-heavy sites automatically.
- Add an `Authorization: Bearer YOUR_KEY` header for higher limits.
- Alternative: replace with Playwright/Puppeteer for self-hosted rendering.

Cost breakdown (per summary):
- GPT-4o-mini summarization: ~$0.005-0.015
- OpenAI TTS audio: ~$0.005-0.015
- WhatsApp messages: free (up to 1,000/month)
- Total: ~$0.01-0.03 per summary

Troubleshooting:
- "Cannot read properties of undefined" → status update, not a message (the Code node returns null correctly).
- "JavaScript must be enabled" → the website needs Jina AI (already implemented in the "Fetch site texts" node).
- Audio not sending → check that the binary data field is named `data` in the TTS node.
- No webhook received → verify your n8n URL is publicly accessible and the webhook subscription includes "messages".

Pro tips:
- Change the voice in the TTS node: alloy (neutral), echo (male), nova (female), shimmer (soft).
- Adjust summary length: modify `chunkSize: 6000` in the sub-workflow's Text Splitter node (lower = faster but less detailed).
- Add rate limiting: insert a Code node after the trigger to track user requests per hour.
- Store summaries: add a database node after "Clean up" to archive summaries for later retrieval.

Use cases:
- Executive commuting: consume industry news hands-free.
- Research students: cover 3x more sources while multitasking.
- Visually impaired: access any webpage via natural audio.
- Sales teams: research prospects on the go.
- Content creators: monitor competitors while exercising.
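The "Extract & validate" Code node described in step 2 can be sketched like this. The exact field names are illustrative, not the template's literal code:

```javascript
// Sketch: pull the first URL out of an incoming WhatsApp message and flag
// invalid input so the IF node can branch to the error message.
function extractUrl(messageText) {
  const match = (messageText || "").match(/https?:\/\/[^\s]+/i);
  if (!match) return { error: true, url: null };
  try {
    // new URL() throws on malformed input, acting as a format validator.
    const url = new URL(match[0]);
    return { error: false, url: url.href };
  } catch {
    return { error: true, url: null };
  }
}
```

Returning an explicit `error` flag (rather than throwing) keeps the n8n IF node's branching simple.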
by Jitesh Dugar
Automatically qualify inbound demo requests, scrape prospect websites, and send AI-personalized outreach emails—all on autopilot.

**What this workflow does**
This end-to-end lead automation workflow helps SaaS companies qualify and nurture inbound leads with zero manual work until human approval.

**Key features**
✅ **Smart email filtering:** automatically flags personal emails (Gmail, Yahoo, etc.) and routes them to a polite regret message
✅ **Website intelligence:** scrapes prospect websites and extracts business context
✅ **AI analysis:** uses OpenAI to score ICP fit, identify pain points, and find personalization opportunities
✅ **Personalized outreach:** AI drafts custom emails referencing specific details from their website
✅ **Human-in-the-loop:** approval gate before sending to ensure quality control
✅ **Professional branding:** even rejected leads get a thoughtful response

**Perfect for**
- B2B SaaS companies with inbound lead forms
- Sales teams drowning in demo requests
- Businesses wanting to personalize at scale
- Anyone needing intelligent lead qualification

**What you'll need**
- Jotform account (or any form tool with webhooks); create your form for free on Jotform using this link
- OpenAI API key
- Gmail account (or any email service)
- n8n instance (cloud or self-hosted)

**Workflow sections**
📧 **Lead intake & qualification:** capture form submissions and filter personal emails
🕷️ **Website scraping:** extract company information from their domain
❌ **Regret flow:** send a polite rejection to unqualified leads
🤖 **AI analysis:** analyze prospects and draft personalized emails
📨 **Approved outreach:** human review + send welcome email

**Customization tips**
- Update the AI prompt with your company's ICP and value proposition
- Modify the personal-email provider list based on your market
- Adjust the regret email template to match your brand voice
- Add Slack notifications for high-value leads
- Connect your CRM to log all activities

Time saved: ~15-20 minutes per lead. Lead response: under 5 minutes (vs. hours/days manually).
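The "Smart Email Filtering" step can be sketched as a domain check. The provider list below is a starting assumption; the template suggests tuning it for your market:

```javascript
// Sketch: flag free/personal mailbox domains so they route to the regret branch.
const PERSONAL_DOMAINS = new Set([
  "gmail.com", "yahoo.com", "outlook.com", "hotmail.com", "icloud.com", "aol.com",
]);

function isPersonalEmail(email) {
  const domain = String(email).toLowerCase().split("@")[1] || "";
  return PERSONAL_DOMAINS.has(domain);
}
```

Leads that pass this filter keep their domain, which the website-scraping section then uses to fetch company context.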
by DevCode Journey
**Who is this for?**
This workflow is designed for business founders, CMOs, marketing teams, and landing page designers who want to automatically analyze their landing pages and get personalized, unconventional, high-impact conversion rate optimization (CRO) recommendations. It works by scraping the landing page content, then leveraging multiple AI models to roast the page and generate creative CRO ideas tailored specifically to that page.

**What this workflow does / key features**
- Captures a landing page URL through a user-friendly form trigger.
- Scrapes the landing page HTML content using an HTTP Request node.
- Sends the scraped content to a LangChain AI Agent, which orchestrates various AI models (OpenAI, Google Gemini, Mistral, etc.) for deep analysis.
- The AI Agent produces a friendly, fun, and unconventional "roast" of the landing page, explaining what's wrong in a human tone.
- Generates 10 detailed, personalized, easy-to-implement, and 2024-relevant CRO recommendations with a "wow" factor.
- Delivers the analysis and recommendations via Telegram message, Gmail email, and WhatsApp (via Rapiwa).
- Utilizes multiple AI tools and search APIs to enhance the quality and creativity of the output.

**Requirements**
- OpenAI API credentials configured in n8n.
- Google Gemini (PaLM) API credentials for LangChain integration.
- Mistral Cloud API credentials for text extraction.
- Telegram bot credentials for sending messages.
- Gmail OAuth2 credentials for email delivery.
- Rapiwa API credentials for WhatsApp notifications.
- A running n8n instance with these nodes: Form Trigger, HTTP Request, LangChain AI Agent, Telegram, Gmail, and the custom Rapiwa node.

**How to use — step-by-step setup**
1. **Credentials**
- Add your OpenAI API key under n8n credentials (OpenAi account 2).
- Add your Google Gemini API key (Google Gemini (PaLM) Api account).
- Add your Mistral Cloud API key (Mistral Cloud account).
- Set up Telegram bot credentials (Telegram account).
- Set up Gmail OAuth2 credentials (Gmail account).
- Add your Rapiwa API key for WhatsApp messages (Rapiwa).
2. **Configure the Form Trigger:** customize the form title, description, and landing page URL input placeholder if desired.
3. **Customize delivery nodes:** modify the Telegram, Gmail, and Rapiwa nodes with your desired recipient info and messaging preferences.
4. **Run the workflow:** open the form URL webhook and submit the landing page URL to get a detailed AI-powered CRO roast and recommendations sent directly to your communication channels.

**Important notes**
- The AI Agent prompt is designed to create a fun and unconventional roast that engages users emotionally; it avoids generic advice.
- All CRO recommendations are personalized and contextual, based on the scraped content of the provided landing page.
- Keep all API credentials secure and avoid hard-coding them; use n8n's credentials management.
- Adjust the delivery nodes to match your preferred communication channels and recipients.
- The workflow supports expansion with additional AI models or messaging platforms as needed.

🙋 **For help & community**
👾 Discord: n8n channel
🌐 Website: devcodejourney.com
🔗 LinkedIn: Connect with Shakil
📱 WhatsApp Channel: Join Now
💬 Direct Chat: Message Now
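Before the scraped HTML reaches the AI Agent, it helps to pull out a few high-signal fields. This is an illustrative pre-processing sketch, not part of the template itself:

```javascript
// Sketch: extract the <title>, meta description, and main headline from
// scraped landing-page HTML so the AI agent gets the page's key signals.
function extractPageSignals(html) {
  const pick = (re) => (html.match(re) || [, ""])[1].trim();
  return {
    title: pick(/<title[^>]*>([\s\S]*?)<\/title>/i),
    description: pick(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i),
    h1: pick(/<h1[^>]*>([\s\S]*?)<\/h1>/i).replace(/<[^>]+>/g, ""),
  };
}
```

Feeding these fields alongside the raw text lets the roast reference the page's actual headline and positioning rather than generic advice.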
by George Zargaryan
Multichannel AI Assistant Demo for Chatwoot

This simple n8n template demonstrates a Chatwoot integration that can:
- Receive new messages via a webhook.
- Retrieve conversation history.
- Process the message history into a format suitable for an LLM.
- Demonstrate an AI Assistant processing a user's query.
- Send the AI Assistant's response back to Chatwoot.

**Use case:** if you have multiple communication channels with your clients (e.g., Telegram, Instagram, WhatsApp, Facebook) integrated with Chatwoot, you can use this template as a starting point to build more sophisticated and tailored AI solutions that cover all channels at once.

**How it works**
1. A webhook receives the message_created event from Chatwoot.
2. The webhook data is filtered to keep only the necessary information for a cleaner workflow.
3. The workflow checks whether the message is "incoming". This is crucial to prevent the assistant from replying to its own messages and creating endless loops.
4. The conversation history is retrieved from Chatwoot via an API call using the HTTP Request node. This makes the assistant's interaction more natural and continuous without needing to store conversation history locally.
5. A simple AI Assistant processes the conversation history and generates a response to the user based on its built-in knowledge base (see the prompt in the assistant node).
6. The final HTTP Request node sends the AI-generated response back to the appropriate Chatwoot conversation.

**How to use**
1. In Chatwoot, go to Settings → Integrations → Webhooks and add your n8n webhook URL. Be sure to select the message_created event.
2. In the HTTP Request nodes, replace the placeholder values: https://yourchatwooturl.com and api_access_token. You can find these values on your Chatwoot super admin page.
3. The LLM node is configured to use OpenRouter. Add your OpenRouter credentials, or replace the node with your preferred LLM provider.

**Requirements**
- An API key for OpenRouter or credentials for your preferred LLM provider.
- A Chatwoot account with at least one integrated channel and super admin access to obtain the api_access_token.

**Need help building something more?**
Contact me on:
- Telegram: @ninesfork
- LinkedIn: George Zargaryan

Happy Hacking! 🚀
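The "process the message history into a format suitable for an LLM" step can be sketched as a mapping from Chatwoot's message list to chat-completion roles. In Chatwoot payloads `message_type` 0 is incoming (user) and 1 is outgoing (agent), but verify the field names against your Chatwoot version's API:

```javascript
// Sketch: map Chatwoot conversation history to chat-completion messages.
function toChatMessages(chatwootMessages) {
  return chatwootMessages
    .filter(m => m.content) // skip activity/empty entries
    .map(m => ({
      role: m.message_type === 0 ? "user" : "assistant",
      content: m.content,
    }));
}
```

The same `message_type` field is what the workflow's "incoming" check uses to avoid the assistant replying to its own messages.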
by WeblineIndia
IPA Size Tracker with Trend Alerts – Automated iOS App Size Monitoring

This workflow runs on a daily schedule and monitors IPA file sizes from configured URLs. It stores historical size data in Google Sheets, compares current vs. previous builds, and sends email alerts only when significant size changes occur (default: ±10%). A DRY_RUN toggle allows safe testing before real notifications go out.

**Who's it for**
- iOS developers tracking app binary size growth over time.
- DevOps teams monitoring build artifacts and deployment sizes.
- Product managers ensuring app size budgets remain acceptable.
- QA teams detecting unexpected size changes in release builds.
- Mobile app teams optimizing user experience by keeping apps lightweight.

**How it works**
1. A Schedule Trigger (daily at 09:00 UTC) kicks off the workflow.
2. Configuration: define monitored apps with {name, version, build, ipa_url}.
3. HTTP Request downloads the IPA file from its URL.
4. Size calculation: compute file sizes in bytes, KB, and MB, and attach timestamp metadata.
5. Google Sheets: append size data to the IPA Size History sheet.
6. Trend analysis: compare current vs. previous build sizes.
7. Alert logic: evaluate thresholds (>10% increase or >10% decrease).
8. Email notification: send formatted alerts with comparisons and trend indicators.
9. Rate limit: space out notifications to avoid spamming recipients.

**How to set up**
1. **Spreadsheet:** create a Google Sheet with a tab named IPA Size History containing: Date, Timestamp, App_Name, Version, Build_Number, Size_Bytes, Size_KB, Size_MB, IPA_URL.
2. **Credentials:**
- Google Sheets (OAuth) for reading/writing size history.
- Gmail for sending alert emails (use an App Password if 2FA is enabled).
3. **Open the "Set: Configuration" node** and define your workflow variables:
- APP_CONFIGS = array of monitored apps ({name, version, build, ipa_url})
- SPREADSHEET_ID = Google Sheet ID
- SHEET_NAME = IPA Size History
- SMTP_FROM = sender email (e.g., devops@company.com)
- ALERT_RECIPIENTS = comma-separated emails
- SIZE_INCREASE_THRESHOLD = 0.10 (10%)
- SIZE_DECREASE_THRESHOLD = 0.10 (10%)
- LARGE_APP_WARNING = 300 (MB)
- SCHEDULE_TIME = 09:00
- TIMEZONE = UTC
- DRY_RUN = false (set true to test without sending emails)
4. **File hosting:** host IPA files on Google Drive, Dropbox, or a web server. Ensure direct download URLs are used (not preview links).
5. **Activate the workflow:** once configured, it will run automatically at the scheduled time.

**Requirements**
- Google Sheet with the IPA Size History tab.
- Accessible IPA file URLs.
- SMTP/Gmail account (Gmail recommended).
- n8n (cloud or self-hosted) with Google Sheets + Email nodes.
- Sufficient local storage for IPA file downloads.

**How to customize the workflow**
- Multiple apps: add more configs to APP_CONFIGS.
- Thresholds: adjust SIZE_INCREASE_THRESHOLD / SIZE_DECREASE_THRESHOLD.
- Notification templates: customize subject/body with variables: {{app_name}}, {{current_size}}, {{previous_size}}, {{change_percent}}, {{trend_status}}.
- Schedule: change the Cron from daily to hourly, weekly, etc.
- Large app warnings: adjust LARGE_APP_WARNING.
- Trend analysis: extend beyond one build (7-day, 30-day averages).
- Storage backend: swap Google Sheets for CSV, a database, or S3.

**Add-ons to level up**
- Slack notifications: add Slack webhook alerts with emojis and formatting.
- Size history charts: generate trend graphs with Chart.js or the Google Charts API.
- Environment separation: monitor dev/staging/prod builds separately.
- Regression detection: statistical anomaly checks.
- Build metadata: log bundle ID, SDK versions, architectures.
- Archive management: auto-clean old records to save space.
- Dashboards: connect to Grafana, DataDog, or custom BI.
- CI/CD triggers: integrate with pipelines via a webhook trigger.

**Common troubleshooting**
- No size data → check that URLs return a binary IPA (not an HTML error page).
- Download failures → confirm hosting permissions and direct links.
- Missing alerts → ensure thresholds are set and prior history exists.
- Google Sheets errors → check sheet/tab names and OAuth credentials.
- Email issues → validate SMTP credentials, spam folder, sender reputation.
- Large file timeouts → raise the HTTP timeout for files over 100 MB.
- Trend errors → make sure at least 2 builds exist.
- No runs → confirm the workflow is active and the timezone is correct.

**Need help?**
If you'd like to customize this workflow to suit your app development process, simply reach out to us here and we'll help you tailor the template to your exact use case.
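The trend-analysis and alert-logic steps (6 and 7 above) boil down to a percent-change comparison against the two thresholds. A minimal sketch, mirroring the SIZE_INCREASE_THRESHOLD / SIZE_DECREASE_THRESHOLD variables:

```javascript
// Sketch: compare the current build's size to the previous one and decide
// whether an alert should fire (>10% change in either direction by default).
function analyzeTrend(currentBytes, previousBytes, { increase = 0.10, decrease = 0.10 } = {}) {
  const change = (currentBytes - previousBytes) / previousBytes;
  return {
    changePercent: +(change * 100).toFixed(2),
    shouldAlert: change > increase || change < -decrease,
    trend: change > 0 ? "increase" : change < 0 ? "decrease" : "stable",
  };
}
```

This is also why the troubleshooting section notes that at least two builds must exist: with no previous row there is nothing to divide by.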
by Shayan Ali Bakhsh
**About this workflow**
This workflow helps you repurpose your YouTube videos across multiple social media platforms with zero manual effort. It's designed for creators, businesses, and marketers who want to maximize reach without spending hours re-uploading content everywhere.

**How it works**
1. **Trigger from YouTube:** the workflow checks your YouTube channel every 10 minutes via its RSS feed and compares the latest video ID with the last saved one to detect whether a new video was uploaded. (Tutorial: How to get a YouTube Channel RSS Feed)
2. **Generate descriptions with AI:** uses Gemini 2.5 Flash to automatically generate fresh, engaging descriptions for your posts.
3. **Create images with ContentDrips:** ContentDrips offers multiple templates (carousel, single image, branding templates, etc.). The workflow generates a custom promotional image using your video description and thumbnail. Install the node with `npm install n8n-nodes-contentdrips` (docs: ContentDrips blog tutorial).
4. **Publish across social platforms with SocialBu:** instead of manually connecting each social media API, this workflow uses SocialBu. From a single connection, you can post to Facebook, Instagram, TikTok, YouTube, Twitter (X), LinkedIn, Threads, Pinterest, and more. (Website: SocialBu)
5. **Get real-time notifications via Discord:** after each run, the workflow sends updates to your Discord channel, so you'll know whether the upload succeeded or an error occurred (e.g., API limits). (Setup guide: Discord OAuth credentials)

**Why use this workflow?**
- Saves time by automating the entire repurposing process.
- Ensures consistent branding and visuals across platforms.
- Works around platform restrictions by leveraging SocialBu's integrations.
- Keeps you updated with Discord notifications; no guessing if something failed.

**Requirements**
- YouTube channel RSS feed link
- ContentDrips API key, template ID, and branding setup
- SocialBu account with connected social media platforms
- Discord credentials (for webhook updates)

**Need help?**
Message me on LinkedIn: Shayan Ali Bakhsh

Happy Automation 🚀
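The new-video check in step 1 can be sketched like this. YouTube channel feeds live at `https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID` and list each video's ID in a `<yt:videoId>` element; a production version would use a real XML parser rather than a regex:

```javascript
// Sketch: read the newest <yt:videoId> from the channel RSS XML and compare
// it against the last saved ID to decide whether a new video was uploaded.
function detectNewVideo(feedXml, lastSavedId) {
  const match = feedXml.match(/<yt:videoId>([^<]+)<\/yt:videoId>/);
  const latestId = match ? match[1] : null;
  return { latestId, isNew: Boolean(latestId) && latestId !== lastSavedId };
}
```

When `isNew` is true, the workflow proceeds to description generation; otherwise the 10-minute poll simply exits.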