by John Pranay Kumar Reddy
✨ Summary
Efficiently monitor Kubernetes environments by sending only unique error logs from Grafana Loki to Slack. Reduces alert fatigue while keeping your team informed about critical log events.

🧑‍💻 Who's it for
- DevOps or SRE engineers running EKS/GKE/AKS
- Anyone using Grafana Loki and Promtail for centralized logging
- Teams that want Slack alerts but hate alert spam

🔍 What it does
This n8n workflow queries your Loki logs every 5 minutes, filters only the critical ones (error, timeout, exception, etc.), removes duplicate alerts within the batch, and sends clean alerts to a Slack channel with full metadata (pod, namespace, node, container, log, timestamp).

🧠 How it works
- 🕒 Schedule Trigger: runs every 5 minutes (customizable)
- 🌐 Loki HTTP Query: pulls logs from the last 10 minutes, matching keywords such as error, failed, oom
- 🧹 Log Parsing: extracts log fields (pod, container, etc.) and skips empty/malformed results
- 🧠 Deduplication: removes repeated error messages within the query window (see the sketch below)
- 📤 Slack Notification: sends a nicely formatted message to Slack

⚙️ Requirements

| Tool | Notes |
| --- | --- |
| Loki | Exposed internally or externally |
| Slack App | With chat:write OAuth |
| n8n | Cloud or self-hosted |

🔧 How to Set It Up
1. Import the JSON file into n8n.
2. Update the Loki API URL (e.g., http://loki-gateway.monitoring.svc.cluster.local), the Slack Bearer Token (via credentials), and the target Slack channel (e.g., #k8s-alerts).
3. (Optional) Change the keywords in the query regex.
4. Activate the workflow.
5. Ensure the n8n pod/container has access to your Kubernetes cluster, pods, and namespaces.

🛠 How to Customize
- Want more or fewer keywords? Adjust the regex in the Query Loki for Error Logs node.
- Need stronger deduplication logic? Enhance the Remove Duplicate Alerts node.
- Want 5-log summaries every 5 min? Fork this and add a Batch + Slack group sender.
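For illustration, a minimal sketch of what the Remove Duplicate Alerts Code node can look like. The field names (namespace, pod, log) assume the upstream parsing node, and the keyword regex in the comment is just one example of what the Loki query filter might use:

```javascript
// Minimal sketch of the "Remove Duplicate Alerts" Code node.
// Assumes the parser emits items with namespace, pod, and log fields;
// the Loki query itself might use a filter such as:
//   {namespace=~".+"} |~ "(?i)(error|failed|timeout|exception|oom)"
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const key = `${item.json.namespace}/${item.json.pod}:${item.json.log}`;
  if (!seen.has(key)) {   // keep only the first occurrence in this batch
    seen.add(key);
    unique.push(item);
  }
}

return unique;
```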
by Rodrigue Gbadou
How it works
- **Regulatory monitoring**: Continuously tracks changes in laws, regulations, and compliance requirements across multiple jurisdictions
- **Contract analysis**: AI-powered review of existing contracts to identify compliance gaps and risks
- **Automated alerts**: Real-time notifications when regulatory changes affect your contracts or business operations
- **Compliance reporting**: Generates audit-ready reports and documentation for regulatory compliance

Set up steps
- **Legal databases**: Connect to legal research platforms (Westlaw, LexisNexis, EUR-Lex)
- **Contract repository**: Integrate with your contract management system or document storage
- **Regulatory feeds**: Configure government and regulatory body RSS feeds and APIs
- **AI legal analysis**: Set up OpenAI or specialized legal AI for contract analysis
- **Compliance calendar**: Integrate with calendar systems for deadline tracking
- **Audit trail**: Configure logging and documentation systems for compliance records

Key Features
- 🔍 **Multi-jurisdiction monitoring**: Tracks regulatory changes across different countries and regions
- 📊 **Risk assessment**: Automatically scores compliance risks and potential impact
- ⚡ **Real-time alerts**: Instant notifications when regulations affecting your business change
- 📋 **Gap analysis**: Identifies discrepancies between current contracts and new requirements
- 🤖 **AI-powered analysis**: Uses natural language processing to understand legal text
- 📈 **Compliance dashboard**: Visual overview of compliance status across all contracts
- 🔄 **Automated remediation**: Suggests contract amendments and compliance actions
- 📱 **Mobile notifications**: Critical compliance alerts on mobile devices

Compliance areas monitored
- **Data protection**: GDPR, CCPA, and other privacy regulations
- **Financial services**: Banking regulations, securities law, anti-money laundering
- **Healthcare**: HIPAA, medical device regulations, pharmaceutical compliance
- **Employment law**: Labor regulations, workplace safety, discrimination laws
- **Environmental**: ESG requirements, environmental protection regulations
- **Industry-specific**: Sector-specific regulations and standards

Contract types supported
- **Vendor agreements**: Supplier contracts and service agreements
- **Employment contracts**: Employee agreements and contractor terms
- **Data processing agreements**: Privacy and data handling contracts
- **Customer agreements**: Terms of service and customer contracts
- **Partnership agreements**: Joint ventures and strategic partnerships
- **Licensing agreements**: Software licenses and intellectual property

Automated responses (by risk score; see the sketch at the end of this entry)
- **Low risk (0-30)**: Routine monitoring and documentation
- **Medium risk (31-60)**: Enhanced review and stakeholder notification
- **High risk (61-80)**: Immediate legal review and action planning
- **Critical risk (81-100)**: Emergency legal intervention and compliance measures

Integration capabilities
- **Legal research**: Westlaw, LexisNexis, Bloomberg Law
- **Document management**: SharePoint, Google Drive, Dropbox
- **Contract systems**: DocuSign, PandaDoc, ContractWorks
- **Communication tools**: Slack, Teams, email for legal team alerts
- **Calendar systems**: Outlook, Google Calendar for compliance deadlines

This workflow ensures continuous legal compliance by monitoring regulatory changes and automatically assessing their impact on your contracts and business operations.
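As an illustrative sketch only, the four risk bands above could be mapped to an action tier in an n8n Code node like this; the field name `risk_score` is an assumption, not from the template:

```javascript
// Illustrative mapping of the risk bands (0-30 / 31-60 / 61-80 / 81-100)
// to action tiers. `risk_score` is an assumed field name.
function riskTier(score) {
  if (score <= 30) return 'routine_monitoring';
  if (score <= 60) return 'enhanced_review_and_notification';
  if (score <= 80) return 'immediate_legal_review';
  return 'emergency_intervention';
}

return $input.all().map((item) => ({
  json: { ...item.json, tier: riskTier(item.json.risk_score) },
}));
```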
by siyad
This n8n workflow automates the process of monitoring inventory levels for Shopify products, ensuring timely updates and efficient stock management. It alerts users when inventory levels are low or out of stock, integrating with Shopify's webhook system and providing notifications through Discord (can be changed to any messaging platform) with product images and details.

Workflow Overview
1. Webhook Node (Shopify Listener): Listens for Shopify's inventory level webhook and triggers the workflow whenever there is an update in the inventory levels. The webhook is configured in Shopify settings, where the n8n URL is specified to receive inventory level updates.
2. Function Node (Inventory Check): Processes the data received from the Shopify webhook. It extracts the available inventory and the inventory item ID, and determines whether the inventory is low (fewer than 4 items) or out of stock (see the sketch below).
3. Condition Nodes (Inventory Level Check): Two condition nodes follow the function node. One checks if the inventory is low (low_inventory equals true), and the other checks if the inventory is out of stock (out_of_stock equals true).
4. GraphQL Node (Product Details Retrieval): Connected to the condition nodes, this node fetches detailed information about the product using Shopify's GraphQL API. It retrieves the product variant, title, current inventory quantity, and the first product image.
5. HTTP Node (Discord Notification): The final node in the workflow sends a notification to Discord. It includes an embed with the product title, a warning message ("This product is running out of stock!"), the remaining inventory quantity, product variant details, and the product image. The notification ensures that relevant stakeholders are immediately informed about critical inventory levels.
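A minimal sketch of the Inventory Check logic described in step 2. The field names follow Shopify's inventory_levels/update webhook payload, and the threshold of 4 comes from the description above; adjust both to your setup:

```javascript
// Sketch of the Inventory Check Function node.
// Shopify's inventory_levels/update webhook carries `available` and
// `inventory_item_id`; the payload location may vary with your webhook node.
const payload = $json.body ?? $json;
const available = Number(payload.available);

return [{
  json: {
    inventory_item_id: payload.inventory_item_id,
    available,
    low_inventory: available > 0 && available < 4, // fewer than 4 items left
    out_of_stock: available <= 0,
  },
}];
```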
by Abdulrahman Alhalabi
NGO TPM Request Management System

Benefits

For Beneficiaries:
- **24/7 Accessibility** - Submit requests anytime via a familiar Telegram interface
- **Language Flexibility** - Communicate in Arabic through text or voice messages
- **Instant Acknowledgment** - Receive immediate confirmation that requests are logged
- **No Technical Barriers** - Works on basic smartphones without special apps

For TPM Teams:
- **Centralized Tracking** - All requests automatically logged with timestamps and user details
- **Smart Prioritization** - AI categorizes issues by urgency and type for efficient response
- **Action Guidance** - Specific recommended actions generated for each request type
- **Performance Analytics** - Track response patterns and common issues over time

For NGO Operations:
- **Cost Reduction** - Automated intake reduces manual processing overhead
- **Data Quality** - Standardized categorization ensures consistent reporting
- **Audit Trail** - Complete record of all beneficiary interactions for compliance
- **Scalability** - Handle high volumes without proportional staff increases

How it Works
1. Multi-Input Reception - Accepts both text messages and voice recordings via Telegram
2. Voice Transcription - Uses OpenAI Whisper to convert Arabic voice messages to text
3. AI Categorization - GPT-4 analyzes requests and categorizes issues (aid distribution, logistics, etc.)
4. Action Planning - AI generates specific recommended actions for the TPM team in Arabic
5. Data Logging - Records all requests, categories, and actions in Google Sheets with user details (see the sketch below)
6. Confirmation Feedback - Sends an acknowledgment message back to users via Telegram

Set up Steps
Setup Time: ~20 minutes
1. Create Telegram Bot - Get a bot token from @BotFather and configure the webhook
2. Configure APIs - Set up OpenAI (transcription + chat) and Google Sheets credentials
3. Customize AI Prompts - Adjust system messages for your NGO's specific operations
4. Set Up Spreadsheet - Link Google Sheets for request tracking and reporting
5. Test Workflows - Verify both text and voice message processing paths

Detailed Arabic language configuration and TPM-specific categorization examples are included as sticky notes within the workflow.

What You'll Need:
- Telegram Bot Token (free from @BotFather)
- OpenAI API key (Whisper + GPT-4)
- Google Sheets API credentials
- Google Spreadsheet for logging requests
- Sample Arabic text/voice messages for testing

Key Features:
- Dual input support (text + voice messages)
- Arabic language processing and responses
- Structured data extraction (category + recommended action)
- Complete audit trail with user information
- Real-time confirmation messaging
- TPM team-specific workflow optimization
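A hypothetical sketch of the data-logging step (step 5 above), shaping the AI output into a Google Sheets row. The node name, the assumption that the model returns a JSON string, and the field names `category`/`recommended_action` are illustrative, though the latter two match the structured extraction described above:

```javascript
// Hypothetical sketch: normalize the GPT-4 output before the Sheets append.
const ai = JSON.parse($json.output ?? '{}'); // assumes the model returns JSON
const msg = $('Telegram Trigger').first().json.message;

return [{
  json: {
    timestamp: new Date().toISOString(),
    user: msg.from.username ?? msg.from.id,
    request_text: $json.transcription ?? msg.text,
    category: ai.category,                      // e.g. aid distribution, logistics
    recommended_action: ai.recommended_action,  // in Arabic, for the TPM team
  },
}];
```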
by Omar Akoudad
The workflow is well-designed for CRM analysis with a robust quality-control mechanism. The dual-AI approach ensures reliable results, while the webhook integration makes it production-ready for real-time CRM data processing.

- Dual-AI Architecture: Uses DeepSeek Reasoner for analysis and DeepSeek Chat for verification.
- Flexible Input: Supports both manual testing and production webhook integration (see the example below).
- Quality Assurance: Built-in verification system to ensure report accuracy.
- Comprehensive Analysis: Covers lead conversion, upsell metrics, agent ranking, and more.
- Professional Output: Generates structured markdown reports with actionable insights.
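A hypothetical example of pushing CRM data to the production webhook. The URL path and payload fields are placeholders for illustration, not taken from the template:

```javascript
// Hypothetical webhook call; substitute your own n8n URL and payload shape.
await fetch('https://your-n8n-instance.com/webhook/crm-analysis', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    leads: [],   // lead records with status and source
    deals: [],   // deals for conversion and upsell metrics
    agents: [],  // agent activity for ranking
  }),
});
```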
by Daniel Lianes
Automated Daily AI Summaries from WhatsApp Groups using a Custom AI Agent

Transform your WhatsApp group conversations into actionable business intelligence through automated AI analysis and daily reporting. This workflow eliminates manual conversation monitoring by capturing messages in real-time, processing voice notes, and delivering structured insights directly to your team.

Overview
This workflow provides complete conversation intelligence automation from message capture to insight delivery. It eliminates manual monitoring, analysis, and reporting by using Evolution API integration, OpenAI transcription, and advanced LLM analysis for hands-free business intelligence that scales your team's awareness of important discussions.

Core Function: Autonomous conversation analysis that transforms WhatsApp group chatter into structured business insights with zero manual intervention, maintaining consistent daily reporting while capturing emerging opportunities and trends before your competition.

Key Capabilities
- Real-time message capture - Monitors multiple WhatsApp groups simultaneously with instant processing and smart filtering
- Voice message transcription - Automatic conversion of audio messages to searchable text using OpenAI Whisper
- AI-powered insight extraction - Advanced LLM analysis identifies trends, opportunities, and actionable information while filtering noise
- Automated daily reporting - Scheduled intelligence summaries delivered directly to your team via WhatsApp
- Multi-group organization - Separate tracking and analysis for different communities with unified reporting
- Smart content filtering - AI agent trained to focus on business-relevant discussions (AI, automation, tech trends, opportunities)

Tools Used
- n8n: Workflow orchestration managing the entire intelligence pipeline from capture to delivery
- Evolution API: WhatsApp Business API integration for real-time message monitoring and sending
- OpenAI Whisper: Voice message transcription ensuring no important audio content is missed
- OpenRouter/GPT-4.1: Advanced AI analysis for intelligent insight extraction and content filtering
- Google Sheets: Organized message storage with timestamps and metadata for historical analysis
- Custom AI Agent: "WhatsOn" - a specialized business intelligence detective for tech and automation insights

How to Install
1. Import the Workflow: Download the JSON file and import it into your n8n instance
2. Configure Evolution API: Set up WhatsApp integration and webhook endpoints for message capture
3. API Credentials Setup: Add OpenAI, OpenRouter, and Google Sheets credentials in n8n
4. Group Configuration: Update group IDs in the "Set Info" node with your monitored groups
5. Google Sheets Setup: Create an organized spreadsheet with separate tabs for each group
6. Schedule Configuration: Set your preferred daily summary delivery time
7. Test Execution: Run a manual test to verify message capture and AI analysis work correctly

Use Cases
- Business Intelligence Automation: Stay informed about industry discussions without manual monitoring
- Opportunity Detection: Identify emerging trends, tools, and business opportunities in real-time
- Team Knowledge Sharing: Automated distribution of relevant insights from multiple communities
- Competitive Intelligence: Monitor industry discussions to stay ahead of market developments
- Community Management: Track engagement patterns and important conversations across groups
- Voice Message Processing: Ensure audio-based insights aren't lost in team communications

Setup Requirements
- Evolution API account: WhatsApp Business integration with webhook capabilities
- OpenAI API: Voice transcription access through the Whisper API
- OpenRouter account: Access to GPT-4.1 for advanced conversation analysis
- Google Sheets: Message storage and organization with proper permissions configured
- WhatsApp Groups: Access to business or professional groups with relevant discussions

Total setup time: 15-20 minutes once all API accounts are properly configured.

How to Customize
- Analysis Focus: Modify the AI agent's system prompt to target your industry or specific topics. Adjust keyword priorities, conversation themes, or insight categories based on your business needs.
- Group Management: Add additional groups by extending the Switch node logic, creating new Google Sheets tabs, and updating group ID variables. Scale from 3 to unlimited group monitoring.
- Delivery Schedule: Change summary frequency from daily to weekly, multiple times per day, or custom schedules. Add multiple delivery destinations for different team segments.
- AI Intelligence: Customize the "WhatsOn" agent personality, adjust insight priorities, modify filtering criteria, or add sentiment analysis for deeper conversation understanding.
- Storage & Organization: Modify the Google Sheets structure, add custom metadata fields, integrate with other databases, or connect to business intelligence dashboards for advanced analytics.

Advanced Features
- Smart Voice Processing: Automatically transcribes voice messages to text using OpenAI's Whisper API, ensuring critical audio-based discussions are captured and analyzed alongside text conversations.
- Intelligent Content Filtering: The AI agent is specifically trained to identify valuable business insights while filtering out casual conversation, ensuring your daily summaries focus on actionable information that drives decisions.
- Multi-Fragment Delivery System: Large intelligence summaries are automatically broken into properly formatted WhatsApp messages with natural pacing to avoid delivery issues and improve readability (see the sketch below).
- Historical Analysis Capability: All conversations are stored with full metadata in Google Sheets, enabling historical trend analysis, keyword tracking, and long-term pattern recognition for strategic planning.

Ready to transform group conversations into competitive intelligence? This template converts casual WhatsApp discussions into structured business insights delivered automatically to your team, ensuring you never miss important industry developments or opportunities.

Google Sheets Template
The workflow includes a pre-configured structure for tracking:
- Message timestamps and sender information
- Full conversation content with voice transcriptions
- Group-specific organization and categorization
- Daily summary delivery logs and performance metrics
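As a rough sketch of the multi-fragment delivery idea, a Code node along these lines splits a long summary on paragraph boundaries before sending. The 3000-character limit and the `summary` field name are assumptions, not taken from the template:

```javascript
// Sketch: break a long summary into WhatsApp-sized fragments.
// Splits on blank lines; an oversized single paragraph stays whole.
const MAX_LEN = 3000;                 // assumed safe message size
const text = $json.summary ?? '';
const fragments = [];
let current = '';

for (const para of text.split('\n\n')) {
  if (current && (current.length + para.length + 2) > MAX_LEN) {
    fragments.push(current);
    current = para;
  } else {
    current = current ? `${current}\n\n${para}` : para;
  }
}
if (current) fragments.push(current);

return fragments.map((body, i) => ({
  json: { body, fragment: i + 1, total: fragments.length },
}));
```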
Was this helpful? Let me know! I truly hope this WhatsApp intelligence system helps streamline your team's awareness of important conversations. Your feedback helps me create better automation resources for the n8n community.

Ready to Build Something Great?
If you're looking to take your n8n skills or business automation to the next level, I can help.

🎓 n8n Coaching: Want to become an n8n pro? I offer one-on-one coaching sessions to help you master workflows, tackle specific problems, and build with confidence. ➡️ Book a Coaching Session

💼 n8n Consulting: Have a complex project, an integration challenge, or need a custom workflow built for your business? Let's work together to create a powerful automation solution. ➡️ Inquire About Consulting Services

Stay Updated on Automation
For more content automation strategies, AI workflow tips, and business automation insights: Follow me on LinkedIn

Happy Automating!
Daniel Lianes
by kreonovo
What this does
This automation creates tables in your Airtable base and sends Webflow form submissions as records in those tables. It keeps track of your Webflow forms with a Form Index table, ensuring that form content does not get sent to the wrong table if you have Webflow forms with identical names (see the sketch below).

This is useful for enhancing your Webflow form submission management: by sending structured data to your Airtable base, you can do more with your Webflow forms.

Usage guide
Full written and video guide

This automation is designed to be as low-effort as possible to start enhancing your Webflow forms. Simply create credentials for Webflow and a Personal Access Token for your Airtable base, and connect them to the relevant nodes.

The guide demonstrates a real-world example using a Webflow template and breaks down the flow and logic for all the nodes.
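A minimal sketch of the Form Index idea: route each submission by the form's unique Webflow ID rather than its display name, so two forms both called "Contact" never land in the same Airtable table. The field and node names are assumptions for illustration:

```javascript
// Sketch: look up the Airtable table registered for this form's unique ID.
const formId = $json.formId;   // unique id from the Webflow submission
const index = $('Lookup Form Index').all().map((i) => i.json);

const entry = index.find((row) => row.webflow_form_id === formId);

return [{
  json: {
    formId,
    tableId: entry?.airtable_table_id ?? null,
    needsNewTable: !entry,     // a downstream branch creates the table
  },
}];
```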
by Yaron Been
This workflow automatically analyzes sales territory performance, comparing revenue, win rates, and activity across regions. Remove the guesswork from territory planning and drive balanced growth.

Overview
On a weekly schedule, the workflow pulls CRM data for each territory, merges it with demographic and market-size info scraped via Bright Data, and feeds everything into OpenAI for performance benchmarking. Outliers—both high and low performers—are highlighted in a Google Data Studio dashboard and summarized in a Slack message.

Tools Used
- **n8n** – Orchestrates data collection and analysis
- **CRM API** – Source of sales metrics by territory
- **Bright Data** – Scrapes external market indicators (population, GDP, etc.)
- **OpenAI** – Normalizes and benchmarks territories
- **Google Sheets / Data Studio** – Stores and visualizes results
- **Slack** – Sends the weekly summary

How to Install
1. Import the workflow into n8n.
2. Connect your CRM API credentials.
3. Configure Bright Data credentials.
4. Set up your OpenAI API key.
5. Authorize Google services and Slack.
6. Customize territory definitions in the Set node (see the sketch at the end of this entry).

Use Cases
- **Sales Leadership**: Rebalance territories based on potential.
- **Revenue Operations**: Identify underserved regions.
- **Financial Planning**: Allocate resources where ROI is highest.
- **Incentive Design**: Reward reps fairly based on potential.

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #territorymanagement #salesanalytics #brightdata #openai #n8nworkflow #nocode #revenueops
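For step 6 above, a hypothetical shape for the territory definitions in the Set node; region names, CRM field values, and markets are placeholders to adapt to your own org structure:

```javascript
// Hypothetical territory definitions for the Set node (adjust to your org).
const territories = [
  { name: 'North-East', crmRegion: 'NE',   markets: ['NY', 'MA', 'CT'] },
  { name: 'West',       crmRegion: 'W',    markets: ['CA', 'WA', 'OR'] },
  { name: 'EMEA',       crmRegion: 'EMEA', markets: ['UK', 'DE', 'FR'] },
];

return territories.map((t) => ({ json: t }));
```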
by Kumar Shivam
Shopify AI Product Description Factory

The Shopify AI Product Description Factory is a production-grade n8n workflow that converts product images and metadata into refined, SEO-aware descriptions—fully automated and region-agnostic. It blends GPT-4o vision for visible attribute extraction, Claude 3.5 Sonnet for premium copy, Perplexity research for verified brand context, and Google Sheets for orchestration and audit trails, plus automated daily sales analytics enrichment. Link-header pagination and structured output enforcement ensure reliable scale. To refine it for your use case, connect via my profile @connect.

Key Advantages
- **Vision-first copywriting**: Uses gpt-4o to identify only visible physical attributes (closure, heel, materials, sole) from product images—no guesses.
- **Premium copy generation**: anthropic/claude-3.5-sonnet crafts concise, benefit-led descriptions with consistent tone, length control, and clean formatting.
- **Research-assisted accuracy**: perplexityTool verifies vendor/brand context from official sources to avoid speculation or fabricated claims.
- **Pagination you can trust**: Automates Shopify REST pagination via Link headers and persists page_info for resumable runs.
- **Google Sheets orchestration**: Centralized staging, status tracking, and QA in Products, with ProcessingState for batch/page markers and Error_log for diagnostics.
- **Bulletproof error feedback**: errorTrigger + AI diagnosis logs clear non-technical and technical explanations to Error_log for fast recovery.
- **Automated sales analytics**: Daily sales tracking automatically captures and enriches total sales data for comprehensive business intelligence and performance monitoring.

How It Works

Intake and filtering
- httpRequest fetches /admin/api/2024-04/products.json?limit=200&{page_info}
- code filters only items with: an image present, empty body_html, and the currSeas:SS2025 tag
- Extracts tag metadata such as x-styleCode, country_of_origin, and gender when available

Pagination controller
- code parses Link headers for rel="next" and extracts page_info (see the sketch below)
- googleSheets updates ProcessingState with page_info_next and increments the batch number for resumable polling

Generation pipeline
- googleSheets pulls rows with Status = Ready for AI Description; limit throttles batch size
- openAi Analyze image (model gpt-4o) returns strictly visible features
- lmChatOpenRouter (Claude 3.5) composes the SEO description, optionally blending verified vendor context from perplexityTool
- outputParserStructured guarantees strict JSON: product_id, product_title (normalized), generated_description, status
- googleSheets writes results back to Products for review/publish

Sales analytics enrichment
- **Schedule Trigger** runs daily at 2:01 PM to capture the previous day's sales
- httpRequest fetches paid orders from the Shopify REST API with date-range filtering
- splitOut and summarize nodes calculate total daily sales
- Automatic Google Sheets logging with date stamps and totals
- Zero-sale days are properly recorded for complete analytics continuity

Reliability and insight
- errorTrigger routes failures to an AI agent that explains the root cause and appends a concise note to Error_log.
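A minimal sketch of the pagination controller's Link-header parsing, assuming the HTTP Request node is configured to include response headers in its output:

```javascript
// Sketch: parse Shopify's Link header for rel="next" and extract page_info.
// A "next" link looks like:
// <https://shop.myshopify.com/admin/api/2024-04/products.json?limit=200&page_info=abc123>; rel="next"
const linkHeader = $json.headers?.link ?? '';
const match = linkHeader.match(/<([^>]+)>;\s*rel="next"/);

let pageInfoNext = null;
if (match) {
  pageInfoNext = new URL(match[1]).searchParams.get('page_info');
}

// page_info_next is persisted to ProcessingState so the next run can resume.
return [{ json: { page_info_next: pageInfoNext, has_next: Boolean(pageInfoNext) } }];
```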
What's Inside (Node Map)

Data + API
- httpRequest (Shopify REST 2024-04 for products and orders)
- googleSheets (multiple sheet operations)
- googleSheetsTool (error logging)

AI models
- openAi (gpt-4o vision analysis)
- lmChatOpenRouter (anthropic/claude-3.5-sonnet for content generation)
- **AI Agent** (intelligent error diagnosis)

Analytics & Processing
- splitOut (order data processing)
- summarize (sales totals calculation)
- set nodes (data field mapping)

Tools and guards
- perplexityTool (brand research)
- outputParserStructured (JSON validation; see the example below)
- memoryBufferWindow (conversation context)

Control & Scheduling
- scheduleTrigger (multiple time-based triggers)
- cron (periodic execution)
- limit (batch size control)
- if (conditional logic)
- code (custom filtering and pagination logic)

Observability
- errorTrigger + AI diagnosis to Error_log
- Processing state tracking
- Sales analytics logging

Content & Compliance Rules
- **Locale-agnostic copy**; brand voice is configurable per store
- **Only image-verifiable attributes** (no guesses); clean HTML suitable for Shopify themes
- Optional normalization rules (e.g., color/branding cleanup, title sanitization)
- Style code inclusion supported when x-styleCode is present
- Gender-aware content generation when the gender tag is present
- **Strict JSON output** and schema consistency for safe downstream publishing

Setup Steps

Core integrations
- **Shopify Access Token** — Products read + Orders read (REST 2024-04)
- **OpenAI API** — gpt-4o vision
- **OpenRouter API** — Claude Sonnet (3.5)
- **Perplexity API** — vendor/market verification via perplexityTool
- **Google Sheets OAuth** — Products, ProcessingState, Error_log, Sales analytics

Configure sheets
- **ProcessingState** with fields: batch number, page_info_next
- **Products** with: Product ID, Product Title, Product Type, Vendor, Image url, Status, country of origin, x_style_code, gender, Generated Description
- **Error_log** with: timestamp, Reason of Error
- **Sales Analytics Sheet** with: Date, Total Sales

Workflow Capabilities
- **Discovery and staging**: Auto-paginate Shopify; stage eligible products in Sheets with reasons and timestamps.
- **Vision-grounded copywriting**: Descriptions reflect only visible attributes plus verified brand context; concise, mobile-friendly structure with gender-aware tone.
- **Metadata awareness**: Auto-injects x-styleCode, country_of_origin, and gender when present; natural SEO for brand and product type.
- **Sales intelligence**: Automated daily sales tracking with Melbourne timezone support; handles zero-sale days and maintains complete historical records.
- **Error analytics**: Layman + technical diagnosis logged to Error_log to shorten MTTR.
- **Safe output**: Structured JSON via outputParserStructured for predictable row updates.
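For reference, the strict JSON enforced by outputParserStructured looks like the following; the four field names come from the workflow description, while the values are purely illustrative:

```json
{
  "product_id": "8123456789012",
  "product_title": "Leather Chelsea Boot",
  "generated_description": "<p>Clean, theme-ready HTML copy.</p>",
  "status": "Generated"
}
```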
Credentials Required
- **Shopify Access Token** (Products + Orders read permissions)
- **OpenAI API Key** (GPT-4o vision)
- **OpenRouter API Key** (Claude Sonnet)
- **Perplexity API Key**
- **Google Sheets OAuth**

Ideal For
- **E-commerce teams** scaling compliant, on-brand product copy with comprehensive sales insights
- **Agencies and SEO specialists** standardizing image-grounded descriptions with performance tracking and analytics
- **Stores** needing resumable pagination, auditable content operations, and automated daily sales reporting in Sheets

Advanced Features
- **Dual-workflow architecture**: Content generation + sales analytics in one system
- Link-header pagination with page_info persistence in ProcessingState
- Title/content normalization (e.g., color removal) configurable per brand
- **Gender-aware copywriting** based on product tags
- Memory windows (memoryBufferWindow) to keep multi-step prompts consistent
- **Melbourne timezone support** for accurate daily sales cutoffs
- **Zero-sales handling** ensures complete analytics continuity
- Structured output enforcement for downstream safety
- **AI-powered error diagnosis** with technical and layman explanations

Time & Scheduling (Universal)
The workflow includes two independent schedules:
- **Content Generation**: Every 5 minutes (configurable) for product processing
- **Sales Analytics**: Daily at 2:01 PM Melbourne time for the previous day's sales

For globally distributed teams, schedule triggers and timestamps can be standardized on UTC to avoid regional drift.

Pro Tip
Start with small batches (limit set to 10 or fewer) to validate both copy generation and sales tracking flows. The workflow handles the two operations independently: content generation failures won't affect sales analytics and vice versa. Monitor the Error_log sheet for any issues and use the ProcessingState sheet to track pagination progress.
by Arnaud MARIE
Replicate Line Items on New Deal in HubSpot Workflow

Use Case
This workflow solves the problem of manually copying line items from one deal to another in HubSpot, reducing manual work and minimizing errors.

What this workflow does
- **Triggers** upon receiving a webhook with deal IDs.
- **Retrieves** the IDs of the won and created deals.
- **Fetches** line items associated with the won deal.
- **Extracts** product SKUs from the retrieved line items.
- **Fetches** product details based on SKUs.
- **Creates** new line items for the created deal and associates them.
- **Sends** a Slack notification with success details.

Set up steps
1. Create a HubSpot deal workflow:
   1.1 Set up your trigger (e.g., when deal stage = Won).
   1.2 Add a step: Create Record (deal).
   1.3 Add a step: Send webhook. The webhook should be a GET request to your n8n workflow's trigger. Set two query parameters (see the example below): deal_id_won as the Record ID of the deal triggering the HubSpot workflow, and deal_id_create as the Record ID of the deal created above (click Insert Data -> The created object).
2. Set up your HubSpot App token in HubSpot -> Settings -> Integration -> Private Apps.
3. Set up your HubSpot token integration using the predefined model.
4. Set up your Slack connection.
5. Add an error workflow to monitor errors.
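For step 1.3, the webhook call from HubSpot ends up as a GET along these lines; the n8n host and webhook path are placeholders for your own instance:

```
https://your-n8n-instance.com/webhook/replicate-line-items?deal_id_won={{won deal Record ID}}&deal_id_create={{created deal Record ID}}
```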
by Sk developer
Automation Flow: Image to Image Using GPT Sora

This flow automates the process of generating images using a provided prompt and reference image via the Sora GPT Image API from RapidAPI. The generated images are stored in Google Drive, and details are logged in Google Sheets.

Nodes Overview

1. On Form Submission
- **Type**: n8n-nodes-base.formTrigger
- **Description**: Triggers when a user submits the form containing the prompt and image URL. It ensures the form fields are filled in and ready for processing.
- Form fields:
  - Prompt: a text description of the desired image.
  - Image URL: the URL of the reference image to be used.
  - Webhook ID: unique identifier for the form submission.

2. HTTP Request to Sora GPT Image API
- **Type**: n8n-nodes-base.httpRequest
- **Description**: Sends the prompt and image URL to the **Sora GPT Image API** to generate a new image based on the provided inputs.
- API endpoint: Sora GPT Image API (via RapidAPI)
- Method: POST
- Body parameters:
  - Prompt: user-provided text.
  - Image URL: the reference image URL.
  - Width & Height: image size is set to 1024x1024.

3. Code (Base64 Conversion)
- **Type**: n8n-nodes-base.code
- **Description**: Processes the base64-encoded image data returned from the API, decoding and formatting the image for upload to Google Drive (see the sketch below).
- Output: converts the base64 string into a binary JPEG file.

4. Upload Image to Google Drive
- **Type**: n8n-nodes-base.googleDrive
- **Description**: Uploads the generated image to **Google Drive**, storing it in a designated folder.
- Authentication: Google Service Account.
- File name: set dynamically from the previous node.

5. Log Details to Google Sheets
- **Type**: n8n-nodes-base.googleSheets
- **Description**: Logs the **Prompt**, **Generated Image**, and **Generation Date** into a Google Sheets document for tracking and auditing purposes.
- Columns mapped:
  - Prompt: the user's input text.
  - Image: the name of the generated image file.
  - Generated Date: date and time of image generation.

Flow Summary
1. **User submits form**: Triggered when the form with the prompt and image URL is submitted.
2. **Image generation**: The data is sent to the Sora GPT Image API from RapidAPI to generate the image.
3. **Image processing**: The generated image (base64 format) is decoded and saved as a file.
4. **Google Drive upload**: The image is uploaded to Google Drive for storage.
5. **Google Sheets logging**: All relevant details (Prompt, Image, Date) are saved in Google Sheets.

Benefits
- **Automated image creation**: Quickly generate images using AI based on a simple prompt and reference image via **RapidAPI**.
- **Efficient workflow**: The entire process, from form submission to image generation and storage, is automated, saving time and reducing manual work.
- **Centralized storage**: Generated images are stored in **Google Drive**, ensuring easy access and organization.
- **Audit trail**: The details of each generated image are logged in **Google Sheets**, making it easy to track, review, and manage past creations.
- **Scalable and reusable**: Can be adapted to multiple use cases, such as creative design, marketing materials, or social media content generation.

Problems Solved
- **Manual image editing**: Eliminates the need for manual image manipulation and creation, allowing automatic generation based on user inputs.
- **Disorganized file storage**: With automatic uploads to **Google Drive**, images are stored in a centralized and organized manner.
- **Lack of record-keeping**: By logging image generation details in **Google Sheets**, there's always a record of past creations, improving tracking and management.
- **Time-consuming processes**: The automation drastically reduces the time spent on manual tasks, allowing users to focus on other aspects of their work or creative processes.

This flow simplifies the process of creating AI-generated images based on user inputs, leveraging the power of the Sora GPT Image API via RapidAPI, making it a powerful tool for creative, design, and marketing purposes.
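A minimal sketch of the Base64 Conversion Code node (node 3 above), wrapping the API's base64 image string as n8n binary data so the Google Drive node can upload it. The response field name `result` is an assumption; check your API response:

```javascript
// Sketch: turn a base64 image string into an n8n binary item.
const base64 = $json.result;                       // assumed response field
const fileName = `sora-image-${Date.now()}.jpg`;

return [{
  json: { fileName },
  binary: {
    data: {
      data: base64,          // n8n stores binary payloads as base64
      mimeType: 'image/jpeg',
      fileName,
    },
  },
}];
```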
by David Olusola
How It Works – Data Deduplication in n8n

This tutorial demonstrates how to remove duplicate records from a dataset using JavaScript logic inside n8n's Code nodes. It simulates real-world data cleaning by generating sample user data with intentional duplicates (based on email addresses) and walks you through the deduplication process step by step.

The process includes:
1. Creating sample data with duplicates.
2. Filtering out duplicates using filter() and findIndex() based on email.
3. Displaying cleaned results with simple statistics for before-and-after comparison.

This is ideal for scenarios like CRM imports, ETL processes, and general data hygiene.

⚙️ Set-Up Steps

🔹 Step 1: Manual Trigger
- Node: When clicking 'Test workflow'
- Purpose: Initiates the workflow manually for testing.

🔹 Step 2: Generate Sample Data
- Node: Create Sample Data (Code node)
- What it does:
  - Creates 6 users, including 2 intentional duplicates (by email).
  - Outputs data as usersJson with metadata (totalCount, message).
  - Mimics real-world messy datasets.

🔹 Step 3: Deduplicate the Data
- Node: Deduplicate Users (Code node)
- What it does:
  - Parses usersJson.
  - Uses .filter() + .findIndex() to keep only the first instance of each email (see the sketch below).
  - Logs total, unique, and removed counts.
  - Outputs the clean user list as separate items.

🔹 Step 4: Display Results
- Node: Display Results (Code node)
- What it does:
  - Outputs a structured summary: unique users, status, timestamp.
  - Prepares results for review or downstream use.

📈 Sample Output
- Original count: 6 users
- Deduplicated count: 4 users
- Duplicates removed: 2 users

🎯 Learning Objectives
You'll learn how to:
- Use .filter() and .findIndex() in n8n Code nodes
- Clean JSON data within workflows
- Create simple, effective deduplication pipelines
- Output structured summaries for reporting or integration

🧠 Best Practices
- Validate the input format (e.g., against a JSON schema)
- Handle null or missing fields gracefully
- Use logging for visibility
- Add error handling for production use
- Use pagination/chunking for large datasets
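A condensed version of the Deduplicate Users node described in Step 3, using exactly the filter() + findIndex() pattern from the tutorial:

```javascript
// Keep only the first occurrence of each email.
const users = JSON.parse($json.usersJson);

const unique = users.filter(
  (user, index) => users.findIndex((u) => u.email === user.email) === index,
);

console.log(
  `Total: ${users.length}, unique: ${unique.length}, removed: ${users.length - unique.length}`,
);

// Return the clean user list as separate items.
return unique.map((user) => ({ json: user }));
```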