by MRJ
:car: Business Value Proposition

Accelerates ISO 26262 compliance for automotive/industrial systems by automating safety analysis while maintaining rigorous audit standards.

:gear: How It Works

```mermaid
graph TD
    A[Engineer uploads system description] --> B(LLM identifies hazards)
    B --> C(LLM scores risks per ISO 26262)
    C --> D(Generates mitigation strategies)
    D --> E(Produces audit-ready reports)
```

:chart_with_upwards_trend: Key Benefits

Time
- 50-70% faster than manual HAZOP/FMEA sessions
- Instant report generation vs. weeks of documentation

Risk Mitigation
- Pre-validated templates reduce human error
- Auto-generated traceability simplifies audits

:warning: Governance Controls
- Human-in-the-loop: all LLM outputs require engineer sign-off
- Version tracking: full history of modifications
- Audit mode: export all decision rationales

:computer: Technical Requirements
- Runs on existing n8n instances
- Docker deployment (<1 hr setup)
- Integrates with JAMA/DOORS (optional)

:wrench: Setup and Usage

Prerequisites
- Docker (Install Guide)
- Docker Compose (Install Guide)
- n8n instance (Free Self-Hosted or Cloud - Paid)
- OpenAI API key (Get Key)

Enterprise-ready deployment: when supported by IT infrastructure teams, this solution becomes a scalable AI safety assistant, providing real-time HARA guidance akin to engineering co-pilot tools.

:arrow_down: Installation and :play_or_pause_button: Running the Workflow

For installation and usage instructions, refer to the repository.

:warning: Validation & Limitations

AI-Assisted Analysis Considerations

| Advantage | Mitigation Strategy | Implementation Example |
|-----------|---------------------|------------------------|
| Rapid hazard identification | Human validation layer | Manual review nodes in workflow |
| Consistent S/E/C scoring | Rule-based validation | ASIL-D → redundancy check |
| Edge case coverage | Cross-reference with historical data | Integration with incident databases |

Critical Validation Steps

AI Output Review node in n8n. Example (Function node code):

```json
{
  "type": "function",
  "parameters": {
    "functionCode": "if ($input.item.json.ASIL === 'D' && !$input.item.json.redundancy) throw new Error('ASIL D requires redundancy');"
  }
}
```

Version Control
- Prompt versions tied to ISO standard editions (e.g., ISO26262:2018-v1.2)
- Git-tracked changes to ai_models/training_data/

Audit Trails

Log structure for audit trails:

```
/logs/
└── YYYY-MM-DD/
    ├── hazards_approved.log
    └── hazards_rejected.log
```
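As a concrete illustration of the audit-trail idea, the sketch below shows how an n8n Code node could append each reviewed hazard to the dated log structure above. This is a minimal sketch, not the template's actual node: the item fields (`hazard`, `approved`, `rationale`) are assumed names, and writing to disk from a Code node requires a self-hosted n8n with `NODE_FUNCTION_ALLOW_BUILTIN=fs` enabled.

```javascript
// n8n Code node sketch (self-hosted; requires NODE_FUNCTION_ALLOW_BUILTIN=fs)
const fs = require('fs');
const path = require('path');

const day = new Date().toISOString().slice(0, 10); // YYYY-MM-DD
const dir = path.join('/logs', day);
fs.mkdirSync(dir, { recursive: true });

for (const item of $input.all()) {
  // Assumed fields: adapt to whatever your review node actually outputs.
  const { hazard, approved, rationale } = item.json;
  const file = approved ? 'hazards_approved.log' : 'hazards_rejected.log';
  const line = JSON.stringify({ ts: new Date().toISOString(), hazard, rationale }) + '\n';
  fs.appendFileSync(path.join(dir, file), line);
}

return $input.all(); // pass items through unchanged
```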
by Ranjan Dailata
Who this is for

The TrustPilot SaaS Product Review Tracker is designed for product managers, SaaS growth teams, customer experience analysts, and marketing teams who need to extract, summarize, and analyze customer feedback at scale from TrustPilot.

This workflow is tailored for:
- **Product Managers** - monitoring feedback to drive feature improvements
- **Customer Support & CX Teams** - identifying sentiment trends or recurring issues
- **Marketing & Growth Teams** - leveraging testimonials and market perception
- **Data Analysts** - tracking competitor reviews and benchmarking
- **Founders & Executives** - wanting aggregated insights into customer satisfaction

What problem is this workflow solving?

Manually monitoring, extracting, and summarizing TrustPilot reviews is time-consuming, fragmented, and hard to scale across multiple SaaS products. This workflow automates the whole process, from unlocking the data behind anti-bot layers to summarizing and storing customer insights, so teams can respond faster, spot trends, and make data-backed product decisions.

This workflow solves:
- The challenge of scraping protected review data (using Bright Data Web Unlocker)
- The need for structured insights from unstructured review content
- The lack of automated delivery to storage and alerting systems like Google Sheets or webhooks

What this workflow does

- Extract TrustPilot Reviews: uses Bright Data Web Unlocker to bypass anti-bot protections and pull markdown-based content from product review pages (a sketch of this call appears after this section)
- Convert Markdown to Text: leverages a basic LLM chain to clean and convert scraped markdown into plain text
- Structured Information Extraction: uses OpenAI GPT-4o via the Information Extractor node to extract fields like product name, review date, rating, and reviewer sentiment
- Summarization Chain: generates concise summaries of overall review sentiment and themes using OpenAI
- Merge & Aggregate Output: consolidates individual extracted records into a structured batch output
- Outbound Data Delivery:
  - Google Sheets - appends summary and structured review data
  - Write to Disk - persists raw and processed content locally
  - Webhook Notification - sends a real-time alert with summarized insights

Pre-conditions

- You need a Bright Data account, set up as described in the "Setup" section below.
- You need an OpenAI account.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth credential (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. In n8n, configure the Google Sheets credentials with your own account. Follow this documentation - Set Google Sheet Credential.
5. In n8n, configure the OpenAI account credentials.
6. Ensure the URL and Bright Data zone name are correctly set in the Set URL, Filename and Bright Data Zone node.
7. Set the desired local path in the Write a file to disk node to save the responses.
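For orientation, this is roughly what the Web Unlocker request behind the first node looks like as a standalone call. It is a hedged sketch based on Bright Data's public Web Unlocker API shape at the time of writing, not the template's exact node configuration; the zone name and target URL are placeholders, and the `data_format` option should be checked against current Bright Data docs.

```javascript
// Minimal sketch of a Bright Data Web Unlocker request (Node.js 18+, built-in fetch).
const BRIGHT_DATA_TOKEN = process.env.BRIGHT_DATA_TOKEN; // your Web Unlocker token

const response = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${BRIGHT_DATA_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'web_unlocker1',            // placeholder: your Web Unlocker zone name
    url: 'https://www.trustpilot.com/review/example.com', // placeholder product page
    format: 'raw',                    // return the page body as-is
    data_format: 'markdown',          // ask for markdown output, per this workflow
  }),
});
const markdown = await response.text(); // handed to the LLM chain for cleanup
```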
How to customize this workflow to your needs

Target Multiple Products
- Configure the Bright Data input URL dynamically for different SaaS product TrustPilot URLs
- Loop through a product list and run parallel jobs for each

Customize Extraction Fields
- Update the prompt in the Information Extractor to include:
  - Review title
  - Response from company
  - Specific feature mentions
  - Competitor references

Tune Summarization Style
- **Change tone**: executive summary, customer pain-point focus, or marketing quote extract
- **Enable sentiment aggregation** (e.g., 30% negative, 50% neutral, 20% positive), as shown in the sketch after this list

Expand Output Destinations
- Push to Notion, Airtable, or CRM tools using additional webhook nodes
- Generate and send PDF reports (via PDFKit or HTML-to-PDF nodes)
- Schedule summary digests via Gmail or Slack
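If you enable sentiment aggregation, a small Code node placed after the Information Extractor could compute the percentage breakdown. This is a hedged sketch assuming each extracted item carries a `sentiment` field with values like "positive", "neutral", or "negative"; adjust the field name to match your extractor prompt.

```javascript
// n8n Code node: aggregate extracted review sentiments into percentages.
const items = $input.all();
const counts = { positive: 0, neutral: 0, negative: 0 };

for (const item of items) {
  const s = String(item.json.sentiment || '').toLowerCase(); // assumed field name
  if (s in counts) counts[s] += 1;
}

const total = items.length || 1; // avoid division by zero on empty batches
const pct = (n) => Math.round((n / total) * 100);

return [{
  json: {
    total_reviews: items.length,
    positive_pct: pct(counts.positive),
    neutral_pct: pct(counts.neutral),
    negative_pct: pct(counts.negative),
  },
}];
```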
by A Z
Automatically scrape Meta Threads for posts hiring specific roles (e.g. automation engineers, video editors, graphic designers), filter true hiring intent, deduplicate, and send alerts. We are taking automation roles as an example for now.

What it does

This workflow continuously scans Threads for fresh posts mentioning the roles you care about. It uses AI to filter out self-promotion and service ads, keeping only posts where the author is hiring. Qualified posts are saved into Google Sheets for tracking and sent to Telegram for instant alerts. It’s ideal for freelancers, agencies, and job seekers who want a steady radar of opportunities.

How it works (Step by Step)

1. Schedule trigger – runs on a set interval (e.g. every 12 hours).
2. Scrape Threads posts – fetches recent posts for multiple keywords (e.g., “n8n expert”, “hire video editor”, “graphic designer”) via Apify.
3. Merge results – combines posts into a single stream.
4. Normalize fields – maps raw data into clean fields: text, author, URL, timestamp, profile link.
5. AI filter – uses an AI Agent to:
   - Accept only posts where someone is hiring (rejects “hire me” style self-promo).
   - Apply simple geography rules (e.g., allow US, UK, UAE, CA; pass unknowns).
   - Exclude roles outside your scope.
6. Deduplication – checks Google Sheets to skip posts already seen (see the sketch at the end of this section).
7. Save to Google Sheets – writes qualified posts with full details.
8. Telegram alerts – sends you the matched post instantly so you can act.

Who it’s for

- Freelancers: get first dibs on gigs before others spot them.
- Agencies: build a client pipeline by tracking hiring signals.
- Job seekers: spot hidden opportunities in your target field.

Customization Ideas

- Swap keywords to monitor roles you care about (e.g., “UI/UX designer”, “motion graphics editor”, “copywriter”).
- Add Slack or Discord notifications instead of Telegram.
- Expand geo rules to match your region.
- Use Sheets as a CRM: add columns for status, outreach date, etc.
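The deduplication step can be as simple as comparing each scraped post's URL against the URLs already stored in the sheet. A hedged sketch of that Code node logic follows; "Read Sheet" is a placeholder name for your Google Sheets read node, and `url` is assumed to be the matching field on both sides.

```javascript
// n8n Code node: drop posts whose URL already exists in the Google Sheet.
// "Read Sheet" is a placeholder name for the preceding Google Sheets node.
const fresh = $input.all();
const existing = new Set(
  $('Read Sheet').all().map((row) => String(row.json.url || '').trim())
);

// Keep only posts we have not seen before.
return fresh.filter(
  (post) => !existing.has(String(post.json.url || '').trim())
);
```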
by A Z
Automatically scrape X (Twitter) for posts hiring specific roles (e.g., automation engineers, video editors, graphic designers), filter true hiring intent with AI, deduplicate in Google Sheets, and alert via Telegram.

What it does

- Pulls recent X/Twitter posts for multiple role keywords via Apify.
- Normalizes each post (text, author, links, location).
- Uses an AI Agent to keep only posts where the author is hiring (not self-promo).
- Checks Google Sheets for duplicates by URL before saving.
- Writes qualified posts to a sheet and sends a Telegram notification.

We are using n8n automation roles as the example here.

How it works (Step by Step)

1. Schedule Trigger – runs on an interval (currently every 12 hours).
2. Scrape X/Twitter – Apify tweet-scraper fetches up to 50 latest posts for keywords like: n8n developer, looking for n8n, n8n expert, hire AI automation, looking for AI automation.
3. Normalize Fields – Set node maps to: url, text, author.userName, author.url, author.location.
4. AI Filter & Dedupe Check
   - Accepts only clear hiring posts for n8n/AI automation roles (rejects self-promotion).
   - Queries Google Sheets to see if url already exists; duplicates are dropped.
5. Gate – IF node passes only non-empty AI outputs.
6. Parse JSON Safely – Code node extracts and validates JSON from the AI output (see the sketch after this section).
7. Save to Google Sheets – appends/updates a row (matching on url).
8. Telegram Alert – sends a message with the tweet URL, author, location, and text.

Who it’s for

Freelancers, agencies, and job seekers who want a steady radar of real hiring posts for their target roles.

Customization Ideas

- Swap keywords to track other roles (video editors, designers, copywriters, etc.).
- Add Slack/Discord notifications.
- Extend the AI rules (e.g., different geographies or role scopes).
- Treat the sheet as a mini-CRM (status, outreach date, notes).
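LLM output often wraps JSON in prose or code fences, so the "Parse JSON Safely" step defensively extracts the first JSON object it can find. A minimal sketch of that Code node follows; the `output` field name is an assumption about how the AI Agent node labels its response.

```javascript
// n8n Code node: safely extract and validate JSON from an LLM response.
const results = [];

for (const item of $input.all()) {
  const raw = String(item.json.output || ''); // assumed AI Agent output field

  // Strip markdown code fences and grab the first {...} block, if any.
  const cleaned = raw.replace(/[`]{3}(?:json)?/g, '');
  const match = cleaned.match(/\{[\s\S]*\}/);
  if (!match) continue; // no JSON found: drop the item rather than crash

  try {
    const parsed = JSON.parse(match[0]);
    if (parsed.url) results.push({ json: parsed }); // require the key we dedupe on
  } catch (e) {
    // Malformed JSON: skip; the Gate node already filters empty outputs.
  }
}

return results;
```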
by Murtaja Ziad
An n8n workflow designed to shorten URLs using the Dub.co API.

How it works:
- It shortens a URL using the Dub.co API, with the ability to use custom domains and projects.
- It updates the existing shortened URL if the slug has already been used.

Estimated Time: around 15 minutes.

Requirements:
- A Dub.co account.

Configuration:
- Configure the "API Auth" node to add your Dub.co API key, project slug, and the long URL.
- There are some extras you can configure too: click the "API Auth" node and fill in the fields.

Detailed Instructions:
- Sticky notes within the workflow provide extensive setup information and guidance.

Keywords: n8n workflow, dub.co, dub.sh, url shortener, short urls, short links
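For reference, the underlying call is a single authenticated POST to Dub's links endpoint. The sketch below is an approximation based on Dub's public API shape at the time of writing, not this workflow's exact node configuration; the domain, slug, and URL values are placeholders.

```javascript
// Minimal sketch of creating a short link via the Dub.co API (Node.js 18+).
const DUB_API_KEY = process.env.DUB_API_KEY; // your Dub.co API key

const res = await fetch('https://api.dub.co/links', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${DUB_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    url: 'https://example.com/some/very/long/path', // placeholder long URL
    domain: 'dub.sh',                               // or your custom domain
    key: 'my-slug',                                 // desired slug
  }),
});

if (!res.ok) {
  // A conflict here typically means the slug is taken; the workflow
  // handles that case by updating the existing link instead.
  console.error('Dub API error:', res.status, await res.text());
} else {
  const link = await res.json();
  console.log('Short link:', link.shortLink); // field name per Dub's API docs
}
```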
by Joey D’Anna
This template is an error handler that will log n8n workflow errors to a Monday.com board for troubleshooting and tracking.

Prerequisites

- Monday account and Monday credential
- Create a board on Monday for error logging, with the following columns and types:
  - Timestamp (text)
  - Error Message (text)
  - Stack Trace (long text)
- Determine the column IDs using Monday's instructions

Setup

1. Edit the Monday nodes to use your credential
2. Edit the node labeled CREATE ERROR ITEM to point to your error log board and group name
3. Edit the column IDs in the "Column Values" field of the UPDATE node to match the IDs of the fields on your error log board
4. To trigger error logging, select this automation as the error workflow on any automation
5. For more detailed logging, add Stop and Error nodes in your workflow to send specific error messages to your board
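Monday's API expects column values as a JSON object keyed by column ID, which is why step 3 matters. A hedged illustration of what the UPDATE node's "Column Values" field might contain is below; the column IDs (`text`, `text1`, `long_text`) are placeholders for whatever IDs your board reports, and the error strings are example data.

```json
{
  "text": "2024-05-01T12:34:56Z",
  "text1": "Cannot read properties of undefined (reading 'json')",
  "long_text": { "text": "Error: ...\n    at Object.execute (...)" }
}
```

Per Monday's column-values reference, text columns take a plain string while the long-text column type expects an object with a `text` property; check each column's type there if an update silently fails.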
by n8n Team
This workflow creates/updates ClickUp tasks when Notion database pages are created/updated. Each field in the Notion database is mapped to a ClickUp property. The Notion database requires setup before the workflow can be used; see the list of fields in the setup below.

Prerequisites

- Notion account and Notion credentials.
- ClickUp account and ClickUp credentials.

How it works

When a new database page is created in Notion, the workflow creates a new task in ClickUp with all required fields. The new ClickUp task's ID is saved in the Notion database page's "ClickUp ID" field. Then, when the database page is updated in Notion, the workflow updates the specific ClickUp task identified by the "ClickUp ID" field in Notion.

Setup

This workflow requires that you set up a Notion database. To do so, follow the steps below:

1. In Notion, create a new database.
2. Add the following columns to the database:
   - Task name (renamed from "Name")
   - Status (with type "Select" and the following options: "to do", "in progress", "review", "revision", "complete")
   - Deadline (with type "Date")
   - ClickUp ID (with type "Text")
   - Add any other fields you require.
3. Share the database with n8n.

By default, the workflow fills all the fields listed above, except for any additional fields you add.
by Karol
How it works

This workflow automates publishing content from any RSS feed directly to Facebook and Instagram. It reads new RSS entries, extracts the article content, generates a short social-media-friendly summary using an AI model, and then creates an AI-generated image based on the topic. The post is uploaded to Facebook and Instagram (via Graph API) and logged in Google Sheets for reference. Finally, a Telegram bot sends you a notification with links to the published posts.

Set up steps

1. Insert your RSS feed URL in the RSS Feed Trigger node.
2. Configure Google Sheets credentials and replace the example sheet with your own.
3. In Supabase Config, insert your Supabase URL and bucket name.
4. In the Facebook/Instagram nodes, replace [INSERT_YOUR_SITE_ID] with your own page or account ID.
5. Connect your Facebook Graph API credentials (remove hardcoded tokens).
6. Connect your OpenAI / Anthropic / Gemini credentials for text and image generation.
7. Set up your Telegram Bot credentials if you want to receive notifications.

Notes

- Sticky notes inside the workflow explain each section (RSS trigger, filtering, content generation, posting, logging, notifications).
- No credentials are saved in the template; you must connect your own before running.
- All generated content (text + images) is fully automated but can be customized (e.g. change AI prompts for your preferred style).
by Jimleuk
This n8n template is one of a 3-part series exploring use-cases for clustering vector embeddings:

- Survey Insights
- Customer Insights
- Community Insights

This template demonstrates the Survey Insights scenario, where survey participant responses can be quickly grouped by similarity and an AI agent can generate insights on those groupings. With this workflow, researchers can save days or even weeks of work breaking down cohorts of participants and identifying frequently mentioned positives and negatives.

Sample Output: https://docs.google.com/spreadsheets/d/e/2PACX-1vT6m8XH8JWJTUAfwojc68NAUGC7q0lO7iV738J7aO5fuVjiVzdTRRPkMmT1C4N8TwejaiT0XrmF1Q48/pubhtml#

How it works

1. All survey questions and responses are imported from a Google Sheet.
2. Responses are then inserted into a Qdrant collection, carefully tagged with the question and survey metadata.
3. For each question, all relevant responses are put through a clustering algorithm using the Python Code node. The Qdrant points are returned in clustered groups.
4. Each group is looped over to fetch the payloads of the points, which are fed to the AI agent to summarise and generate insights for.
5. The resulting insights and raw responses are then saved to the Google Spreadsheet for further analysis by the researcher.

Requirements

- Survey data and format as shown in the attached Google Sheet.
- Qdrant vector store for storing embeddings.
- OpenAI account for embeddings and LLM.

Customising the Template

- Adjust clustering parameters which make sense for your data.
- Use more clusters for open-ended questions and fewer clusters when responses are multiple choice.
by Aditya Gaur
Who is this template for?

This template is designed for teams who need to automate data retrieval from SharePoint lists using n8n. It is ideal for users who want to authenticate via OAuth and then use the token to access SharePoint API endpoints, pulling list data directly into n8n.

How it works

The template first generates an OAuth token using the Microsoft OAuth API. This token is then used to authenticate requests to the SharePoint List API, allowing the workflow to fetch data from a specified SharePoint list. By following the n8n workflow, the user can configure the necessary credentials and endpoints to automate SharePoint data access securely.

Setup steps

1. Replace {tenant_id}, {client_id}, and {client_secret} with your Azure AD details for OAuth authentication.
2. Specify the SharePoint list API endpoint in the template (under the "SharePoint List Fetch" node).
3. Configure the SharePoint list URL and make adjustments for specific data fields if necessary.
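The two HTTP calls behind this template look roughly like the sketch below: a client-credentials token request against Azure AD, followed by a list query with the bearer token. This is a hedged outline using Microsoft's documented endpoints; the tenant host, site path, and list title are placeholders.

```javascript
// Sketch: acquire an OAuth token, then fetch SharePoint list items (Node.js 18+).
const tenantId = process.env.TENANT_ID;         // your {tenant_id}
const clientId = process.env.CLIENT_ID;         // your {client_id}
const clientSecret = process.env.CLIENT_SECRET; // your {client_secret}

// Step 1: client-credentials token from Azure AD.
const tokenRes = await fetch(
  `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      grant_type: 'client_credentials',
      client_id: clientId,
      client_secret: clientSecret,
      scope: 'https://contoso.sharepoint.com/.default', // placeholder tenant host
    }),
  }
);
const { access_token } = await tokenRes.json();

// Step 2: query the SharePoint list via the REST API.
const listRes = await fetch(
  "https://contoso.sharepoint.com/sites/MySite/_api/web/lists/getbytitle('MyList')/items",
  {
    headers: {
      Authorization: `Bearer ${access_token}`,
      Accept: 'application/json;odata=nometadata',
    },
  }
);
const { value: items } = await listRes.json();
console.log(`Fetched ${items.length} list items`);
```

Note that whether SharePoint accepts a secret-based app-only token depends on your tenant's app-only settings; some tenants require certificate credentials instead, so treat this as a starting point rather than a guaranteed configuration.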
by Incrementors
🛒 Lead Workflow: Yelp & Trustpilot Scraping + OpenAI Analysis via BrightData

> Description: Automated lead generation workflow that scrapes business data from Yelp and Trustpilot based on location and category, analyzes credibility, and sends personalized outreach emails using AI.

> ⚠️ Important: This template requires a self-hosted n8n instance to run.

📋 Overview

This workflow provides an automated lead generation solution that identifies high-quality prospects from Yelp and Trustpilot, analyzes their credibility through reviews, and sends personalized outreach emails. Perfect for digital marketing agencies, sales teams, and business development professionals.

✨ Key Features

- 🎯 **Smart Location Analysis** - AI breaks down cities into sub-locations for comprehensive coverage
- 🛍 **Yelp Integration** - scrapes business details using BrightData's Yelp dataset
- ⭐ **Trustpilot Verification** - validates business credibility through review analysis
- 📊 **Data Storage** - automatically saves results to Google Sheets
- 🤖 **AI-Powered Outreach** - generates personalized emails using Claude AI
- 📧 **Automated Sending** - sends emails directly through Gmail integration

🔄 How It Works

1. User Input: submit location, country, and business category through a form
2. AI Location Analysis: Gemini AI identifies sub-locations within the specified area
3. Yelp Scraping: BrightData extracts business information from multiple locations
4. Data Processing: cleans and stores business details in Google Sheets
5. Trustpilot Verification: scrapes reviews and company details for a credibility check
6. Email Generation: Claude AI creates personalized outreach messages
7. Automated Outreach: sends emails to qualified prospects via Gmail

📊 Data Output

| Field | Description | Example |
|---------------|----------------------------------|----------------------------------|
| Company Name | Business name from Yelp/Trustpilot | Best Local Restaurant |
| Website | Company website URL | https://example-restaurant.com |
| Phone Number | Business contact number | (555) 123-4567 |
| Email | Business email address | demo@example.com |
| Address | Physical business location | 123 Main St, City, State |
| Rating | Overall business rating | 4.5/5 |
| Categories | Business categories/tags | Restaurant, Italian, Fine Dining |

🚀 Setup Instructions

⏱️ Estimated Setup Time: 10-15 minutes

Prerequisites

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- BrightData account with Yelp and Trustpilot datasets
- Google Gemini API access
- Anthropic API key for Claude
- Gmail account for sending emails

Step 1: Import the Workflow

1. Copy the JSON workflow code
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

Step 2: Configure Google Sheets Integration

1. Create two Google Sheets:
   - Yelp data: Name, Categories, Website, Address, Phone, URL, Rating
   - Trustpilot data: Company Name, Email, Phone Number, Address, Rating, Company About
2. Copy the Sheet IDs from their URLs
3. In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
4. Complete the OAuth setup and test the connection
5. Update all Google Sheets nodes with your Sheet IDs

Step 3: Configure BrightData

1. Set up BrightData credentials in n8n
2. Replace the API token with: BRIGHT_DATA_API_KEY
3. Verify dataset access (a standalone trigger-call sketch appears at the end of this template):
   - Yelp dataset: gd_lgugwl0519h1p14rwk
   - Trustpilot dataset: gd_lm5zmhwd2sni130p
4. Test the connections

Step 4: Configure AI Models

- **Google Gemini (Location Analysis)**
  - Add Google Gemini API credentials
  - Configure model: models/gemini-1.5-flash
- **Claude AI (Email Generation)**
  - Add Anthropic API credentials
  - Configure model: claude-sonnet-4-20250514

Step 5: Configure Gmail Integration

1. Set up Gmail OAuth2 credentials in n8n
2. Update the "Send Outreach Email" node
3. Test email sending

Step 6: Test & Activate

1. Activate the workflow
2. Test with sample data:
   - Country: United States
   - Location: Dallas
   - Category: Restaurants
3. Verify data appears in Google Sheets
4. Check that emails are generated and sent

📖 Usage Guide

Starting a Lead Generation Campaign

1. Access the form trigger URL
2. Enter your target criteria:
   - Country: target country
   - Location: city or region
   - Category: business type (e.g., restaurants)
3. Submit the form to start the process

Monitoring Results

- **Yelp Data Sheet:** view scraped business information
- **Trustpilot Sheet:** review credibility data
- **Gmail Sent Items:** track outreach emails sent

🔧 Customization Options

Modifying Email Templates

Edit the "AI Generate Email Content" node to customize:
- Email tone and style
- Services mentioned
- Call-to-action messages
- Branding elements

Adjusting Data Filters

- Modify rating thresholds
- Set minimum review counts
- Add geographic restrictions
- Filter by business size

Scaling the Workflow

- Increase batch sizes
- Add delays between requests
- Use parallel processing
- Add error handling

🚨 Troubleshooting

Common Issues & Solutions

1. BrightData Connection Failed
   - Cause: invalid API credentials or dataset access
   - Solution: verify credentials and dataset permissions
2. No Data Extracted
   - Cause: invalid location or changed page structure
   - Solution: verify location names and test other categories
3. Gmail Authentication Issues
   - Cause: expired OAuth tokens
   - Solution: re-authenticate and check permissions
4. AI Model Errors
   - Cause: API quota exceeded or invalid keys
   - Solution: check usage limits and API key

Performance Optimization

- **Rate Limiting:** add delays
- **Error Handling:** retry failed requests
- **Data Validation:** check for malformed data
- **Memory Management:** process in smaller batches

📈 Use Cases & Examples

1. Digital Marketing Agency Lead Generation
   - **Goal:** find businesses needing marketing
   - **Target:** restaurants, retail stores
   - **Approach:** focus on well-rated businesses with a weak online presence
2. B2B Sales Prospecting
   - **Goal:** find software solution clients
   - **Target:** growing businesses
   - **Approach:** focus on recent positive reviews
3. Partnership Development
   - **Goal:** find complementary businesses
   - **Target:** established businesses
   - **Approach:** focus on reputation and satisfaction scores

⚡ Performance & Limits

Expected Performance

- **Processing Time:** 5-10 minutes/location
- **Data Accuracy:** 90%+
- **Success Rate:** 85%+
- **Daily Capacity:** 100-500 leads

Resource Usage

- **API Calls:** ~10-20 per business
- **Storage:** minimal (Google Sheets)
- **Execution Time:** 3-8 minutes/10 businesses
- **Network Usage:** ~5-10 MB/business

🤝 Support & Community

Getting Help

- **n8n Community Forum:** community.n8n.io
- **Docs:** docs.n8n.io
- **BrightData Support:** via dashboard

Contributing

- Share improvements
- Report issues and suggestions
- Create industry-specific variations
- Document best practices

> 🔒 Privacy & Compliance: Ensure GDPR/CCPA compliance. Always respect robots.txt and the terms of service of scraped sites.

🎯 Ready to Generate Leads!

This workflow provides a complete solution for automated lead generation and outreach. Customize it to fit your needs and start building your pipeline today!

For any questions or support, please contact: 📧 info@incrementors.com or fill out this form: Contact Us
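To verify dataset access (Step 3 above) outside n8n, you can trigger a BrightData dataset collection directly. The sketch below reflects BrightData's dataset-trigger API shape at the time of writing and is illustrative only; the input fields and polling behavior are assumptions you should check against your dataset's schema and the BrightData docs.

```javascript
// Sketch: trigger a BrightData Yelp dataset run and poll for the snapshot (Node.js 18+).
const TOKEN = process.env.BRIGHT_DATA_API_KEY;
const DATASET_ID = 'gd_lgugwl0519h1p14rwk'; // Yelp dataset ID from this template

// Kick off the collection with one search input (fields depend on the dataset schema).
const trigger = await fetch(
  `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}&format=json`,
  {
    method: 'POST',
    headers: { Authorization: `Bearer ${TOKEN}`, 'Content-Type': 'application/json' },
    body: JSON.stringify([{ location: 'Dallas', category: 'Restaurants' }]),
  }
);
const { snapshot_id } = await trigger.json();

// Poll until the snapshot is ready, then read the results.
let data;
for (;;) {
  const res = await fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshot_id}?format=json`,
    { headers: { Authorization: `Bearer ${TOKEN}` } }
  );
  if (res.status === 200) { data = await res.json(); break; }
  await new Promise((r) => setTimeout(r, 10_000)); // snapshot still building
}
console.log(`Received ${data.length} business records`);
```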
by Kumar Shivam
Complete AI Product Description Generator

Transforms product images into high-converting copy with GPT-4o Vision + Claude 3.5.

The Shopify AI Product Description Factory is a production-grade n8n workflow that converts product images and metadata into refined, SEO-aware descriptions, fully automated and region-agnostic. It blends GPT-4o vision for visible attribute extraction, Claude 3.5 Sonnet for premium copy, Perplexity research for verified brand context, and Google Sheets for orchestration and audit trails, plus automated daily sales analytics enrichment. Link-header pagination and structured output enforcement ensure reliable scale.

To refine this for your use case, connect via my profile @connect.

Key Advantages

- Vision-first copywriting: uses gpt-4o to identify only visible physical attributes (closure, heel, materials, sole) from product images; no guesses.
- Premium copy generation: anthropic/claude-3.5-sonnet crafts concise, benefit-led descriptions with consistent tone, length control, and clean formatting.
- Research-assisted accuracy: perplexityTool verifies vendor/brand context from official sources to avoid speculation or fabricated claims.
- Pagination you can trust: automates Shopify REST pagination via Link headers and persists page_info for resumable runs (see the sketch after this section).
- Google Sheets orchestration: centralized staging, status tracking, and QA in Products, with ProcessingState for batch/page markers and Error_log for diagnostics.
- Bulletproof error feedback: errorTrigger + AI diagnosis logs clear non-technical and technical explanations to Error_log for fast recovery.
- Automated sales analytics: daily sales tracking automatically captures and enriches total sales data for comprehensive business intelligence and performance monitoring.

How It Works

Intake and filtering

- httpRequest fetches /admin/api/2024-04/products.json?limit=200&{page_info}
- code filters only items with:
  - An image present
  - Empty body_html
  - The currSeas:SS2025 tag
- Extracts tag metadata such as x-styleCode, country_of_origin, and gender when available

Pagination controller

- code parses Link headers for rel="next" and extracts page_info
- googleSheets updates ProcessingState with page_info_next and increments the batch number for resumable polling

Generation pipeline

- googleSheets pulls rows with Status = Ready for AI Description; limit throttles batch size
- openAi Analyze image (model gpt-4o) returns strictly visible features
- lmChatOpenRouter (Claude 3.5) composes the SEO description, optionally blending verified vendor context from perplexityTool
- outputParserStructured guarantees strict JSON: product_id, product_title (normalized), generated_description, status
- googleSheets writes results back to Products for review/publish

Sales analytics enrichment

- **Schedule Trigger** runs daily at 2:01 PM to capture the previous day's sales
- httpRequest fetches paid orders from the Shopify REST API with date-range filtering
- splitOut and summarize nodes calculate total daily sales
- Automatic Google Sheets logging with date stamps and totals
- Zero-sale days are properly recorded for complete analytics continuity

Reliability and insight

- errorTrigger routes failures to an AI agent that explains the root cause and appends a concise note to Error_log.
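The pagination controller's core job is extracting page_info from Shopify's Link response header. A minimal sketch of that Code node logic is below; it assumes the preceding HTTP Request node is configured to include response headers in its output (the "Full Response" option), which is an assumption about this template's node settings rather than a documented detail of it.

```javascript
// n8n Code node: pull the next page_info token from Shopify's Link header.
// Example header value:
// <https://shop.myshopify.com/admin/api/2024-04/products.json?limit=200&page_info=abc123>; rel="next"
const linkHeader = $input.first().json.headers?.link || '';

// Grab the URL marked rel="next", if any.
const nextMatch = linkHeader.match(/<([^>]+)>;\s*rel="next"/);

let pageInfoNext = null;
if (nextMatch) {
  const url = new URL(nextMatch[1]);
  pageInfoNext = url.searchParams.get('page_info');
}

// Persist to ProcessingState downstream; null means the run is complete.
return [{ json: { page_info_next: pageInfoNext } }];
```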
What's Inside (Node Map)

Data + API

- httpRequest (Shopify REST 2024-04 for products and orders)
- googleSheets (multiple sheet operations)
- googleSheetsTool (error logging)

AI models

- openAi (gpt-4o vision analysis)
- lmChatOpenRouter (anthropic/claude-3.5-sonnet for content generation)
- **AI Agent** (intelligent error diagnosis)

Analytics & Processing

- splitOut (order data processing)
- summarize (sales totals calculation)
- set nodes (data field mapping)

Tools and guards

- perplexityTool (brand research)
- outputParserStructured (JSON validation)
- memoryBufferWindow (conversation context)

Control & Scheduling

- scheduleTrigger (multiple time-based triggers)
- cron (periodic execution)
- limit (batch size control)
- if (conditional logic)
- code (custom filtering and pagination logic)

Observability

- errorTrigger + AI diagnosis to Error_log
- Processing state tracking
- Sales analytics logging

Content & Compliance Rules

- **Locale-agnostic copy**; brand voice is configurable per store
- **Only image-verifiable attributes** (no guesses); clean HTML suitable for Shopify themes
- Optional normalization rules (e.g., color/branding cleanup, title sanitization)
- Style code inclusion supported when x-styleCode is present
- Gender-aware content generation when the gender tag is present
- **Strict JSON output** and schema consistency for safe downstream publishing

Setup Steps

Core integrations

- **Shopify Access Token** - Products read + Orders read (REST 2024-04)
- **OpenAI API** - gpt-4o vision
- **OpenRouter API** - Claude 3.5 Sonnet
- **Perplexity API** - vendor/market verification via perplexityTool
- **Google Sheets OAuth** - Products, ProcessingState, Error_log, Sales analytics

Configure sheets

- **ProcessingState** with fields:
  - batch number
  - page_info_next
- **Products** with:
  - Product ID
  - Product Title
  - Product Type
  - Vendor
  - Image url
  - Status
  - country of origin
  - x_style_code
  - gender
  - Generated Description
- **Error_log** with:
  - timestamp
  - Reason of Error
- **Sales Analytics Sheet** with:
  - Date
  - Total Sales

Workflow Capabilities

- Discovery and staging: auto-paginate Shopify; stage eligible products in Sheets with reasons and timestamps.
- Vision-grounded copywriting: descriptions reflect only visible attributes plus verified brand context; concise, mobile-friendly structure with gender-aware tone.
- Metadata awareness: auto-injects x-styleCode, country_of_origin, and gender when present; natural SEO for brand and product type.
- Sales intelligence: automated daily sales tracking with Melbourne timezone support; handles zero-sale days and maintains complete historical records.
- Error analytics: layman + technical diagnosis logged to Error_log to shorten MTTR.
- Safe output: structured JSON via outputParserStructured for predictable row updates.
Credentials Required

- **Shopify Access Token** (Products + Orders read permissions)
- **OpenAI API Key** (GPT-4o vision)
- **OpenRouter API Key** (Claude Sonnet)
- **Perplexity API Key**
- **Google Sheets OAuth**

Ideal For

- **E-commerce teams** scaling compliant, on-brand product copy with comprehensive sales insights
- **Agencies and SEO specialists** standardizing image-grounded descriptions with performance tracking and analytics
- **Stores** needing resumable pagination, auditable content operations, and automated daily sales reporting in Sheets

Advanced Features

- **Dual-workflow architecture**: content generation + sales analytics in one system
- Link-header pagination with page_info persistence in ProcessingState
- Title/content normalization (e.g., color removal) configurable per brand
- **Gender-aware copywriting** based on product tags
- Memory windows (memoryBufferWindow) to keep multi-step prompts consistent
- **Melbourne timezone support** for accurate daily sales cutoffs
- **Zero-sales handling** ensures complete analytics continuity
- Structured output enforcement for downstream safety
- **AI-powered error diagnosis** with technical and layman explanations

Time & Scheduling (Universal)

The workflow includes two independent schedules:

- **Content Generation**: every 5 minutes (configurable) for product processing
- **Sales Analytics**: daily at 2:01 PM Melbourne time for the previous day's sales

For globally distributed teams, schedule triggers and timestamps can be standardized on UTC to avoid regional drift.

Pro Tip

Start with small batches (limit set to 10 or fewer) to validate both copy generation and sales tracking flows. The workflow handles the two operations independently: content generation failures won't affect sales analytics and vice versa. Monitor the Error_log sheet for any issues and use the ProcessingState sheet to track pagination progress.