by vinci-king-01
# Customer Support Analysis Dashboard with AI and Automated Insights

## Target Audience
- Customer support managers and team leads
- Customer success teams monitoring satisfaction
- Product managers analyzing user feedback
- Business analysts measuring support metrics
- Operations managers optimizing support processes
- Quality assurance teams monitoring support quality
- Customer experience (CX) professionals

## Problem Statement
Manual analysis of customer support tickets and feedback is time-consuming and often misses critical patterns or emerging issues. This template solves the challenge of automatically collecting, analyzing, and visualizing customer support data to identify trends, improve response times, and enhance overall customer satisfaction.

## How it Works
This workflow automatically monitors customer support channels using AI-powered analysis, processes tickets and feedback, and provides actionable insights for improving customer support operations.

### Key Components
1. **Scheduled Trigger** - Runs the workflow at specified intervals to maintain real-time monitoring
2. **AI-Powered Ticket Analysis** - Uses advanced NLP to categorize, prioritize, and analyze support tickets
3. **Multi-Channel Integration** - Monitors email, chat, help desk systems, and social media
4. **Automated Insights** - Generates reports on trends, response times, and satisfaction scores
5. **Dashboard Integration** - Stores all data in Google Sheets for comprehensive analysis and reporting

## Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the ticket was processed | "2024-01-15T10:30:00Z" |
| ticket_id | String | Unique ticket identifier | "SUP-2024-001234" |
| customer_email | String | Customer contact information | "john@example.com" |
| subject | String | Ticket subject line | "Login issues with new app" |
| description | String | Full ticket description | "I can't log into the mobile app..." |
| category | String | AI-categorized ticket type | "Technical Issue" |
| priority | String | Calculated priority level | "High" |
| sentiment_score | Number | Customer sentiment (-1 to 1) | -0.3 |
| urgency_indicator | String | Urgency classification | "Immediate" |
| response_time | Number | Time to first response (hours) | 2.5 |
| resolution_time | Number | Time to resolution (hours) | 8.0 |
| satisfaction_score | Number | Customer satisfaction rating | 4.2 |
| agent_assigned | String | Support agent name | "Sarah Johnson" |
| status | String | Current ticket status | "Resolved" |

## Setup Instructions
Estimated setup time: 20-25 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Help desk system API access (Zendesk, Freshdesk, etc.)
- Email service integration (optional)

### Step-by-Step Configuration
1. **Install Community Nodes** - Install the required community nodes: `npm install n8n-nodes-scrapegraphai` and `npm install n8n-nodes-slack`
2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Grant necessary permissions for spreadsheet access
   - Create a new spreadsheet for customer support analysis
   - Configure the sheet name (default: "Support Analysis")
4. **Configure Support System Integration**
   - Update the websiteUrl parameters in ScrapeGraphAI nodes
   - Add URLs for your help desk system or support portal
   - Customize the user prompt to extract specific ticket data
   - Set up categories and priority thresholds
5. **Set up Notification Channels**
   - Configure Slack webhook or API credentials for alerts
   - Set up email service credentials for critical issues
   - Define alert thresholds for different priority levels
   - Test notification delivery
6. **Configure Schedule Trigger**
   - Set analysis frequency (hourly, daily, etc.)
   - Choose appropriate time zones for your business hours
   - Consider support system rate limits
7. **Test and Validate**
   - Run the workflow manually to verify all connections
   - Check Google Sheets for proper data formatting
   - Test ticket analysis with sample data

## Workflow Customization Options

### Modify Analysis Targets
- Add or remove support channels (email, chat, social media)
- Change ticket categories and priority criteria
- Adjust analysis frequency based on ticket volume

### Extend Analysis Capabilities
- Add more sophisticated sentiment analysis
- Implement customer churn prediction models
- Include agent performance analytics
- Add automated response suggestions

### Customize Alert System
- Set different thresholds for different ticket types
- Create tiered alert systems (info, warning, critical)
- Add SLA breach notifications
- Include trend analysis alerts

### Output Customization
- Add data visualization and reporting features
- Implement support trend charts and graphs
- Create executive dashboards with key metrics
- Add customer satisfaction trend analysis

## Use Cases
- **Support Ticket Management**: Automatically categorize and prioritize tickets
- **Response Time Optimization**: Identify bottlenecks in support processes
- **Customer Satisfaction Monitoring**: Track and improve satisfaction scores
- **Agent Performance Analysis**: Monitor and improve agent productivity
- **Product Issue Detection**: Identify recurring problems and feature requests
- **SLA Compliance**: Ensure support teams meet service level agreements

## Important Notes
- Respect support system API rate limits and terms of service
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your analysis parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and GDPR compliance for customer data

## Troubleshooting
Common Issues:
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Ticket parsing errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust analysis frequency and implement delays
- **Alert delivery failures**: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Help desk system API documentation
- Customer support analytics best practices
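The `priority` and `urgency_indicator` columns are computed inside a Code node. A minimal sketch of what that logic might look like, assuming a hypothetical `classifyTicket` helper, an illustrative keyword list, and made-up thresholds (the template's exact values may differ):

```javascript
// Hypothetical sketch of the priority/urgency logic in a Code node.
// Keyword list and sentiment thresholds are illustrative assumptions.
const URGENT_KEYWORDS = ['outage', 'down', "can't log", 'data loss'];

function classifyTicket(ticket) {
  const text = `${ticket.subject} ${ticket.description}`.toLowerCase();
  const hasUrgentKeyword = URGENT_KEYWORDS.some((k) => text.includes(k));

  // Strongly negative sentiment or urgent wording escalates priority.
  let priority = 'Low';
  if (ticket.sentiment_score <= -0.5 || hasUrgentKeyword) priority = 'High';
  else if (ticket.sentiment_score <= 0) priority = 'Medium';

  const urgency_indicator = priority === 'High' ? 'Immediate' : 'Standard';
  return { ...ticket, priority, urgency_indicator };
}

// Example using the sample row from the column table.
const result = classifyTicket({
  subject: 'Login issues with new app',
  description: "I can't log into the mobile app...",
  sentiment_score: -0.3,
});
```

Adjust the keyword list and thresholds in step 4 of the setup to match your own priority criteria.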
by vinci-king-01
# Competitor Price Monitoring Dashboard with AI and Real-time Alerts

## Target Audience
- E-commerce managers and pricing analysts
- Retail business owners monitoring competitor pricing
- Marketing teams tracking market positioning
- Product managers analyzing competitive landscape
- Data analysts conducting pricing intelligence
- Business strategists making pricing decisions

## Problem Statement
Manual competitor price monitoring is inefficient and often leads to missed opportunities or delayed responses to market changes. This template solves the challenge of automatically tracking competitor prices, detecting significant changes, and providing actionable insights for strategic pricing decisions.

## How it Works
This workflow automatically monitors competitor product prices using AI-powered web scraping, analyzes price trends, and sends real-time alerts when significant changes are detected.

### Key Components
1. **Scheduled Trigger** - Runs the workflow at specified intervals to maintain up-to-date price data
2. **AI-Powered Scraping** - Uses ScrapeGraphAI to intelligently extract pricing information from competitor websites
3. **Price Analysis Engine** - Processes historical data to detect trends and anomalies
4. **Alert System** - Sends notifications via Slack and email when price changes exceed thresholds
5. **Dashboard Integration** - Stores all data in Google Sheets for comprehensive analysis and reporting

## Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the price was recorded | "2024-01-15T10:30:00Z" |
| competitor_name | String | Name of the competitor | "Amazon" |
| product_name | String | Product name and model | "iPhone 15 Pro 128GB" |
| current_price | Number | Current price in USD | 999.00 |
| previous_price | Number | Previous recorded price | 1099.00 |
| price_change | Number | Absolute price difference | -100.00 |
| price_change_percent | Number | Percentage change | -9.09 |
| product_url | URL | Direct link to product page | "https://amazon.com/iphone15" |
| alert_triggered | Boolean | Whether alert was sent | true |
| trend_direction | String | Price trend analysis | "Decreasing" |

## Setup Instructions
Estimated setup time: 15-20 minutes

### Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for alerts (optional)

### Step-by-Step Configuration
1. **Install Community Nodes** - Install the required community nodes: `npm install n8n-nodes-scrapegraphai` and `npm install n8n-nodes-slack`
2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Grant necessary permissions for spreadsheet access
   - Create a new spreadsheet for price monitoring data
   - Configure the sheet name (default: "Price Monitoring")
4. **Configure Competitor URLs**
   - Update the websiteUrl parameters in ScrapeGraphAI nodes
   - Add URLs for each competitor you want to monitor
   - Customize the user prompt to extract specific pricing data
   - Set appropriate price thresholds for alerts
5. **Set up Notification Channels**
   - Configure Slack webhook or API credentials
   - Set up email service credentials (SendGrid, SMTP, etc.)
   - Define alert thresholds and notification preferences
   - Test notification delivery
6. **Configure Schedule Trigger**
   - Set monitoring frequency (hourly, daily, etc.)
   - Choose appropriate time zones for your business hours
   - Consider competitor website rate limits
7. **Test and Validate**
   - Run the workflow manually to verify all connections
   - Check Google Sheets for proper data formatting
   - Test alert notifications with sample data

## Workflow Customization Options

### Modify Monitoring Targets
- Add or remove competitor websites
- Change product categories or specific products
- Adjust monitoring frequency based on market volatility

### Extend Price Analysis
- Add more sophisticated trend analysis algorithms
- Implement price prediction models
- Include competitor inventory and availability tracking

### Customize Alert System
- Set different thresholds for different product categories
- Create tiered alert systems (info, warning, critical)
- Add SMS notifications for urgent price changes

### Output Customization
- Add data visualization and reporting features
- Implement price history charts and graphs
- Create executive dashboards with key metrics

## Use Cases
- **Dynamic Pricing**: Adjust your prices based on competitor movements
- **Market Intelligence**: Understand competitor pricing strategies
- **Promotion Planning**: Time your promotions based on competitor actions
- **Inventory Management**: Optimize stock levels based on market conditions
- **Customer Communication**: Proactively inform customers about price changes

## Important Notes
- Respect competitor websites' terms of service and robots.txt
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your monitoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider legal implications of automated price monitoring

## Troubleshooting
Common Issues:
- **ScrapeGraphAI connection errors**: Verify API key and account status
- **Google Sheets permission errors**: Check OAuth2 scope and permissions
- **Price parsing errors**: Review the Code node's JavaScript logic
- **Rate limiting**: Adjust monitoring frequency and implement delays
- **Alert delivery failures**: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Slack API documentation for notification setup
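The price-comparison step in the Code node boils down to a few lines of arithmetic. A hedged sketch, where the field names mirror the sheet columns above but the function name and alert threshold are illustrative assumptions:

```javascript
// Illustrative sketch of the price analysis step; output keys match the
// Google Sheets columns, thresholds are assumptions.
function analyzePrice(currentPrice, previousPrice, alertThresholdPercent) {
  const price_change = +(currentPrice - previousPrice).toFixed(2);
  const price_change_percent = +((price_change / previousPrice) * 100).toFixed(2);

  let trend_direction = 'Stable';
  if (price_change < 0) trend_direction = 'Decreasing';
  else if (price_change > 0) trend_direction = 'Increasing';

  return {
    price_change,
    price_change_percent,
    // Alert fires when the absolute percentage move exceeds the threshold.
    alert_triggered: Math.abs(price_change_percent) >= alertThresholdPercent,
    trend_direction,
  };
}

// Example roughly matching the table's sample row: 1099.00 -> 999.00
// with a 5% alert threshold.
const row = analyzePrice(999.0, 1099.0, 5);
```

Per-category thresholds (see "Customize Alert System") amount to passing a different `alertThresholdPercent` per product category.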
by WeblineIndia
# IPA Size Tracker with Trend Alerts – Automated iOS App Size Monitoring

This workflow runs on a daily schedule and monitors IPA file sizes from configured URLs. It stores historical size data in Google Sheets, compares current vs. previous builds, and sends email alerts only when significant size changes occur (default: ±10%). A DRY_RUN toggle allows safe testing before real notifications go out.

## Who's it for
- iOS developers tracking app binary size growth over time.
- DevOps teams monitoring build artifacts and deployment sizes.
- Product managers ensuring app size budgets remain acceptable.
- QA teams detecting unexpected size changes in release builds.
- Mobile app teams optimizing user experience by keeping apps lightweight.

## How it works
1. **Schedule Trigger** (daily at 09:00 UTC) kicks off the workflow.
2. **Configuration**: Define monitored apps with `{name, version, build, ipa_url}`.
3. **HTTP Request** downloads the IPA file from its URL.
4. **Size Calculation**: Compute file sizes in bytes, KB, MB and attach timestamp metadata.
5. **Google Sheets**: Append size data to the IPA Size History sheet.
6. **Trend Analysis**: Compare current vs. previous build sizes.
7. **Alert Logic**: Evaluate thresholds (>10% increase or >10% decrease).
8. **Email Notification**: Send formatted alerts with comparisons and trend indicators.
9. **Rate Limit**: Space out notifications to avoid spamming recipients.

## How to set up

### 1. Spreadsheet
Create a Google Sheet with a tab named IPA Size History containing: Date, Timestamp, App_Name, Version, Build_Number, Size_Bytes, Size_KB, Size_MB, IPA_URL

### 2. Credentials
- **Google Sheets (OAuth)** – for reading/writing size history.
- **Gmail** – for sending alert emails (use an App Password if 2FA is enabled).

### 3. Open the "Set: Configuration" node
Define your workflow variables:
- APP_CONFIGS = array of monitored apps (`{name, version, build, ipa_url}`)
- SPREADSHEET_ID = Google Sheet ID
- SHEET_NAME = IPA Size History
- SMTP_FROM = sender email (e.g., devops@company.com)
- ALERT_RECIPIENTS = comma-separated emails
- SIZE_INCREASE_THRESHOLD = 0.10 (10%)
- SIZE_DECREASE_THRESHOLD = 0.10 (10%)
- LARGE_APP_WARNING = 300 (MB)
- SCHEDULE_TIME = 09:00
- TIMEZONE = UTC
- DRY_RUN = false (set true to test without sending emails)

### 4. File Hosting
Host IPA files on Google Drive, Dropbox, or a web server. Ensure direct download URLs are used (not preview links).

### 5. Activate the workflow
Once configured, it will run automatically at the scheduled time.

## Requirements
- Google Sheet with the IPA Size History tab.
- Accessible IPA file URLs.
- SMTP / Gmail account (Gmail recommended).
- n8n (cloud or self-hosted) with Google Sheets + Email nodes.
- Sufficient local storage for IPA file downloads.

## How to customize the workflow
- **Multiple apps**: Add more configs to APP_CONFIGS.
- **Thresholds**: Adjust SIZE_INCREASE_THRESHOLD / SIZE_DECREASE_THRESHOLD.
- **Notification templates**: Customize subject/body with variables: `{{app_name}}`, `{{current_size}}`, `{{previous_size}}`, `{{change_percent}}`, `{{trend_status}}`.
- **Schedule**: Change the Cron from daily to hourly, weekly, etc.
- **Large app warnings**: Adjust LARGE_APP_WARNING.
- **Trend analysis**: Extend beyond one build (7-day, 30-day averages).
- **Storage backend**: Swap Google Sheets for CSV, a database, or S3.

## Add-ons to level up
- **Slack Notifications**: Add Slack webhook alerts with emojis & formatting.
- **Size History Charts**: Generate trend graphs with Chart.js or the Google Charts API.
- **Environment separation**: Monitor dev/staging/prod builds separately.
- **Regression detection**: Statistical anomaly checks.
- **Build metadata**: Log bundle ID, SDK versions, architectures.
- **Archive management**: Auto-clean old records to save space.
- **Dashboards**: Connect to Grafana, DataDog, or custom BI.
- **CI/CD triggers**: Integrate with pipelines via a webhook trigger.

## Common Troubleshooting
- **No size data** – check that URLs return a binary IPA (not an HTML error page).
- **Download failures** – confirm hosting permissions & direct links.
- **Missing alerts** – ensure thresholds are set and prior history exists.
- **Google Sheets errors** – check sheet/tab names & OAuth credentials.
- **Email issues** – validate SMTP credentials, spam folder, sender reputation.
- **Large file timeouts** – raise the HTTP timeout for >100MB files.
- **Trend errors** – make sure at least 2 builds exist.
- **No runs** – confirm the workflow is active and the timezone is correct.

## Need Help?
If you'd like to customize this workflow to suit your app development process, simply reach out to us here and we'll help you tailor the template to your exact use case.
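The Alert Logic step (thresholds plus the DRY_RUN toggle) can be sketched in a few lines. The variable names below come from the "Set: Configuration" node; the function name and sample sizes are hypothetical:

```javascript
// Sketch of the trend/alert evaluation, assuming the configuration
// variables documented above. Function name is illustrative.
const SIZE_INCREASE_THRESHOLD = 0.10; // +10%
const SIZE_DECREASE_THRESHOLD = 0.10; // -10%
const DRY_RUN = false;               // true = evaluate but never email

function evaluateSizeChange(currentBytes, previousBytes) {
  const changeRatio = (currentBytes - previousBytes) / previousBytes;
  const change_percent = +(changeRatio * 100).toFixed(2);

  // Alert only on significant moves in either direction.
  const breached =
    changeRatio > SIZE_INCREASE_THRESHOLD ||
    changeRatio < -SIZE_DECREASE_THRESHOLD;

  return {
    change_percent,
    trend_status:
      changeRatio > 0 ? 'increased' : changeRatio < 0 ? 'decreased' : 'unchanged',
    send_alert: breached && !DRY_RUN,
  };
}

// Example: a build that grew from 180 MB to 205 MB (~13.9% increase).
const verdict = evaluateSizeChange(205 * 1024 * 1024, 180 * 1024 * 1024);
```

The returned `change_percent` and `trend_status` map onto the notification template variables `{{change_percent}}` and `{{trend_status}}`.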
by Growth AI
## Who's it for
Marketing teams, business intelligence professionals, competitive analysts, and executives who need consistent industry monitoring with AI-powered analysis and automated team distribution via Discord.

## What it does
This intelligent workflow automatically monitors multiple industry topics, scrapes and analyzes relevant news articles using Claude AI, and delivers professionally formatted intelligence reports to your Discord channel. The system provides weekly automated monitoring cycles with personalized bot communication and comprehensive content analysis.

## How it works
The workflow follows a sophisticated 7-phase automation process:
1. **Scheduled Activation**: Triggers weekly monitoring cycles (default: Mondays at 9 AM)
2. **Query Management**: Retrieves monitoring topics from centralized Google Sheets configuration
3. **News Discovery**: Executes comprehensive Google News searches using SerpAPI for each configured topic
4. **Content Extraction**: Scrapes full article content from the top 3 sources per topic using Firecrawl
5. **AI Analysis**: Processes scraped content using Claude 4 Sonnet for intelligent synthesis and formatting
6. **Discord Optimization**: Automatically segments content to comply with Discord's 2000-character message limit
7. **Automated Delivery**: Posts formatted intelligence reports to the Discord channel with the branded "Claptrap" bot personality

## Requirements
- Google Sheets account for query management
- SerpAPI account for Google News access
- Firecrawl account for article content extraction
- Anthropic API access for Claude 4 Sonnet
- Discord bot with proper channel permissions
- Scheduled execution capability (cron-based trigger)

## How to set up

### Step 1: Configure Google Sheets query management
- **Create monitoring sheet**: Set up a Google Sheets document with a "Query" sheet
- **Add search topics**: Include industry keywords, competitor names, and relevant search terms
- **Sheet structure**: Simple column format with a "Query" header containing search terms
- **Access permissions**: Ensure n8n has read access to the Google Sheets document

### Step 2: Configure API credentials
Set up the following credentials in n8n:
- **Google Sheets OAuth2**: For accessing the query configuration sheet
- **SerpAPI**: For Google News search functionality with proper rate limits
- **Firecrawl API**: For reliable article content extraction across various websites
- **Anthropic API**: For Claude 4 Sonnet access with sufficient token limits
- **Discord Bot API**: With message posting permissions in the target channel

### Step 3: Customize scheduling settings
- **Cron expression**: Default set to "0 9 * * 1" (Mondays at 9 AM)
- **Frequency options**: Adjust for daily, weekly, or custom monitoring cycles
- **Timezone considerations**: Configure according to the team's working hours
- **Execution timing**: Ensure adequate processing time for multiple topics

### Step 4: Configure Discord integration
Set up Discord delivery settings:
- **Guild ID**: Target Discord server (currently: 919951151888236595)
- **Channel ID**: Specific monitoring channel (currently: 1334455789284364309)
- **Bot permissions**: Message posting, embed suppression capabilities
- **Brand personality**: Customize the "Claptrap" bot messaging style and tone

### Step 5: Customize content analysis
Configure AI analysis parameters:
- **Analysis depth**: Currently processes the top 3 articles per topic
- **Content format**: Structured markdown format with consistent styling
- **Language settings**: Currently configured for French output (easily customizable)
- **Quality controls**: Error handling for inaccessible articles and content

## How to customize the workflow

### Query management expansion
- **Topic categories**: Organize queries by industry, competitor, or strategic focus areas
- **Keyword optimization**: Refine search terms based on result quality and relevance
- **Dynamic queries**: Implement time-based or event-triggered query modifications
- **Multi-language support**: Add international keyword variations for global monitoring

### Advanced content processing
- **Article quantity**: Modify from 3 to more articles per topic based on analysis needs
- **Content filtering**: Add quality scoring and relevance filtering for article selection
- **Source preferences**: Implement preferred publisher lists or source quality weighting
- **Content enrichment**: Add sentiment analysis, trend identification, or competitive positioning

### Discord delivery enhancements
- **Rich formatting**: Implement Discord embeds, reactions, or interactive elements
- **Multi-channel distribution**: Route different topics to specialized Discord channels
- **Alert levels**: Add priority-based messaging for urgent industry developments
- **Archive functionality**: Create searchable message threads or database storage

### Integration expansions
- **Slack compatibility**: Replace or supplement Discord with Slack notifications
- **Email reports**: Add formatted email distribution for executive summaries
- **Database storage**: Implement persistent storage for historical analysis and trending
- **API endpoints**: Create webhook endpoints for third-party system integration

### AI analysis customization
- **Analysis templates**: Create topic-specific analysis frameworks and formatting
- **Competitive focus**: Enhance competitor mention detection and analysis depth
- **Trend identification**: Implement cross-topic trend analysis and strategic insights
- **Summary levels**: Create executive summaries alongside detailed technical analysis

## Advanced monitoring features

### Intelligent content curation
The system provides sophisticated content management:
- **Relevance scoring**: Automatic ranking of articles by topic relevance and publication authority
- **Duplicate detection**: Prevents redundant coverage of the same story across different sources
- **Content quality assessment**: Filters low-quality or promotional content automatically
- **Source diversity**: Ensures coverage from multiple perspectives and publication types

### Error handling and reliability
- **Graceful degradation**: Continues processing even if individual articles fail to scrape
- **Retry mechanisms**: Automatic retry logic for temporary API failures or network issues
- **Content fallbacks**: Uses article snippets when full content extraction fails
- **Notification continuity**: Ensures Discord delivery even with partial content processing

## Results interpretation

### Intelligence report structure
Each monitoring cycle delivers:
- **Topic-specific summaries**: Individual analysis for each configured search query
- **Source attribution**: Complete citation with publication date, source, and URL
- **Structured formatting**: Consistent presentation optimized for quick scanning
- **Professional analysis**: AI-generated insights maintaining factual accuracy and business context

### Performance analytics
Monitor system effectiveness through:
- **Processing metrics**: Track successful article extraction and analysis rates
- **Content quality**: Assess relevance and usefulness of delivered intelligence
- **Team engagement**: Monitor Discord channel activity and report utilization
- **System reliability**: Track execution success rates and error patterns

## Use cases

### Competitive intelligence
- **Market monitoring**: Track competitor announcements, product launches, and strategic moves
- **Industry trends**: Identify emerging technologies, regulatory changes, and market shifts
- **Partnership tracking**: Monitor alliance formations, acquisitions, and strategic partnerships
- **Leadership changes**: Track executive movements and organizational restructuring

### Strategic planning support
- **Market research**: Continuous intelligence gathering for strategic decision-making
- **Risk assessment**: Early warning system for industry disruptions and regulatory changes
- **Opportunity identification**: Spot emerging markets, technologies, and business opportunities
- **Brand monitoring**: Track industry perception and competitive positioning

### Team collaboration enhancement
- **Knowledge sharing**: Centralized distribution of relevant industry intelligence
- **Discussion facilitation**: Provide a common information baseline for strategic discussions
- **Decision support**: Deliver timely intelligence for business planning and strategy sessions
- **Competitive awareness**: Keep teams informed about competitive landscape changes

## Workflow limitations
- **Language dependency**: Currently optimized for French analysis output (easily customizable)
- **Processing capacity**: Limited to 3 articles per query (configurable based on API limits)
- **Platform specificity**: Configured for Discord delivery (adaptable to other platforms)
- **Scheduling constraints**: Fixed weekly schedule (customizable via cron expressions)
- **Content access**: Dependent on article accessibility and website compatibility with Firecrawl
- **API dependencies**: Requires active subscriptions and proper rate limit management for all integrated services
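The Discord Optimization phase (staying under the 2000-character message limit) is a simple splitting routine. A minimal sketch, assuming the workflow splits on newline boundaries; the function name and exact strategy are guesses at what the Code node does:

```javascript
// Hypothetical sketch of the Discord segmentation step: split a long
// report on newlines so each message stays within the 2000-char limit.
function splitForDiscord(report, limit = 2000) {
  const messages = [];
  let current = '';
  for (const line of report.split('\n')) {
    const candidate = current === '' ? line : `${current}\n${line}`;
    if (candidate.length > limit && current !== '') {
      // Flush the accumulated message and start a new one.
      messages.push(current);
      current = line;
    } else {
      current = candidate;
    }
  }
  if (current !== '') messages.push(current);
  return messages; // NOTE: a single line longer than `limit` is not split further here
}

// Example: two 1500-char paragraphs cannot share one 2000-char message.
const chunks = splitForDiscord('A'.repeat(1500) + '\n' + 'B'.repeat(1500));
```

A production version would also hard-split any single line exceeding the limit and avoid breaking inside markdown code fences.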
by Sona Labs
# Generate Sora videos, stitch clips, and post to Twitter

Generate creative ASMR cutting video concepts with GPT-5.1, create high-quality video clips using Sora v2, stitch them together with Cloudinary, and automatically post to Twitter/X, transforming ideas into viral content without manual video editing.

## How it works

### Step 1: Generate Video Concepts
- Schedule Trigger activates the workflow automatically
- GPT-5.1 AI agent generates 3 unique ASMR cutting scene prompts with unusual objects
- Creates structured video prompts optimized for Sora v2 (frontal camera angle, cutting actions)
- Generates Twitter-ready captions with relevant hashtags
- Saves all concepts and scripts to Google Sheets for tracking

### Step 2: Create Video Clips with Sora v2
- Generates 3 separate Sora v2 video clips in parallel (8-12 seconds each)
- Each clip uses unique prompts from GPT-5.1 output
- Videos render at 720x1280 resolution (vertical format for social media)
- System waits 30 seconds for rendering to complete

### Step 3: Monitor & Download Videos
- Loops through all 3 video generation requests
- Checks Sora API status every 30 seconds until rendering completes
- Automatically skips failed renders (continues the workflow with successful videos)
- Downloads completed videos from the Sora API
- Uploads each clip to Cloudinary for storage and processing

### Step 4: Stitch Videos Together
- Collects all uploaded Cloudinary video IDs
- Builds a Cloudinary transformation URL to stitch the 3 clips into one seamless video
- Applies Twitter-compatible encoding (H.264 baseline, AAC audio, MP4 format)
- Downloads the final stitched video

### Step 5: Upload to Twitter/X
- Prepares video file data and calculates total file size
- Uses Twitter's chunked upload API (INIT → APPEND → FINALIZE)
- Waits for Twitter's video processing to complete
- Checks processing status until the video is ready
- Posts the tweet with the AI-generated caption and attached video
- Updates Google Sheets status to "Posted"

## What you'll get
- **AI-Generated Concepts**: Creative ASMR cutting ideas with unusual objects (glass avocados, lava rocks, rainbow soap)
- **Professional Video Clips**: Three 8-12 second Sora v2 videos per concept at 720x1280 resolution
- **Seamless Stitching**: A single combined video optimized for Twitter/X specifications
- **Engaging Captions**: GPT-5.1 generated tweets with hashtags designed for virality
- **Automated Posting**: Direct upload to Twitter/X without manual intervention
- **Cloud Backup**: All videos stored in Cloudinary with metadata
- **Progress Tracking**: Google Sheets integration shows workflow status (In Progress → Posted)
- **Error Handling**: Failed Sora renders are automatically skipped

## Why use this
- **Save 4+ hours per video**: Eliminate scripting, shooting, editing, and posting time
- **Consistent posting schedule**: Set it and forget it with the Schedule Trigger
- **Scale content creation**: Generate multiple video variations in 20-30 minutes
- **Professional quality**: Leverage Sora v2's AI video generation for realistic cutting scenes
- **Optimize for virality**: GPT-5.1 creates concepts and captions designed for engagement
- **Reduce creative burnout**: AI handles ideation, execution, and distribution
- **No video editing skills needed**: Complete automation from concept to post
- **Test multiple concepts**: Generate 3 variations per run to see what resonates

## Setup instructions

### Required accounts and credentials
1. **OpenAI API Key** (GPT-5.1 and Sora v2 access required)
   - Sign up at https://platform.openai.com
   - Ensure your account has Sora v2 API access enabled
   - Generate an API key from the API Keys section
   - Note: Sora v2 is currently in limited beta
2. **Google Sheets OAuth** (for tracking video ideas and status)
   - Free Google account required
   - Create a spreadsheet with columns: Category, Scene 1, Scene 2, Scene 3, Status
   - n8n will request OAuth permissions during setup
3. **Cloudinary Account** (for video storage and stitching)
   - Sign up at https://cloudinary.com (free tier available)
   - Note your cloud name from the dashboard
   - Create an upload preset named n8n_integration
   - Enable unsigned uploads for the preset
4. **Twitter OAuth 1.0a Credentials** (for automated posting)
   - Apply for Twitter Developer access at https://developer.twitter.com
   - Create a new app in the Developer Portal
   - Generate: API Key, API Secret, Access Token, Access Token Secret
   - Enable "Read and Write" permissions (not just Read)
   - OAuth 1.0a is required for media uploads (OAuth 2.0 won't work)

### Configuration steps
1. **Update OpenAI API Key**: Add your OpenAI API key to these nodes:
   - "OpenAI Chat Model" credentials
   - "Create Sora Video Scene - 1" (Authorization header)
   - "Create Sora Video Scene - 2" (Authorization header)
   - "Create Sora Video Scene - 3" (Authorization header)
   - "Check Video Status" (Authorization header)
   - "Download Completed Video" (Authorization header)
   - Replace `Bearer API KEY` with `Bearer YOUR_ACTUAL_API_KEY`
2. **Configure Google Sheets**:
   - Open the "Save Category and Clip Scripts" and "Update Status" nodes
   - Authenticate with your Google account (OAuth 2.0)
   - Select your spreadsheet and sheet name
   - Ensure columns match: Category, Scene 1, Scene 2, Scene 3, Status
   - The workflow will update Status from "In Progress" to "Posted"
3. **Update Cloudinary Settings**:
   - In the "Upload to Cloudinary" node: replace `{Cloud name here}` in the URL with your Cloudinary cloud name and verify the upload preset is set to n8n_integration
   - In the "Build Stitch URL" node: open the Code node and replace `dph9n4uei` on line 1 with your cloud name; this builds the video stitching transformation URL
4. **Add Twitter OAuth 1.0a Credentials**: Configure OAuth 1.0a in these nodes:
   - "Twitter Upload - INIT"
   - "Twitter Upload - APPEND"
   - "Finalize Upload"
   - "Check Twitter Processing Status"
   - "Post a Tweet"
   - Use the same OAuth 1.0a credential for all nodes and ensure your Twitter app has "Read and Write" permissions
5. **Adjust Schedule Trigger** (optional):
   - Default: runs on every interval
   - Modify the "Schedule Trigger" node to set specific times
   - Recommended: once per day or every few hours to avoid rate limits
6. **Test the workflow**:
   - Click "Execute Workflow" to test manually first
   - Verify GPT-5.1 generates 3 video concepts
   - Check that Sora v2 creates all 3 videos
   - Confirm Cloudinary stitches the videos correctly
   - Ensure the Twitter post appears with video and caption

## Important notes
- **Sora API Rate Limits**: Sora v2 may have rendering quotas. Monitor your usage
- **Video Rendering Time**: Each Sora clip takes 2-5 minutes. Total workflow: 15-25 minutes
- **Failed Videos**: The workflow automatically skips failed renders and continues
- **Twitter Video Limits**: Maximum 512MB per video, MP4 format required
- **Cloudinary Free Tier**: 25 credits/month includes video transformations
- **Cost Estimate**: ~$1-3 per run (Sora API pricing varies)

## Troubleshooting
- **"Sora API access required"**: Contact OpenAI to enable the Sora v2 API on your account
- **Twitter upload fails**: Verify OAuth 1.0a credentials have "Read and Write" permissions
- **Cloudinary upload fails**: Check the cloud name and ensure the upload preset exists
- **Videos don't stitch**: Verify all 3 videos uploaded successfully to Cloudinary
- **Google Sheets not updating**: Confirm OAuth permissions and that sheet column names match

## Next steps
- Enable the Schedule Trigger to automate daily/weekly posts
- Monitor Google Sheets to track posted content
- Adjust GPT-5.1 prompts in "ASMR Cutting Ideas" for different content themes
- Experiment with different video durations (8 vs 12 seconds)
- Add error notifications using Email or Slack nodes
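Step 5's INIT → APPEND → FINALIZE sequence works because APPEND sends the video in fixed-size segments. A sketch of the chunking math, assuming a 5 MB chunk size (an illustrative value, not necessarily what the workflow or Twitter's current limits use):

```javascript
// Sketch of the segment planning behind Twitter's chunked media upload.
// CHUNK_SIZE is an assumption for illustration; consult Twitter's media
// upload documentation for the actual per-APPEND limit.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB per APPEND call (assumed)

function planUploadChunks(totalBytes) {
  const segments = [];
  for (let offset = 0, index = 0; offset < totalBytes; offset += CHUNK_SIZE, index++) {
    segments.push({
      segment_index: index,                          // passed to each APPEND call
      start: offset,                                 // inclusive byte offset
      end: Math.min(offset + CHUNK_SIZE, totalBytes), // exclusive byte offset
    });
  }
  return segments;
}

// Example: a 12 MB stitched video needs three APPEND calls.
const plan = planUploadChunks(12 * 1024 * 1024);
```

INIT declares `totalBytes` up front, one APPEND is issued per planned segment, and FINALIZE tells Twitter all segments have arrived so processing can begin.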
by Trung Tran
Cloudflare Incident Monitoring & Escalation Workflow

**Try Decodo: Web Scraping & Data API (Coupon: TRUNG)**
Decodo is a powerful public data access platform offering managed web scraping APIs and proxy infrastructure to collect structured web data at scale. It handles proxies, anti-bot protection, JavaScript rendering, retries, and global IP rotation, so you can focus on data, not scraping complexity.

**Why Decodo**
- Managed Web Scraping API with anti-bot bypass & high success rates
- Works with JS-heavy sites; outputs JSON/HTML/CSV
- Easy integration (Python, Node.js, cURL) for eCommerce, SERP, social & general web data

**Special Discount**: Use coupon TRUNG to get the Advanced Scraping API plan: 23,000 requests for $5.

**Who this workflow is for**
For DevOps, SRE, IT Ops, and Platform teams running production traffic behind Cloudflare who need reliable incident awareness without alert fatigue. Use it if you want:
- Continuous Cloudflare incident monitoring
- Clear severity-based routing
- Automatic escalation into JIRA
- Clean Slack & Telegram notifications
- Deduplicated, noise-controlled alerts

**What this workflow does**
This workflow polls the Cloudflare Status API, detects unresolved incidents, scores their impact, and routes them to the right channels. High-impact incidents are escalated to JIRA. Lower-impact updates are notified (or skipped) to reduce noise.

**How it works (high level)**
1. Runs on a fixed schedule (e.g. every 5 minutes)
2. Fetches current Cloudflare incidents
3. Stops early if no active issues exist
4. Normalizes and scores incidents (severity, impact, affected service)
5. Deduplicates previously-alerted incidents
6. Builds human-readable notification payloads
7. Routes by impact: high impact creates a JIRA incident and notifies; low impact notifies or is suppressed
8. Sends alerts to Slack and Telegram

**Requirements**
- Decodo Scraper API credential
- n8n (self-hosted or Cloud)
- Cloudflare Status API (public)
- Slack bot (chat:write)
- Telegram bot + chat ID
- JIRA project with issue-create permission
- Optional LLM credentials (summarization/classification)

**Notes**
- All secrets are stored in n8n Credentials
- Workflow is idempotent and safe to rerun
- No assumptions about root cause or remediation
- Built for production-grade incident visibility with n8n.
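The score-and-route steps above amount to a small amount of logic. The sketch below assumes the incident shape returned by the public Statuspage API that backs cloudflarestatus.com (each incident has an `id` and an `impact` of `none`/`minor`/`major`/`critical`); the numeric weights and the high/low threshold are illustrative choices, not values from the workflow:

```python
# Unresolved incidents come from the public Statuspage endpoint:
# https://www.cloudflarestatus.com/api/v2/incidents/unresolved.json
IMPACT_SCORE = {"none": 0, "minor": 1, "major": 2, "critical": 3}  # assumed weights

def route_incidents(incidents, seen_ids, high_threshold=2):
    """Score unresolved incidents and split them into escalate/notify lists,
    skipping any incident id that was already alerted (deduplication)."""
    escalate, notify = [], []
    for inc in incidents:
        if inc["id"] in seen_ids:
            continue  # already alerted once; suppress duplicate noise
        seen_ids.add(inc["id"])
        score = IMPACT_SCORE.get(inc.get("impact", "none"), 0)
        (escalate if score >= high_threshold else notify).append(inc)
    return escalate, notify
```

In the workflow, `escalate` would feed the JIRA-create branch and `notify` the Slack/Telegram-only branch.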
by Mark Shcherbakov
Video Guide

I prepared a detailed guide that shows the whole process of integrating the Binance API and storing data in Airtable to manage funding statements associated with tokens in a wallet.

Youtube Link

**Who is this for?**
This workflow is ideal for developers, financial analysts, and cryptocurrency enthusiasts who want to automate the process of managing funding statements and token prices. It's particularly useful for those who need a systematic approach to track and report funding fees associated with tokens in their wallets.

**What problem does this workflow solve?**
Managing funding statements and token prices across multiple platforms can be cumbersome and error-prone. This workflow automates the process, allowing users to seamlessly fetch funding fees from Binance and record them alongside token prices in Airtable, minimizing manual data entry and potential discrepancies.

**What this workflow does**
This workflow integrates the Binance API with an Airtable database, facilitating the storage and management of funding statements linked to tokens in a wallet. The agent can:
- Fetch funding fees and current positions from Binance.
- Aggregate data to create structured funding statements.
- Insert records into Airtable, ensuring proper linkage between funding data and tokens.

1. **API Authentication**: The workflow establishes authentication with the Binance API using a Crypto Node to handle API keys and signatures, ensuring secure and verified requests.
2. **Data Collection**: It retrieves necessary data, including funding fees and current positions, with properly formatted API requests to ensure seamless communication with Binance.
3. **Airtable Integration**: The workflow inserts aggregated funding statements and token data into the corresponding Airtable records, managing token existence checks to avoid duplicate entries.

**Setup**
1. **Set Up Airtable Database**: Create an Airtable base with tables for Funding Statements and Tokens.
2. **Generate Binance API Key**: Log in and create an API key with appropriate permissions.
3. **Set Up Authentication in n8n**: Utilize a Crypto Node for Binance API authentication.
4. **Configure API Request to Binance**: Set the request method and headers for communication with the Binance API.
5. **Fetch Funding Fees and Current Positions**: Retrieve funding data and current positions efficiently.
6. **Aggregate and Create Statements**: Aggregate data to create detailed funding statements.
7. **Insert Data into Airtable**: Input the structured data into Airtable and manage token records.
8. **Use the Get Price Node**: Implement a Get Price Node to maintain current token price tracking without additional setup.
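The signature the Crypto Node produces follows Binance's signed-endpoint scheme: an HMAC-SHA256 of the urlencoded query string, hex-encoded and appended as a `signature` parameter. A minimal Python equivalent (the secret and the `FUNDING_FEE` parameters are placeholders for illustration):

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

def sign_binance_request(params: dict, api_secret: str) -> str:
    """Return the query string with Binance's HMAC-SHA256 signature appended.

    Binance signs the exact urlencoded query string with the API secret
    and expects the hex digest in a trailing `signature` parameter.
    """
    query = urlencode(params)
    sig = hmac.new(api_secret.encode(), query.encode(), hashlib.sha256).hexdigest()
    return f"{query}&signature={sig}"

# Illustrative parameters for a funding-fee request (dummy secret):
qs = sign_binance_request(
    {"incomeType": "FUNDING_FEE", "timestamp": int(time.time() * 1000)},
    "my-dummy-secret",
)
```

The signed query string is then sent along with the `X-MBX-APIKEY` header carrying the API key.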
by Cheng Siong Chin
**How It Works**
This workflow automates end-to-end contract and invoice management using AI intelligence. It processes proposals through intelligent contract generation, approval workflows, and automated invoicing. OpenAI analyzes proposal content while the system routes approvals intelligently. Upon approval, contracts are generated, invoices created, and notifications sent. The workflow also monitors payment status, generates financial forecasts, and manages follow-up tasks, eliminating manual contract generation delays and approval bottlenecks while ensuring accurate financial record-keeping.

**Setup Steps**
1. Configure OpenAI API credentials in the n8n credentials manager.
2. Connect a Google Sheets account for invoice and forecast storage.
3. Set up Gmail for approval notifications and client communications.
4. Input Stripe/payment processor credentials for payment tracking.
5. Map proposal form inputs to the workflow start node.

**Prerequisites**: OpenAI API key, Google Sheets account, Gmail account, Stripe/payment processor access
**Use Cases**: Multi-stage approval workflows, SaaS contract management, professional services invoicing
**Customization**: Modify approval logic in conditional nodes; replace OpenAI with the Anthropic API
**Benefits**: Reduces contract processing time by 80%, eliminates approval delays, prevents invoicing errors
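The approval routing customized in the conditional nodes can be pictured as a simple tiered check. The thresholds and tier names below are invented for illustration; the template's actual routing logic lives in its conditional nodes:

```python
def route_approval(contract_value: float) -> str:
    """Pick an approval tier for a proposal (illustrative thresholds only)."""
    if contract_value >= 50_000:
        return "executive-approval"   # high-value deals need senior sign-off
    if contract_value >= 10_000:
        return "manager-approval"     # mid-range deals go to a manager
    return "auto-approve"             # small deals skip manual review
```

Adjusting such thresholds is the kind of change the "Customization" note refers to.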
by David
**Who might benefit from this workflow?**
Do you have to record your working hours yourself? Then this n8n workflow in combination with an iOS shortcut will definitely help you. Once set up, you can use a shortcut, which can be stored as an app icon on your home screen, to record the start, end and duration of your break.

**How it works**
Once set up, you can tap the iOS shortcut on your iPhone. You will see a menu containing three options: "Track Start", "Track Break" and "Track End". After time is tracked, iOS will display a notification about the successful operation.

**How to set it up**
1. Copy the Notion database to your Notion workspace (top right corner).
2. Copy the n8n workflow to your n8n workspace.
3. In the Notion nodes in the n8n workflow, add your Notion credentials and select the copied Notion database.
4. Download the iOS Shortcut from our documentation page.
5. Edit the shortcut and paste the URL of your n8n Webhook trigger node into the first "Text" node of the iOS shortcut flow.

It is a best practice to use authentication. You can do so by adding "Header" auth to the webhook node and to the shortcut.

You need help implementing this or any other n8n workflow? Feel free to contact me via LinkedIn or my business website. You want to start using n8n? Use this link to register for n8n (this is an affiliate link).
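The header-auth check recommended above boils down to comparing a shared secret sent by the shortcut against the value configured in the webhook node. A minimal sketch, assuming a header named `X-Auth-Token` (n8n's Header Auth credential lets you choose any name/value pair):

```python
import hmac

# Assumed shared secret; the same value would be set in the iOS shortcut
# and in the n8n webhook's Header Auth credential.
EXPECTED_TOKEN = "replace-with-a-long-random-secret"

def is_authorized(headers: dict) -> bool:
    """Constant-time comparison of the token the shortcut sends."""
    supplied = headers.get("X-Auth-Token", "")
    return hmac.compare_digest(supplied, EXPECTED_TOKEN)
```

`hmac.compare_digest` avoids leaking information through comparison timing, which a plain `==` would.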
by achiya
Find and share AliExpress affiliate products through Telegram

Build a Telegram bot that helps users find AliExpress products using natural language requests. The bot uses OpenAI to optimize search queries, Decodo to scrape product listings, and AI analysis to select the best options based on ratings, reviews, and price, then automatically generates affiliate tracking links for each recommendation.

**What it does**
When users send "Find me wireless keyboard":
1. Bot checks the user is a member of your Telegram channel (optional)
2. Validates the command starts with an accepted phrase
3. OpenAI generates an optimized English search query
4. Decodo scrapes products from AliExpress
5. AI analyzes the top 10 products and selects the best 2 based on reviews, ratings, and price
6. The AliExpress Affiliate API creates tracking links
7. Bot sends formatted recommendations with images, prices, ratings, and links

**Who this is for**
- Affiliate marketers monetizing Telegram channels
- E-commerce entrepreneurs automating recommendations
- Channel owners adding value while earning commissions
- Anyone building AliExpress affiliate systems

**Setup requirements**

Credentials needed:
- **Telegram Bot API**: Create a bot via @BotFather, add the token to n8n, and make the bot admin in your channel
- **AliExpress Affiliate API**: Sign up for the affiliate program; get App Key, App Secret, and Tracking ID; add them to n8n
- **OpenAI API**: Get an API key and add it to n8n (used for search and analysis)

Configuration required before activation:
- **Channel username**: Replace @YOUR_CHANNEL in 2 nodes: Check Channel Membership, Verify Channel Member
- **Tracking ID**: Set YOUR_AFFILIATE_TRACKING_ID in: Generate Affiliate Links, Create Affiliate Link
- **Channel URL**: Update the button in Request Channel Join
- **Bot admin**: Make the bot admin in your channel

**How to use**

User commands start with: Find me [product], Search for [product], Look for [product], Get me [product], Send me [product], Show me [product]. Examples: Find me wireless mouse; Search for phone case; Look for bluetooth speaker.

Bot responses:
- Non-member: asks to join the channel
- Invalid format: shows usage examples
- Valid request: sends a "searching..." status, processes with AI, and returns 2 recommendations, each with image, title, price, rating, orders, and link; a "More Results" button is available

**Customization options**
- Product count: edit the "Select Top 2 Products" node
- Selection criteria: modify the AI prompts in "AI Product Search"
- Commands: add/remove in "Validate Command Format"
- Channel gate: delete the verification nodes to remove it
- Language: translate the Telegram message nodes
- AI model: switch to GPT-3.5-turbo for lower costs

**Technical details**

Workflow components: Telegram webhook entry, channel-membership verification, command-format validation, processing (AI query, Decodo scrape, AI analysis), and output (affiliate links, message formatting, send).

APIs used:
- Telegram Bot API - user interaction
- OpenAI API - search optimization, product analysis
- Decodo - AliExpress scraping
- AliExpress Affiliate API - link generation

Error handling: invalid commands receive a usage guide; non-members receive a join request; no results yields an error message; spam is auto-removed.

**Best practices**
- Cost management: OpenAI costs $0.01-0.05 per search; cache popular searches; use GPT-3.5 for lower costs
- Security: store credentials in n8n, rotate API keys regularly, monitor activity
- Performance: use webhook mode, set up error notifications, implement rate limiting

**Troubleshooting**
- Bot not responding: verify the workflow is activated, check credentials are valid, review error logs
- Channel verification fails: confirm the bot is admin, check the @username is correct, ensure the user joined
- No products found: validate credentials, check the tracking ID, try different terms
- Links broken: confirm the account is active, verify the tracking ID, check permissions

**Version**
Version: 1.0 | Updated: January 2026 | Compatible: n8n v1.0+ | Setup: 10-15 minutes
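The command-format validation described above reduces to a case-insensitive prefix check. A minimal sketch (the phrase list mirrors the accepted commands; the node's actual implementation may differ):

```python
ACCEPTED_PREFIXES = (
    "find me", "search for", "look for", "get me", "send me", "show me",
)

def parse_command(message: str):
    """Return the product query if the message starts with an accepted
    phrase (case-insensitive), otherwise None."""
    text = message.strip().lower()
    for prefix in ACCEPTED_PREFIXES:
        if text.startswith(prefix + " "):
            return text[len(prefix):].strip()
    return None
```

A `None` result would route the user to the usage-examples response; a string result becomes the search query passed on to OpenAI.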
by Dustin
Short and simple: this workflow will sync (add and delete) your Liked Songs to a custom playlist that can be shared.

**Setup:**
1. Create an app on the Spotify Developer Dashboard.
2. Create Spotify credentials: just click on one of the Spotify nodes in the workflow, click "create new credentials" and follow the guide.
3. Create the Spotify playlist that you want to sync to.
4. Copy the exact name of your playlist, go into the "Edit set Vars" node and replace the value "CHANGE MEEEE" with your playlist name.
5. Set your Spotify credentials on every Spotify node. (These should be marked with yellow and red notes.)
6. Do you use Gotify?
   - No: delete the Gotify nodes (all the way to the right end of the workflow).
   - Yes: customize the Gotify nodes to your needs.
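At its core, the add-and-delete sync is a set difference between your Liked Songs and the target playlist. A minimal sketch of that diff over track IDs (how the workflow itself computes it inside n8n may differ):

```python
def diff_tracks(liked_ids, playlist_ids):
    """Return (to_add, to_remove) so the playlist ends up mirroring
    the Liked Songs collection."""
    liked, playlist = set(liked_ids), set(playlist_ids)
    to_add = sorted(liked - playlist)      # liked but not yet in the playlist
    to_remove = sorted(playlist - liked)   # in the playlist but no longer liked
    return to_add, to_remove
```

The two resulting lists map onto Spotify's add-items and remove-items playlist operations.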
by Cheng Siong Chin
**How It Works**
This workflow automates environmental, social, and governance (ESG) data collection, compliance validation, and sustainability reporting for corporations managing complex regulatory requirements and stakeholder transparency expectations. Designed for sustainability officers, compliance teams, and investor relations departments, it solves the challenge of aggregating ESG metrics across global operations, validating data accuracy, and generating standardized reports for multiple frameworks.

The system schedules regular monitoring, fetches consolidated ESG data from operational systems, and generates S&D (sustainability and disclosure) submissions. Compliance is validated by two AI agents: the Compliance Analyzer ensures regulatory adherence, while Decision Coordination orchestrates specialized sub-agents for aggregate analysis, traceability monitoring, summary generation, and governance reporting. The workflow then checks star ratings for data quality, routes findings by compliance status (critical/routine), and produces standardized reports with traceability records. Organizations achieve a 90% reduction in reporting cycle time, ensure multi-framework compliance, eliminate manual data aggregation errors, and maintain complete audit trails for regulatory scrutiny.
**Setup Steps**
1. Connect the Schedule Trigger for monitoring frequency
2. Configure ESG data sources with API credentials
3. Add AI model API keys to the Compliance Analyzer and Decision Coordination Agent nodes
4. Define reporting frameworks and compliance requirements in the agent prompts
5. Set quality rating thresholds for data completeness and materiality scoring parameters
6. Configure alert mechanisms for critical compliance gaps requiring immediate remediation

**Prerequisites**: ESG data management system access, AI service accounts
**Use Cases**: Carbon emissions tracking and reporting, supply chain sustainability monitoring
**Customization**: Modify agent prompts for industry-specific materiality topics
**Benefits**: Reduces reporting cycle time by 90%, ensures multi-framework compliance simultaneously
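The critical/routine routing of findings can be sketched as a threshold check. The field names, star-rating cutoff, and rule below are illustrative assumptions, not the workflow's actual criteria:

```python
def route_finding(finding: dict) -> str:
    """Classify an ESG compliance finding as 'critical' or 'routine'.

    Illustrative rule: any regulatory gap, or a data-quality star rating
    below 3, is escalated for immediate remediation.
    """
    if finding.get("regulatory_gap") or finding.get("star_rating", 5) < 3:
        return "critical"
    return "routine"
```

In the workflow, "critical" findings would trigger the alert mechanisms configured in setup step 6, while "routine" ones flow into the standard report.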