by WeblineIndia
## Sentiment Analysis of Product Reviews using Google Sheets & OpenAI

### Quick Implementation Steps

Automated customer feedback analyzer:

- **Trigger**: Google Sheets triggers on new product review rows.
- **Sentiment Analysis**: Review text is sent to OpenAI.
- **Writeback**: The resulting sentiment (Positive, Neutral, Negative) is written back to the sheet.

Just connect your credentials and sheet, and you're ready to go!

### What It Does

This workflow automatically analyzes user-submitted product reviews and classifies them by sentiment using OpenAI's powerful language models. It eliminates the need to manually sift through feedback by tagging each review with a sentiment label. The result is then written back to the Google Sheet next to the original review, giving you a fast, clear snapshot of overall customer perception, satisfaction, and pain points. Whether you're monitoring 10 or 10,000 reviews, this process scales effortlessly and updates every minute.

### Who's It For

This workflow is designed for:

- **E-commerce teams** collecting user reviews
- **Product teams** monitoring customer feedback
- **Marketing teams** identifying promotable reviews
- **Support teams** watching for negative experiences
- **SaaS platforms**, apps, and survey tools managing structured text feedback

### Requirements

You'll need:

- A Google Sheet with two columns: Review and Sentiment
- Google Sheets OAuth2 credentials in n8n
- An OpenAI API key (for GPT-4o-mini or GPT-3.5)
- An n8n instance with the LangChain and OpenAI nodes enabled

### How It Works

1. **Google Sheets Trigger**: Watches for new rows every minute.
2. **OpenAI Integration**: Uses LangChain's Sentiment Analysis node, passing the review text into GPT-4o-mini via the OpenAI Chat Model node.
3. **Sheet Update**: The sentiment result (Positive, Negative, or Neutral) is written into the Sentiment column of the same row.

Sticky notes are included for better visual understanding inside the workflow editor.

### Steps to Configure and Use

#### 1. Prepare Your Google Sheet

Make sure your sheet is named Sheet1 with the following structure:

| Review               | Sentiment |
|----------------------|-----------|
| Absolutely love it!  |           |
| Not worth the price. |           |

#### 2. Set Up Credentials

- **Google Sheets**: OAuth2 credentials
- **OpenAI**: API key added via the OpenAI API credential in n8n

#### 3. Import & Activate Workflow

1. Import the workflow JSON into your n8n instance.
2. Assign the proper credentials to the trigger and OpenAI nodes.
3. Activate the workflow.

### How To Customize

- **Alerting**: Add Slack/Email nodes for negative-sentiment alerts
- **Triggering**: Change the polling interval to real-time triggers (e.g., a webhook)
- **Extended Sentiment**: Modify the sentiment categories (e.g., "Mixed", "Sarcastic")
- **Summary Report**: Add Cron + Aggregation nodes for daily/weekly summaries
- **Prompt Tuning**: Adjust the system prompt for deeper or context-based sentiment evaluation

### Add-ons (Optional Features)

- Email digest of negative reviews
- Google Drive logging
- Team notification via Slack
- Summary to Notion, Airtable, or Google Docs

### Use Case Examples

- **Online Stores**: Auto-tag reviews for reputation monitoring
- **Product Teams**: See which feature releases generate positive or negative buzz
- **CX Dashboards**: Feed real-time sentiment to internal BI tools
- **Marketing**: Extract glowing reviews for social proof
- **Support**: Triage issues by flagging critical comments instantly

...and many more applications wherever text feedback is collected.
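If you extend the sentiment categories, it helps to normalize whatever free-text reply the model returns into the fixed labels your Sentiment column expects. The sketch below shows one way to do that in an n8n Code node; the `ALLOWED` list and the `"Neutral"` fallback are assumptions you can adjust (e.g., add `"Mixed"`).

```javascript
// Hypothetical normalizer for the "Extended Sentiment" customization:
// coerce the model's free-text reply into one of the fixed sheet labels.
// The category list and fallback are assumptions - adjust to your sheet.
const ALLOWED = ["Positive", "Neutral", "Negative"];

function normalizeSentiment(raw) {
  const text = String(raw || "").trim().toLowerCase();
  // Accept replies like "Sentiment: positive." by matching the first
  // allowed label mentioned anywhere in the text.
  for (const label of ALLOWED) {
    if (text.includes(label.toLowerCase())) return label;
  }
  return "Neutral"; // fallback when the model answers off-script
}

// Example replies as they might come back from the Chat Model node:
console.log(normalizeSentiment("Positive"));
console.log(normalizeSentiment("The review is negative."));
console.log(normalizeSentiment("hard to say"));
```

In a real Code node you would map this over `$input.all()` and write the normalized label into the Sentiment field before the Sheet Update step.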
### Troubleshooting Guide

| Issue                  | Possible Cause                             | Suggested Fix                                            |
|------------------------|--------------------------------------------|----------------------------------------------------------|
| Sentiment not updating | Sheet credentials missing or misconfigured | Reconnect Google Sheets OAuth2                           |
| Blank sentiment        | Review column empty or misaligned          | Ensure the proper column header and value are present    |
| OpenAI errors          | Invalid or expired API key                 | Regenerate the API key from OpenAI and re-authenticate   |
| Workflow doesn't run   | Polling settings incorrect                 | Confirm the interval and document ID in the trigger node |

### Need Help?

If you need assistance with:

- Setting up this workflow
- Customizing prompts or output
- Automating your full review pipeline

Contact us today at WeblineIndia. We will be happy to assist.
by Rahul Joshi
### Description

Discover which marketing channels actually convert with this n8n automation template. The workflow fetches all opportunities from HighLevel (GHL), filters for "Closed Won" deals, computes lead-to-sale conversion metrics per source, and sends a summary report to Slack while logging raw data into Google Sheets for ongoing analysis.

Perfect for marketing teams, growth analysts, and sales managers who want to reduce wasted ad spend and double down on sources that deliver real ROI.

### What This Template Does (Step-by-Step)

1. **Manual or Scheduled Trigger**: Run the workflow manually for instant analysis or automate it daily/weekly with a schedule trigger.
2. **Fetch All Opportunities from HighLevel**: Pulls every deal record from your GHL CRM, including status, amount, and lead source fields.
3. **Filter for Closed-Won Deals**: Separates deals by outcome; only "Won" deals are used for conversion tracking, while others trigger Slack alerts for team review.
4. **Log Won Deals to Google Sheets**: Saves every successful deal's details into a structured Google Sheet for long-term performance tracking.
5. **Calculate Lead Source Metrics**: Aggregates results by lead source, automatically calculating total deals, conversion rate, and total revenue per source.
6. **Send Slack Summary Report**: Posts a neat summary of conversion metrics to a dedicated Slack channel such as #lead-source-report, ensuring visibility for the marketing and sales teams.
7. **Alert for Lost/Pending Deals**: Non-won opportunities are flagged and shared with the team via Slack for timely follow-ups.
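The "Calculate Lead Source Metrics" step can be sketched as a small Code-node aggregation. The field names here (`source`, `status`, `monetaryValue`) are assumptions based on typical GHL opportunity payloads; map them to whatever the fetch node actually returns.

```javascript
// Sketch of per-source conversion metrics: total deals, won deals,
// revenue, and conversion rate. Field names are assumptions about the
// GHL opportunity payload - adjust to your actual data.
function summarizeBySource(opportunities) {
  const bySource = {};
  for (const opp of opportunities) {
    const source = opp.source || "unknown";
    let s = bySource[source];
    if (!s) s = bySource[source] = { totalDeals: 0, wonDeals: 0, revenue: 0 };
    s.totalDeals += 1;
    if (opp.status === "won") {
      s.wonDeals += 1;
      s.revenue += opp.monetaryValue || 0;
    }
  }
  // Conversion rate = won deals / total deals per source, as a percentage.
  for (const s of Object.values(bySource)) {
    s.conversionRate = s.totalDeals
      ? +((s.wonDeals / s.totalDeals) * 100).toFixed(1)
      : 0;
  }
  return bySource;
}

const sample = [
  { source: "google-ads", status: "won", monetaryValue: 500 },
  { source: "google-ads", status: "open" },
  { source: "referral", status: "won", monetaryValue: 1200 },
];
console.log(summarizeBySource(sample));
```

The resulting object per source can be formatted into the Slack summary message or appended to the Google Sheet.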
### Key Features

- Automated lead source performance tracking
- Slack alerts for both success and loss updates
- Real-time conversion and ROI visibility
- Seamless GHL + Google Sheets + Slack integration
- Ready to run on demand or on a schedule

### Use Cases

- Measure campaign ROI across channels
- Identify top-performing ad platforms
- Send weekly sales-source reports to marketing
- Optimize budget allocation using data-driven insights

### Required Integrations

- **HighLevel (GHL)**: opportunity data retrieval
- **Google Sheets**: storing and visualizing deal data
- **Slack**: team notifications and reports

### Why Use This Template?

- Saves hours of manual reporting work
- Ensures consistent performance tracking
- Highlights winning and underperforming sources
- Helps marketing teams focus on what truly converts
by Omer Fayyaz
This n8n template implements a Calendly Booking & Cancellation Automation Hub that automatically processes Calendly webhook events, logs data to Google Sheets, and sends intelligent Slack notifications.

### Who's it for

This template is designed for professionals, teams, and businesses who use Calendly for scheduling and want to automate their booking management workflow. It's perfect for:

- **Sales teams** who need instant notifications about new bookings and cancellations
- **Service providers** (consultants, coaches, therapists) who want to track appointments automatically
- **Businesses** that need centralized logging of all booking events for analytics
- **Teams** that want smart categorization of urgent bookings and last-minute cancellations
- **Organizations** requiring automated follow-up workflows based on booking status

### How it works / What it does

This workflow creates a comprehensive Calendly automation system that automatically processes booking confirmations and cancellations. The system:

1. **Listens for Calendly events** via a webhook trigger for:
   - invitee.created: new booking confirmations
   - invitee.canceled: booking cancellations
2. **Routes events intelligently** using a Switch node to separate booking and cancellation processing.

**For bookings:**

- Extracts and transforms all booking data (invitee info, event details, timing, location, guests)
- Calculates computed fields (formatted dates/times, duration, days until event, urgency flags)
- Detects urgent bookings (same-day or next-day appointments) for priority handling
- Logs complete booking information to Google Sheets
- Sends formatted Slack notifications with meeting links and reschedule/cancel options

**For cancellations:**

- Extracts cancellation details (reason, who canceled, timing)
- Categorizes cancellations into three types:
  - Last Minute (within 24 hours of the event): high-priority follow-up
  - Standard (upcoming events): normal priority
  - Past Event (already occurred): low priority
- Calculates hours before the event for timing analysis
- Logs cancellation data to Google Sheets
- Sends categorized Slack alerts with follow-up priority indicators

**Data management:**

- Stores all bookings in a dedicated Google Sheets tab
- Stores all cancellations in a separate Google Sheets tab
- Maintains a complete event history for analytics and reporting

### How to set up

#### 1. Configure the Calendly Webhook Trigger

- Go to developer.calendly.com
- Create an OAuth2 application or use a Personal Access Token
- In n8n, add Calendly OAuth2 credentials
- The workflow automatically registers webhooks for the invitee.created and invitee.canceled events
- Ensure your Calendly account has the necessary permissions

#### 2. Set up Google Sheets

- Create a Google Sheets spreadsheet with two tabs:
  - Bookings: for logging new booking confirmations
  - Cancellations: for logging cancelled appointments
- Configure Google Sheets OAuth2 credentials in n8n
- Update the document ID in both Google Sheets nodes:
  - the "Log to Bookings Sheet1" node
  - the "Log to Cancellations Sheet" node
- The workflow uses auto-mapping, so ensure your sheet headers match the data fields

#### 3. Configure Slack Notifications

- Create a Slack app at api.slack.com
- Add the bot token scopes chat:write and channels:read
- Install the app to your workspace
- Add Slack OAuth2 credentials in n8n
- Update the channel name in both Slack nodes (default: "general")
- Customize the notification messages if needed

#### 4. Test the Workflow

- Activate the workflow in n8n
- Create a test booking in Calendly
- Verify that:
  - data appears in Google Sheets
  - the Slack notification is received
  - all fields are correctly populated
- Test the cancellation flow by canceling a booking

#### 5. Customize (Optional)

- Adjust the urgency-detection logic (currently same-day or next-day)
- Modify the Slack notification formatting
- Add email notifications using Email nodes
- Integrate with CRM systems (HubSpot, Salesforce, etc.)
- Add follow-up email automation

### Requirements

- **Calendly account** with active scheduling links
- **Google Sheets account** with a spreadsheet set up
- **Slack workspace** with app installation permissions
- **n8n instance** (self-hosted or cloud)
- **OAuth2 credentials** for Calendly, Google Sheets, and Slack

### How to customize the workflow

**Modify urgency detection**

- Edit the "Check Urgency" IF node to change what counts as an urgent booking
- Currently flags same-day or next-day bookings
- Adjust the days_until_event threshold as needed

**Enhance Slack notifications**

- Customize the message formatting in the Slack nodes
- Add emoji or formatting to match your team's style
- Include additional fields from the booking data
- Add @mentions for urgent bookings

**Add email notifications**

- Insert Email nodes after the Slack notifications
- Send confirmation emails to invitees
- Notify team members via email
- Create email templates for different event types

**Integrate with a CRM**

- Add HTTP Request nodes to sync bookings to your CRM
- Update contact records when bookings are created
- Create opportunities or deals from booking data
- Sync cancellation reasons for analysis

**Add analytics**

- Create additional Google Sheets tabs for analytics
- Use formulas to calculate booking and cancellation rates
- Track popular time slots and event types
- Monitor team member availability

**Customize data fields**

- Modify the "Transform Booking Data" and "Transform Cancellation Data" Set nodes
- Add custom fields based on your Calendly form questions
- Extract additional metadata from the webhook payload
- Calculate business-specific metrics

### Key Features

- **Automatic event processing**: no manual intervention required
- **Smart urgency detection**: identifies same-day and next-day bookings automatically
- **Intelligent cancellation categorization**: classifies cancellations by timing and priority
- **Comprehensive data extraction**: captures all booking details, including guests, questions, and metadata
- **Dual logging system**: separate sheets for bookings and cancellations
- **Rich Slack notifications**: formatted messages with meeting links and action buttons
- **Computed fields**: automatically calculates duration, days until event, and formatted dates/times
- **Error handling**: nodes configured with continueRegularOutput to prevent workflow failures
- **Scalable architecture**: handles high-volume booking scenarios

### Use Cases

- **Sales team automation**: instant notifications when prospects book demos
- **Consultant scheduling**: track all client appointments in one place
- **Service business management**: monitor bookings and cancellations for service providers
- **Team calendar coordination**: keep team members informed about schedule changes
- **Analytics and reporting**: build dashboards from logged booking data
- **Customer relationship management**: sync booking data with CRM systems
- **Follow-up automation**: trigger email sequences based on booking status
- **Resource planning**: analyze booking patterns to optimize scheduling

### Data Fields Captured

**Booking data**

- Event ID; invitee name, email, and first name
- Event name, start/end times (ISO format)
- Formatted date and time (human-readable)
- Timezone, duration in minutes
- Meeting URL (Google Meet, Zoom, etc.)
- Reschedule and cancel URLs
- Location type (virtual, in-person, etc.)
- Guest count and guest emails
- Questions and answers (JSON format)
- Days until event, same-day flag
- Urgency status and label
- Processing timestamp

**Cancellation data**

- Event ID, invitee name, email
- Original scheduled date and time
- Cancellation reason
- Who canceled (invitee/host)
- Canceler type
- Hours before the event
- Last-minute flag (< 24 hours)
- Cancellation category and priority
- Cancellation timestamp

### Workflow Architecture

The workflow uses a routing pattern to handle different event types:

1. Calendly Webhook Trigger: receives all events
2. Route Event Type (Switch): separates bookings from cancellations
3. Parallel Processing: each path processes independently
4. Data Transformation: Set nodes extract and format data
5. Intelligent Routing: IF/Switch nodes categorize by urgency/type
6. Data Logging: Google Sheets stores all events
7. Notifications: Slack alerts team members

### Example Scenarios

**Scenario 1: New booking**

- A customer books a 30-minute consultation for tomorrow
- The workflow detects it's a next-day booking (urgent)
- Data is logged to the "Bookings" sheet with an urgency flag
- A Slack notification is sent with an URGENT label
- A team member receives an instant alert

**Scenario 2: Last-minute cancellation**

- A customer cancels a meeting 2 hours before the scheduled time
- The workflow categorizes it as a "last-minute" cancellation
- Data is logged to the "Cancellations" sheet with high priority
- A Slack alert is sent with a LAST MINUTE label
- The team can immediately follow up or fill the slot

**Scenario 3: Standard cancellation**

- A customer cancels a meeting 3 days in advance
- The workflow categorizes it as a "standard" cancellation
- Data is logged with normal priority
- A Slack notification is sent with standard formatting
- The team can plan accordingly

This template transforms your Calendly scheduling into a fully automated booking management system, ensuring no booking goes unnoticed and providing valuable insights into your scheduling patterns and customer behavior.
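The cancellation categorization described above (last-minute / standard / past event, keyed off hours before the event) can be sketched as a small function for a Code node. The flat field names here are assumptions; in Calendly's actual webhook the event start time is nested under the payload's scheduled event object.

```javascript
// Sketch of the cancellation-categorization logic: compute hours between
// cancellation and event start, then bucket into the three documented
// categories. Input field names are illustrative assumptions.
function categorizeCancellation(eventStartTime, canceledAt) {
  const hoursBeforeEvent =
    (new Date(eventStartTime) - new Date(canceledAt)) / 36e5; // ms -> hours

  if (hoursBeforeEvent < 0) {
    // Event already occurred when the cancellation arrived.
    return { hoursBeforeEvent, category: "past_event", priority: "low" };
  }
  if (hoursBeforeEvent < 24) {
    return { hoursBeforeEvent, category: "last_minute", priority: "high" };
  }
  return { hoursBeforeEvent, category: "standard", priority: "normal" };
}

// Canceled 2 hours before the event: last-minute, high priority.
console.log(
  categorizeCancellation("2024-06-01T12:00:00Z", "2024-06-01T10:00:00Z")
);
```

The returned category and priority map directly onto the Slack alert labels and the columns logged to the Cancellations sheet.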
by vinci-king-01
## Smart Blockchain Monitor with ScrapeGraphAI Risk Detection and Instant Alerts

### Target Audience

- Cryptocurrency traders and investors
- DeFi protocol managers and developers
- Blockchain security analysts
- Financial compliance officers
- Crypto fund managers and institutions
- Risk management teams
- Blockchain developers monitoring smart contracts
- Digital asset custodians

### Problem Statement

Manual blockchain monitoring is time-consuming and prone to missing critical events, often leading to delayed responses to high-value transactions, security threats, or unusual network activity. This template solves the challenge of real-time blockchain surveillance by automatically detecting, analyzing, and alerting on significant blockchain events using AI-powered intelligence and instant notifications.

### How it Works

This workflow monitors blockchain activity in real time, uses ScrapeGraphAI to intelligently extract transaction data from explorer pages, performs sophisticated risk analysis, and instantly alerts your team about significant events across multiple blockchains.
### Key Components

- **Blockchain Webhook**: real-time trigger that activates when new blocks are detected
- **Data Normalizer**: standardizes blockchain data across different networks
- **ScrapeGraphAI Extractor**: AI-powered transaction data extraction from blockchain explorers
- **Risk Analyzer**: advanced risk scoring based on transaction patterns and values
- **Smart Filter**: intelligently routes only significant events for alerts
- **Slack Alert System**: instant formatted notifications to your team

### Risk Analysis Specifications

The template performs comprehensive risk analysis with the following parameters:

| Risk Factor | Threshold | Score Impact | Description |
|-------------|-----------|--------------|-------------|
| High-Value Transactions | >$10,000 USD | +15 per transaction | Individual transactions exceeding the threshold |
| Block Volume | >$1M USD | +20 points | Total block transaction volume |
| Block Volume | >$100K USD | +10 points | Moderate block transaction volume |
| Failure Rate | >10% | +15 points | Percentage of failed transactions in the block |
| Multiple High-Value | >3 transactions | Alert trigger | Multiple large transactions in a single block |
| Critical Failure Rate | >20% | Alert trigger | Extremely high failure rate indicator |

**Risk levels:**

- **High Risk**: score ≥ 50 (immediate alerts)
- **Medium Risk**: score ≥ 25 (standard alerts)
- **Low Risk**: score < 25 (no alerts)

### Supported Blockchains

| Blockchain | Explorer | Native Support | Transaction Detection |
|------------|----------|----------------|----------------------|
| Ethereum | Etherscan | Full | High-value, DeFi, NFT |
| Bitcoin | Blockchair | Full | Large transfers, institutional |
| Binance Smart Chain | BscScan | Full | DeFi, high-frequency trading |
| Polygon | PolygonScan | Full | Layer 2 activity monitoring |

### Setup Instructions

Estimated setup time: 15-20 minutes

**Prerequisites**

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Slack workspace with a webhook or bot token
- Blockchain data source (Moralis, Alchemy, or direct node access)
- Basic understanding of blockchain explorers

**Step-by-Step Configuration**

#### 1. Install Community Nodes

Install the required community node:

    npm install n8n-nodes-scrapegraphai

#### 2. Configure ScrapeGraphAI Credentials

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure proper functionality

#### 3. Set up Slack Integration

- Add Slack OAuth2 or webhook credentials
- Configure your target channel for blockchain alerts
- Test message delivery to ensure notifications work
- Customize alert formatting preferences

#### 4. Configure the Blockchain Webhook

- Set up the webhook endpoint for blockchain data
- Configure your blockchain data provider (Moralis, Alchemy, etc.)
- Ensure the webhook payload includes the block number and a blockchain identifier
- Test webhook connectivity with sample data

#### 5. Customize Risk Parameters

- Adjust the high-value transaction threshold (default: $10,000)
- Modify the risk-scoring weights based on your needs
- Configure blockchain-specific risk factors
- Set failure-rate thresholds for your use case

#### 6. Test and Validate

- Send test blockchain data to trigger the workflow
- Verify ScrapeGraphAI extraction accuracy
- Check the risk-scoring calculations
- Confirm that Slack alerts are properly formatted and delivered

### Workflow Customization Options

**Modify risk analysis**

- Adjust high-value transaction thresholds per blockchain
- Add custom risk factors (contract interactions, specific addresses)
- Implement whitelist/blacklist address filtering
- Configure time-based risk adjustments

**Extend blockchain support**

- Add support for additional blockchains (Solana, Cardano, etc.)
- Customize explorer URL patterns
- Implement chain-specific transaction analysis
- Add specialized DeFi protocol monitoring

**Enhance the alert system**

- Add email notifications alongside Slack
- Implement severity-based alert routing
- Create custom alert templates
- Add alert escalation rules

**Advanced analytics**

- Add transaction pattern recognition
- Implement anomaly detection algorithms
- Create blockchain activity dashboards
- Add historical trend analysis

### Use Cases

- **Crypto Trading**: monitor large market movements and whale activity
- **DeFi Security**: track protocol interactions and unusual contract activity
- **Compliance Monitoring**: detect suspicious transaction patterns
- **Institutional Custody**: alert on high-value transfers and security events
- **Smart Contract Monitoring**: track contract interactions and state changes
- **Market Intelligence**: analyze blockchain activity for trading insights

### Important Notes

- Respect ScrapeGraphAI API rate limits and terms of service
- Implement appropriate delays to avoid overwhelming blockchain explorers
- Keep your API credentials secure and rotate them regularly
- Monitor API usage to manage costs effectively
- Consider blockchain explorer rate limits for high-frequency monitoring
- Ensure compliance with relevant financial regulations
- Regularly update risk parameters based on market conditions

### Troubleshooting

**Common issues:**

- ScrapeGraphAI extraction errors: check the API key and account status
- Webhook trigger failures: verify the webhook URL and payload format
- Slack notification failures: check bot permissions and channel access
- False-positive alerts: adjust the risk-scoring thresholds
- Missing transaction data: verify blockchain explorer accessibility
- Rate-limit errors: implement delays and monitor API usage

**Support resources:**

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Blockchain explorer API documentation
- Slack API documentation for advanced configurations
- Cryptocurrency compliance and regulatory guidelines
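The Risk Analyzer's scoring table can be sketched as a Code-node function using the documented default thresholds. The input shape (an array of transactions with `valueUsd` and `failed` fields) is an assumption about what the ScrapeGraphAI extractor returns for a block.

```javascript
// Sketch of per-block risk scoring with the documented default thresholds.
// The transaction shape (valueUsd, failed) is an illustrative assumption.
function scoreBlock(transactions) {
  const HIGH_VALUE_USD = 10_000;
  const highValue = transactions.filter((t) => t.valueUsd > HIGH_VALUE_USD);
  const failedCount = transactions.filter((t) => t.failed).length;
  const volume = transactions.reduce((sum, t) => sum + t.valueUsd, 0);
  const failureRate = transactions.length
    ? failedCount / transactions.length
    : 0;

  let score = highValue.length * 15;      // +15 per high-value transaction
  if (volume > 1_000_000) score += 20;    // large block volume
  else if (volume > 100_000) score += 10; // moderate block volume
  if (failureRate > 0.1) score += 15;     // elevated failure rate

  // Alert on medium/high score, or on the standalone alert triggers.
  const alert = score >= 25 || highValue.length > 3 || failureRate > 0.2;
  const level = score >= 50 ? "high" : score >= 25 ? "medium" : "low";
  return { score, level, alert, volume, failureRate };
}

// One $2M whale transfer plus a small failed transaction:
console.log(scoreBlock([
  { valueUsd: 2_000_000, failed: false },
  { valueUsd: 50, failed: true },
]));
```

The Smart Filter then only forwards blocks where `alert` is true, and the `level` drives which Slack formatting is used.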
by vinci-king-01
## Copyright Infringement Detector with ScrapeGraphAI Analysis and Legal Action Automation

### Target Audience

- Intellectual property lawyers and legal teams
- Brand protection specialists
- Content creators and publishers
- Marketing and brand managers
- Digital rights management teams
- Copyright enforcement agencies
- Media companies and publishers
- E-commerce businesses with proprietary content
- Software and technology companies
- Creative agencies protecting client work

### Problem Statement

Manual monitoring for copyright infringement is time-consuming, often reactive rather than proactive, and can miss critical violations that damage brand reputation and revenue. This template solves the challenge of automatically detecting copyright violations, analyzing infringement patterns, and providing immediate legal-action recommendations using AI-powered web scraping and automated legal workflows.

### How it Works

This workflow automatically scans the web for potential copyright violations using ScrapeGraphAI, analyzes content similarity, determines the legal action required, and provides automated alerts for immediate response to protect intellectual property rights.
### Key Components

- **Schedule Trigger**: runs automatically every 24 hours to monitor for new infringements
- **ScrapeGraphAI Web Search**: uses AI to search for potential copyright violations across the web
- **Content Comparer**: analyzes potential infringements and calculates similarity scores
- **Infringement Detector**: determines the legal action required and creates case reports
- **Legal Action Trigger**: routes cases based on severity and urgency
- **Brand Protection Alert**: sends urgent alerts for high-priority violations
- **Monitoring Alert**: tracks medium-risk cases for ongoing monitoring

### Detection and Analysis Specifications

The template monitors and analyzes the following infringement types:

| Infringement Type | Detection Method | Risk Level | Action Required |
|-------------------|------------------|------------|-----------------|
| Exact Text Match | High similarity score (>80%) | High | Immediate cease & desist |
| Paraphrased Content | Moderate similarity (50-80%) | Medium | Monitoring & evidence collection |
| Unauthorized Brand Usage | Brand name detection in content | High | Legal consultation |
| Competitor Usage | Known competitor domain detection | High | DMCA takedown |
| Image/Video Theft | Visual content analysis | High | Immediate action |
| Domain Infringement | Suspicious domain patterns | Medium | Investigation |

### Setup Instructions

Estimated setup time: 30-35 minutes

**Prerequisites**

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Telegram or other notification service credentials
- Legal team contact information
- Copyrighted content database

**Step-by-Step Configuration**

#### 1. Install Community Nodes

Install the required community node:

    npm install n8n-nodes-scrapegraphai

#### 2. Configure ScrapeGraphAI Credentials

- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up the Schedule Trigger

- Configure the monitoring frequency (default: every 24 hours)
- Adjust the timing to match your business hours
- Set the appropriate timezone for your legal team

#### 4. Configure the Copyrighted Content Database

- Update the Content Comparer node with your protected content
- Add brand names, slogans, and unique phrases
- Include competitor and suspicious-domain lists
- Set similarity thresholds for different content types

#### 5. Customize Legal Action Rules

- Update the Infringement Detector node with your legal thresholds
- Configure action plans for different infringement types
- Set up case priority levels and response timelines
- Define evidence-collection requirements

#### 6. Set up the Alert System

- Configure a Telegram bot or other notification service
- Set up different alert types for different severity levels
- Configure legal team contact information
- Test alert delivery and formatting

#### 7. Test and Validate

- Run the workflow manually with test search terms
- Verify that all detection steps complete successfully
- Test the alert system with sample infringement data
- Validate the legal-action recommendations

### Workflow Customization Options

**Modify detection parameters**

- Adjust similarity thresholds for different content types
- Add more sophisticated text-analysis algorithms
- Include image and video content detection
- Customize brand-name detection patterns

**Extend the legal action framework**

- Add more detailed legal action plans
- Implement automated cease-and-desist generation
- Include DMCA takedown automation
- Add court-filing preparation workflows

**Customize the alert system**

- Add integration with legal case management systems
- Implement tiered alert levels (urgent, high, medium, low)
- Add automated evidence collection and documentation
- Include reporting and analytics dashboards

**Output customization**

- Add integration with legal databases
- Implement automated case tracking
- Create compliance reporting systems
- Add trend analysis and pattern recognition

### Use Cases

- **Brand Protection**: monitor unauthorized use of brand names and logos
- **Content Protection**: detect plagiarism and content theft
- **Legal Enforcement**: automate initial legal-action processes
- **Competitive Intelligence**: monitor competitor content usage
- **Compliance Monitoring**: ensure proper attribution and licensing
- **Evidence Collection**: automatically document violations for legal proceedings

### Important Notes

- Respect website terms of service and robots.txt files
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update the copyrighted content database
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure compliance with local copyright laws and regulations
- Consult with legal professionals before taking automated legal action
- Maintain proper documentation for all detected violations

### Troubleshooting

**Common issues:**

- ScrapeGraphAI connection errors: verify the API key and account status
- False-positive detections: adjust the similarity thresholds and detection parameters
- Alert delivery failures: check the notification service credentials
- Legal action errors: verify the legal team contact information
- Schedule trigger failures: check the timezone and interval settings
- Content analysis errors: review the Code node's JavaScript logic

**Support resources:**

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Copyright law resources and best practices
- Legal automation and compliance guidelines
- Brand protection and intellectual property resources
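The Content Comparer / Infringement Detector pair can be sketched as two small functions: one similarity measure and one mapping from score to the actions in the table above. The word-overlap (Jaccard) measure here is a deliberately simple stand-in for whatever similarity model the Content Comparer actually uses, so the function names and thresholds are illustrative.

```javascript
// Simple word-overlap (Jaccard) similarity between two text snippets:
// a stand-in for the Content Comparer's real similarity model.
function similarity(a, b) {
  const words = (s) => new Set(s.toLowerCase().split(/\W+/).filter(Boolean));
  const wa = words(a);
  const wb = words(b);
  const overlap = [...wa].filter((w) => wb.has(w)).length;
  const union = new Set([...wa, ...wb]).size;
  return union ? overlap / union : 0;
}

// Map a similarity score to the documented risk levels and actions.
function classifyInfringement(score) {
  if (score > 0.8) return { risk: "high", action: "cease_and_desist" };
  if (score >= 0.5) return { risk: "medium", action: "monitor_and_collect_evidence" };
  return { risk: "low", action: "none" };
}

const original = "Our patented widget cleans twice as fast";
const suspect = "Our patented widget cleans twice as fast";
console.log(classifyInfringement(similarity(original, suspect)));
```

In the workflow, the Legal Action Trigger would route `"high"` results to the Brand Protection Alert and `"medium"` results to the Monitoring Alert.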
by vinci-king-01
## Carbon Footprint Tracker with ScrapeGraphAI Analysis and ESG Reporting Automation

### Target Audience

- Sustainability managers and ESG officers
- Environmental compliance teams
- Corporate social responsibility (CSR) managers
- Energy and facilities managers
- Supply chain sustainability coordinators
- Environmental consultants
- Green building certification teams
- Climate action plan coordinators
- Regulatory compliance officers
- Corporate reporting and disclosure teams

### Problem Statement

Manual carbon footprint calculation and ESG reporting is complex, time-consuming, and often inaccurate due to fragmented data sources and outdated emission factors. This template solves the challenge of automatically collecting environmental data, calculating accurate carbon footprints, identifying reduction opportunities, and generating comprehensive ESG reports using AI-powered data collection and automated sustainability workflows.

### How it Works

This workflow automatically collects energy and transportation data using ScrapeGraphAI, calculates a comprehensive carbon footprint across all three scopes, identifies reduction opportunities, and generates automated ESG reports for sustainability compliance and reporting.
Key Components Schedule Trigger - Runs automatically every day at 8:00 AM to collect environmental data Energy Data Scraper - Uses ScrapeGraphAI to extract energy consumption data and emission factors Transport Data Scraper - Collects transportation emission factors and fuel efficiency data Footprint Calculator - Calculates comprehensive carbon footprint across Scope 1, 2, and 3 emissions Reduction Opportunity Finder - Identifies cost-effective carbon reduction opportunities Sustainability Dashboard - Creates comprehensive sustainability metrics and KPIs ESG Report Generator - Automatically generates ESG compliance reports Create Reports Folder - Organizes reports in Google Drive Save Report to Drive - Stores final reports for stakeholder access ๐ Carbon Footprint Analysis Specifications The template calculates and tracks the following emission categories: | Emission Scope | Category | Data Sources | Calculation Method | Example Output | |----------------|----------|--------------|-------------------|----------------| | Scope 1 (Direct) | Natural Gas | EPA emission factors | Consumption ร 11.7 lbs CO2/therm | 23,400 lbs CO2 | | Scope 1 (Direct) | Fleet Fuel | EPA fuel economy data | Miles รท MPG ร 19.6 lbs CO2/gallon | 11,574 lbs CO2 | | Scope 2 (Electricity) | Grid Electricity | EPA emission factors | kWh ร 0.92 lbs CO2/kWh | 46,000 lbs CO2 | | Scope 3 (Indirect) | Employee Commute | EPA transportation data | Miles ร 0.77 lbs CO2/mile | 19,250 lbs CO2 | | Scope 3 (Indirect) | Air Travel | EPA aviation factors | Miles ร 0.53 lbs CO2/mile | 26,500 lbs CO2 | | Scope 3 (Indirect) | Supply Chain | Estimated factors | Electricity ร 0.1 multiplier | 4,600 lbs CO2 | ๐ ๏ธ Setup Instructions Estimated setup time: 25-30 minutes Prerequisites n8n instance with community nodes enabled ScrapeGraphAI API account and credentials Google Drive API access for report storage Organizational energy and transportation data ESG reporting requirements and standards Step-by-Step 
### Step-by-Step Configuration

#### 1. Install Community Nodes
- Install the required community node: `npm install n8n-nodes-scrapegraphai`

#### 2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

#### 3. Set up Schedule Trigger
- Configure the daily schedule (default: 8:00 AM UTC)
- Adjust the timezone to match your business hours
- Set an appropriate frequency for your reporting needs

#### 4. Configure Data Sources
- Update the Energy Data Scraper with your energy consumption sources
- Configure the Transport Data Scraper with your transportation data
- Set up organizational data inputs (employees, consumption, etc.)
- Customize emission factors for your region and industry

#### 5. Customize Carbon Calculations
- Update the Footprint Calculator with your organizational data
- Configure scope boundaries and calculation methodologies
- Set up industry-specific emission factors
- Adjust for renewable energy and offset programs

#### 6. Configure Reduction Analysis
- Update the Reduction Opportunity Finder with your investment criteria
- Set up cost-benefit analysis parameters
- Configure priority scoring algorithms
- Define implementation timelines and effort levels

#### 7. Set up Report Generation
- Configure Google Drive integration for report storage
- Set up ESG report templates and formats
- Define stakeholder access and permissions
- Test report generation and delivery

#### 8. Test and Validate
- Run the workflow manually with test data
- Verify all calculation steps complete successfully
- Check data accuracy and emission factor validity
- Validate ESG report compliance and formatting

## Workflow Customization Options

### Modify Data Collection
- Add more energy data sources (renewables, waste, etc.)
- Include additional transportation modes (rail, shipping, etc.)
- Integrate with building management systems
- Add real-time monitoring and IoT data sources

### Extend the Calculation Framework
- Add more Scope 3 categories (waste, water, etc.)
- Implement industry-specific calculation methodologies
- Include carbon offset and credit tracking
- Add lifecycle assessment (LCA) capabilities

### Customize Reduction Analysis
- Add more sophisticated ROI calculations
- Implement scenario modeling and forecasting
- Include regulatory compliance tracking
- Add stakeholder engagement metrics

### Output Customization
- Add integration with sustainability reporting platforms
- Implement automated stakeholder notifications
- Create executive dashboards and visualizations
- Add compliance monitoring and alert systems

## Use Cases

- **ESG Compliance Reporting**: Automate sustainability disclosure requirements
- **Carbon Reduction Planning**: Identify and prioritize reduction opportunities
- **Regulatory Compliance**: Meet environmental reporting mandates
- **Stakeholder Communication**: Generate transparent sustainability reports
- **Investment Due Diligence**: Provide ESG data for investors and lenders
- **Supply Chain Sustainability**: Track and report on Scope 3 emissions

## Important Notes

- Respect data source terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update emission factors for accuracy
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure compliance with local environmental reporting regulations
- Validate calculations against industry standards and methodologies
- Maintain proper documentation for audit and verification purposes

## Troubleshooting

Common issues:

- ScrapeGraphAI connection errors: verify your API key and account status
- Data source access issues: check website accessibility and rate limits
- Calculation errors: verify emission factors and organizational data
- Report generation failures: check Google Drive permissions and quotas
- Schedule trigger failures: check the timezone
and cron expression
- Data accuracy issues: validate against manual calculations and industry benchmarks

Support resources:

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- EPA emission factor databases and methodologies
- GHG Protocol standards and calculation guidelines
- ESG reporting frameworks and compliance requirements
- Sustainability reporting best practices and standards
by Yaron Been
# Enhance Your Workflow with 2Ndmoises_Generator AI

This n8n workflow integrates with Replicate's moicarmonas/2ndmoises_generator model to generate custom outputs based on your prompt. It handles authentication, triggers the prediction, monitors progress, and processes the final result automatically.

## Section 1: Trigger & Authentication

### On Clicking "Execute" (Manual Trigger)
- Purpose: Start the workflow manually whenever you want to run it.
- Benefit: Great for testing or running on demand without needing automation.

### Set API Key (Set Node)
- Purpose: Stores your Replicate API key in the workflow.
- How it works: Adds your API key as a variable that other nodes can reference.
- Benefit: Keeps credentials secure and reusable.

## Section 2: Sending the AI Request

### Create Prediction (HTTP Request Node)
- Purpose: Sends a request to the Replicate API to start generating output with the model.
- Input parameters: prompt (text you provide), seed (for reproducibility), width / height / lora_scale (generation settings).
- Benefit: Allows full customization of the AI's generation process.

### Extract Prediction ID (Code Node)
- Purpose: Pulls the prediction ID and status out of the API response.
- Why it matters: The ID is required to check the job's progress later.
- Benefit: Automates polling without manual tracking.

## Section 3: Polling & Waiting

### Wait (Wait Node)
- Purpose: Adds a short pause (2 seconds) between checks.
- Benefit: Prevents hitting the API too quickly and avoids errors.

### Check Prediction Status (HTTP Request Node)
- Purpose: Calls Replicate again to ask whether the prediction is finished.
- Benefit: Automates progress monitoring without manual refreshes.

### Check If Complete (If Node)
- Purpose: Decides whether the model has finished generating.
- Paths: True → move on to processing the result; False → go back to Wait and recheck.
- Benefit: Ensures the workflow loops until a final output is ready.
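The Wait → Check Status → If loop in Section 3 is a standard poll-until-terminal pattern. A minimal Python sketch of the same logic; `fetch_status` stands in for the HTTP Request node (e.g. a wrapper around a GET to the prediction's status endpoint), and the terminal status names follow Replicate's succeeded/failed/canceled convention:

```python
import time

TERMINAL = {"succeeded", "failed", "canceled"}

def poll_until_done(fetch_status, prediction_id, delay=2.0, max_checks=150):
    """Re-check a prediction until it reaches a terminal state.

    `fetch_status` is any callable taking the prediction ID and
    returning the latest status string.
    """
    for _ in range(max_checks):
        status = fetch_status(prediction_id)
        if status in TERMINAL:          # "Check If Complete" (If node)
            return status
        time.sleep(delay)               # mirrors the 2-second Wait node
    raise TimeoutError(f"prediction {prediction_id} did not finish")
```

The `max_checks` cap is an addition worth considering in the workflow too: without it, a stuck prediction would loop forever.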
## Section 4: Handling the Result

### Process Result (Code Node)
- Purpose: Cleans up the final API response and extracts the status, output (generated result), metrics, timestamps (created_at / completed_at), and model info.
- Benefit: Provides a structured, ready-to-use output for other workflows or integrations.

## Workflow Overview Table

| Section | Node Name | Purpose |
|-------------------|-------------------------|------------------------------|
| 1. Trigger & Auth | On Clicking "Execute" | Starts the workflow manually |
| | Set API Key | Stores API credentials |
| 2. AI Request | Create Prediction | Sends request to Replicate |
| | Extract Prediction ID | Gets prediction ID + status |
| 3. Polling | Wait | Adds delay before recheck |
| | Check Prediction Status | Monitors progress |
| | Check If Complete | Routes based on completion |
| 4. Result | Process Result | Extracts and cleans output |
| Notes | Sticky Note | Explains setup + model info |

## Key Benefits

- Secure authentication using a Set node for API key storage.
- Hands-free generation: just provide a prompt and the workflow handles everything else.
- Automated polling ensures you always get the final result without manual checking.
- Clean, structured output ready for downstream use in apps, dashboards, or notifications.
by Harshil Agrawal
This workflow adds a datapoint to Beeminder whenever a new activity is added to Strava, letting you keep track of your fitness activities by connecting Strava with Beeminder. If you want to track different activities instead, such as the number of hours worked in a week or the number of tasks completed, swap in the relevant trigger node. For example, you can use the Clockify Trigger node or the Toggl Trigger node.
by PDF Vector
## Overview

Organizations dealing with high-volume document processing face challenges in efficiently handling diverse document types while maintaining quality and tracking performance metrics. This enterprise-grade workflow provides a scalable solution for batch processing documents, including PDFs, scanned documents, and images (JPG, PNG), with comprehensive analytics, error handling, and quality assurance.

## What You Can Do

- Process thousands of documents efficiently in parallel batches
- Monitor performance metrics and success rates in real time
- Handle diverse document formats with automatic format detection
- Generate comprehensive analytics dashboards and reports
- Implement automated quality assurance and error handling

## Who It's For

Large organizations, document processing centers, digital transformation teams, enterprise IT departments, and businesses that need to process thousands of documents reliably with detailed performance tracking and analytics.

## The Problem It Solves

High-volume document processing without proper monitoring leads to bottlenecks, quality issues, and inefficient resource usage. Organizations struggle to track processing success rates, identify problematic document types, and optimize their workflows. This template provides enterprise-grade batch processing with comprehensive analytics and automated quality assurance.
## Setup Instructions

1. Configure Google Drive credentials for document folder access
2. Install the PDF Vector community node from the n8n marketplace
3. Configure PDF Vector API credentials with appropriate rate limits
4. Set up batch processing parameters (batch size, retry logic)
5. Configure quality thresholds and validation rules
6. Set up the analytics dashboard and reporting preferences
7. Configure error handling and notification systems

## Key Features

- Parallel batch processing for maximum throughput
- Support for mixed document formats (PDFs, Word docs, images)
- OCR processing for handwritten and scanned documents
- Comprehensive analytics dashboard with success rates and performance metrics
- Automatic document prioritization based on size and complexity
- Intelligent error handling with automatic retry logic
- Quality assurance checks and validation
- Real-time processing monitoring and alerts

## Customization Options

- Configure custom document categories and processing rules
- Set up specific extraction templates for different document types
- Implement automated workflows for documents that fail quality checks
- Configure credit usage optimization to minimize costs
- Set up custom analytics and reporting dashboards
- Add integration with existing document management systems
- Configure automated notifications for processing completion or errors

## Implementation Details

The workflow uses intelligent batching to process documents efficiently while monitoring performance metrics in real time. It automatically handles different document formats, applies OCR when needed, and provides detailed analytics to help organizations optimize their document processing operations. The system includes sophisticated error recovery and quality assurance mechanisms.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
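The batching, retry, and success-rate tracking described above can be sketched in a few lines of Python. This is an illustrative model of the pattern, not the template's actual code; the batch size, retry count, and metrics shape are placeholder defaults:

```python
def chunk(items, size):
    """Split a document list into fixed-size batches."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def process_with_retry(doc, process, max_attempts=3):
    """Run `process(doc)`, retrying on failure; returns (doc, ok, attempts)."""
    for attempt in range(1, max_attempts + 1):
        try:
            process(doc)
            return (doc, True, attempt)
        except Exception:
            if attempt == max_attempts:
                return (doc, False, attempt)

def run_batches(docs, process, batch_size=10, max_attempts=3):
    """Process documents in batches and return dashboard-style metrics."""
    results = []
    for batch in chunk(docs, batch_size):
        for doc in batch:
            results.append(process_with_retry(doc, process, max_attempts))
    ok = sum(1 for _, success, _ in results if success)
    return {"processed": len(results), "succeeded": ok,
            "success_rate": round(ok / len(results) * 100, 1) if results else 0.0}
```

In the real workflow the batches would be dispatched in parallel and `process` would call the PDF Vector node; the metrics dict is what feeds a success-rate dashboard.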
by Feedspace
This workflow sends instant Telegram notifications whenever a new testimonial is submitted through Feedspace, helping you respond quickly to customer feedback and share positive reviews.

## Who Is This For?

- Businesses collecting testimonials via Feedspace
- Marketing teams who want instant alerts for new reviews
- Customer success teams monitoring feedback
- Anyone wanting to automate testimonial notifications

## What Problem Does It Solve?

It removes the need to manually check Feedspace by instantly notifying your team on Telegram, so they can quickly review, respond to, and share new testimonials as soon as they're submitted.

## What It Does

When a customer submits a testimonial on your Feedspace form, this workflow:

1. Receives the webhook payload from Feedspace
2. Extracts the customer's name and public review URL
3. Sends a formatted Telegram message
4. Returns a success/error response to Feedspace

## Requirements

- **Feedspace account** with webhook integration enabled
- **Telegram bot** (create via @BotFather)
- **Your Telegram chat ID** (get via @userinfobot)

## Setup Steps

1. Create a Telegram bot
   - Open Telegram and message @BotFather
   - Send /newbot and follow the prompts
   - Save the API token you receive and use it to connect your Telegram account
2. Get your chat ID
   - Message @userinfobot on Telegram
   - Send /start
   - Copy your numeric chat ID
3. Configure the workflow
   - Import this template into n8n
   - Add your Telegram bot credentials
   - Set your chat ID (via an n8n variable or directly in the node)
4. Connect to Feedspace
   - Activate the workflow and copy the Production webhook URL
   - Go to Feedspace → Automations → Webhooks
   - Paste the webhook URL and activate it

See https://www.feedspace.io/help/automation/ for more information.
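The extract-and-format step (steps 2-3 of "What It Does") amounts to pulling two fields out of the webhook payload and building a message string. A sketch of that logic; the field names `reviewer_name` and `review_url` are assumptions, so match them to the fields your Feedspace webhook actually sends:

```python
def build_telegram_message(payload: dict) -> str:
    """Turn a Feedspace-style webhook payload into a notification message.

    NOTE: `reviewer_name` and `review_url` are hypothetical field names,
    not Feedspace's documented payload schema.
    """
    name = payload.get("reviewer_name", "A customer")
    url = payload.get("review_url", "")
    msg = f"New testimonial from {name}!"
    if url:
        msg += f"\nRead it here: {url}"
    return msg
```

The resulting string would be passed to the Telegram node (which calls the Bot API's sendMessage with your chat ID); defaulting missing fields keeps the notification useful even for partial payloads.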
by vinci-king-01
# Creative Asset Manager with ScrapeGraphAI Analysis and Brand Compliance

## Target Audience

- Creative directors and design managers
- Marketing teams managing brand assets
- Digital asset management (DAM) administrators
- Brand managers ensuring compliance
- Content creators and designers
- Marketing operations teams
- Creative agencies managing client assets
- Brand compliance officers

## Problem Statement

Managing creative assets manually is inefficient and error-prone, often leading to inconsistent branding, poor organization, and compliance issues. This template solves the challenge of automatically analyzing, organizing, and ensuring brand compliance for creative assets using AI-powered analysis and automated workflows.

## How It Works

This workflow automatically processes uploaded creative assets using ScrapeGraphAI for intelligent analysis, generates comprehensive tags, checks brand compliance, organizes files systematically, and maintains a centralized dashboard for creative teams.

### Key Components

- **Asset Upload Trigger** - Webhook endpoint that activates when new creative assets are uploaded
- **ScrapeGraphAI Asset Analyzer** - Uses AI to extract detailed information from visual assets
- **Tag Generator** - Creates comprehensive, searchable tags based on asset analysis
- **Brand Compliance Checker** - Evaluates assets against brand guidelines and standards
- **Asset Organizer** - Creates organized folder structures and standardized naming
- **Creative Team Dashboard** - Updates Google Sheets with organized asset information

## Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| asset_id | String | Unique asset identifier | "asset_1703123456789_abc123def" |
| name | String | Standardized filename | "image-social-media-2024-01-15T10-30-00.jpg" |
| path | String | Storage location path | "/creative-assets/2024/01/image/social-media" |
| asset_type | String | Type of creative asset | "image" |
| dimensions | String | Asset dimensions | "1920x1080" |
| file_format | String | File format | "jpg" |
| primary_colors | Array | Extracted color palette | ["#FF6B35", "#004E89"] |
| content_description | String | AI-generated content description | "Modern office workspace with laptop" |
| text_content | String | Any text visible in the asset | "Welcome to our workspace" |
| style_elements | Array | Detected style characteristics | ["modern", "minimalist"] |
| generated_tags | Array | Comprehensive tag list | ["high-resolution", "brand-logo", "social-media"] |
| usage_context | String | Suggested usage context | "social-media" |
| brand_elements | Array | Detected brand elements | ["logo", "typography"] |
| compliance_score | Number | Brand compliance score (0-100) | 85 |
| compliance_status | String | Approval status | "approved-with-warnings" |
| compliance_issues | Array | List of compliance problems | ["Non-brand colors detected"] |
| upload_date | DateTime | Asset upload timestamp | "2024-01-15T10:30:00Z" |
| searchable_keywords | String | Search-optimized keywords | "image social-media modern brand-logo" |

## Setup Instructions

Estimated setup time: 25-30 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- File upload system or DAM integration
- Brand guidelines document (for compliance configuration)

### Step-by-Step Configuration

#### 1. Install Community Nodes
- Install the required community node: `npm install n8n-nodes-scrapegraphai`

#### 2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working
#### 3. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant the necessary permissions for spreadsheet access
- Create a new spreadsheet for creative asset management
- Configure the sheet name (default: "Creative Assets Dashboard")

#### 4. Configure Webhook Trigger
- Set up the webhook endpoint for asset uploads
- Configure the webhook URL in your file upload system
- Ensure the asset_url parameter is passed in the webhook payload
- Test webhook connectivity

#### 5. Customize Brand Guidelines
- Update the Brand Compliance Checker node with your brand colors
- Configure approved file formats and size limits
- Set required brand elements and fonts
- Define resolution standards and quality requirements

#### 6. Configure Asset Organization
- Customize folder structure preferences
- Set up naming conventions for different asset types
- Configure metadata extraction preferences
- Set up search optimization parameters

#### 7. Test and Validate
- Upload a test asset to trigger the workflow
- Verify all analysis steps complete successfully
- Check Google Sheets for proper data formatting
- Validate brand compliance scoring

## Workflow Customization Options

### Modify Analysis Parameters
- Adjust ScrapeGraphAI prompts for specific asset types
- Customize tag generation algorithms
- Modify color analysis sensitivity
- Add industry-specific analysis criteria

### Extend Brand Compliance
- Add more sophisticated brand guideline checks
- Implement automated correction suggestions
- Include legal compliance verification
- Add accessibility compliance checks

### Customize Organization Structure
- Modify the folder hierarchy based on team preferences
- Implement custom naming conventions
- Add version control and asset history
- Configure backup and archiving rules

### Output Customization
- Add integration with DAM systems
- Implement asset approval workflows
- Create automated reporting and analytics
- Add team collaboration features

## Use Cases

- **Brand Asset Management**: Automatically organize and tag brand assets
- **Compliance Monitoring**: Ensure all assets meet brand guidelines
- **Creative Team Collaboration**: Centralized asset management and sharing
- **Marketing Campaign Management**: Organize assets by campaign and context
- **Asset Discovery**: AI-powered search and recommendation system
- **Quality Control**: Automated quality and compliance checks

## Important Notes

- Respect ScrapeGraphAI API rate limits and terms of service
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update the brand guidelines in the compliance checker
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and copyright compliance for creative assets
- Ensure proper backup and version control for important assets

## Troubleshooting

Common issues:

- ScrapeGraphAI connection errors: verify your API key and account status
- Webhook trigger failures: check the webhook URL and payload format
- Google Sheets permission errors: check the OAuth2 scope and permissions
- Asset analysis errors: review the ScrapeGraphAI prompt configuration
- Brand compliance false positives: adjust the guideline parameters
- File organization issues: check folder permissions and naming conventions

Support resources:

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Digital asset management best practices
- Brand compliance and governance guidelines
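As a rough illustration of how the Brand Compliance Checker could derive the compliance_score, compliance_status, and compliance_issues columns from an analyzed asset: the guideline values and penalty weights below are placeholders to replace with your own brand rules, not the node's actual configuration.

```python
# Hypothetical brand guidelines; substitute your real palette and limits.
BRAND = {
    "approved_colors": {"#FF6B35", "#004E89"},
    "approved_formats": {"jpg", "png", "svg"},
    "min_width": 1080,
}

def compliance_check(asset: dict) -> dict:
    """Score an analyzed asset 0-100 and collect human-readable issues."""
    score, issues = 100, []
    if not set(asset["primary_colors"]) <= BRAND["approved_colors"]:
        score -= 20
        issues.append("Non-brand colors detected")
    if asset["file_format"] not in BRAND["approved_formats"]:
        score -= 30
        issues.append(f"Unapproved format: {asset['file_format']}")
    width = int(asset["dimensions"].split("x")[0])
    if width < BRAND["min_width"]:
        score -= 25
        issues.append("Resolution below brand standard")
    status = ("approved" if score == 100
              else "approved-with-warnings" if score >= 70
              else "rejected")
    return {"compliance_score": score, "compliance_status": status,
            "compliance_issues": issues}
```

An asset with one off-brand color would land at 80 with status "approved-with-warnings", matching the shape of the example row in the Google Sheets specification.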
by Qandil
## What It Does

Automated weekly WAF security assessments with Slack reporting. The workflow detects your WAF vendor, runs a security assessment, grades your protection, and alerts your team when the grade drops below your threshold.

## About WAFtester

WAFtester is an open-source CLI for testing Web Application Firewalls. It ships 27 MCP tools, 2,800+ attack payloads across 18 categories (SQLi, XSS, SSRF, SSTI, command injection, XXE, and more), detection signatures for 26 WAF vendors and 9 CDNs, and enterprise-grade assessment with F1/MCC scoring and letter grades (A+ through F).

- GitHub: github.com/waftester/waftester
- Docs: Installation | Examples | Commands

## Who It's For

- Security teams needing continuous WAF monitoring
- DevOps engineers tracking WAF configuration drift
- Compliance teams requiring regular security assessments

## How It Works

The workflow has seven nodes:

1. **Weekly Schedule** - Triggers every Monday at 3 AM (configurable)
2. **Detect WAF** - Calls WAFtester's detect_waf tool to fingerprint the WAF vendor and CDN
3. **Start Assessment** - Launches an async assess task testing SQLi, XSS, traversal, cmdi, and SSRF
4. **Wait** - Pauses to let the assessment run
5. **Poll Results** - Calls get_task_status to retrieve the completed results
6. **Check Results** - Routes based on the WAF grade (pass if "A" or better, fail otherwise)
7. **Slack (Pass/Fail)** - Posts a summary to your Slack channel with the grade, detection rate, and bypass count

## How to Set Up

1. Start the WAFtester MCP server: `docker run -p 8080:8080 ghcr.io/waftester/waftester:latest mcp --http :8080`
2. Set environment variables: WAF_TARGET_URL (required), WAFTESTER_MCP_URL, SLACK_CHANNEL
3. Add Slack OAuth2 credentials and select them in both Slack nodes
4. Activate the workflow

Alternatively, use the included docker-compose.yml to run both n8n and WAFtester together.
## Requirements

| Requirement | Details |
|---|---|
| WAFtester MCP server | Docker image (ghcr.io/waftester/waftester:latest) or binary install |
| Slack | Workspace with OAuth2 bot credentials |
| Authorization | Only test targets you have explicit written permission to test |

## How to Customize

- Adjust the schedule in the Weekly Schedule node
- Change the grade threshold in the Check Results node
- Add attack categories in the Start Assessment node's categories array
- Swap Slack for email, Teams, or any n8n notification node

## Links

- WAFtester website
- GitHub repository
- Installation guide
- Full examples
- Docker Hub
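The Check Results routing rule (pass if the grade is "A" or better) reduces to comparing letter grades on an ordered scale. A sketch, assuming a simplified grade set; WAFtester's full A+ through F scale may include more steps, so extend the list to match what get_task_status actually returns:

```python
# Simplified, ascending grade scale (assumption; insert +/- variants as needed).
GRADE_ORDER = ["F", "D", "C", "B", "A", "A+"]

def passes(grade: str, threshold: str = "A") -> bool:
    """True if `grade` is at or above `threshold` on the scale."""
    return GRADE_ORDER.index(grade) >= GRADE_ORDER.index(threshold)
```

This is also where a customized threshold plugs in: changing the Check Results node's threshold corresponds to changing the `threshold` argument here.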