by vinci-king-01
Smart Supplier Health Monitor with ScrapeGraphAI Risk Detection and Multi-Channel Alerts

Target Audience
Procurement managers and directors, supply chain risk analysts, CFOs and financial controllers, vendor management teams, enterprise risk managers, operations managers, contract administrators, and business continuity planners.

Problem Statement
Manual supplier monitoring is reactive and time-consuming, often missing early warning signs of financial distress that could disrupt your supply chain. This template solves the challenge of proactive supplier health surveillance by automatically monitoring financial indicators, news sentiment, and market conditions to predict supplier risks before they impact your business operations.

How it Works
This workflow automatically monitors your critical suppliers' financial health using AI-powered web scraping, analyzes multiple risk factors, identifies alternative suppliers when needed, and sends intelligent alerts through multiple channels so your procurement team can act quickly on emerging risks.

Key Components
- Weekly Health Check Scheduler - automated trigger based on supplier criticality levels
- Supplier Database Loader - dynamic supplier portfolio management with risk-based monitoring frequency
- ScrapeGraphAI Website Analyzer - AI-powered extraction of financial health indicators from company websites
- Financial News Scraper - intelligent monitoring of financial news and sentiment analysis
- Advanced Risk Scorer - industry-adjusted risk calculation with failure probability modeling
- Alternative Supplier Finder - automated identification and ranking of backup suppliers
- Multi-Channel Alert System - email, Slack, and API notifications with escalation rules

Risk Analysis Specifications
The template performs comprehensive financial health analysis with the following parameters:

| Risk Factor | Weight | Score Impact | Description |
|-------------|--------|--------------|-------------|
| Financial Issues | 40% | +0-24 points | Revenue decline, debt levels, cash flow problems |
| Operational Risks | 30% | +0-18 points | Management changes, restructuring, capacity issues |
| Market Risks | 20% | +0-12 points | Industry disruption, regulatory changes, competition |
| Reputational Risks | 10% | +0-6 points | Negative news, legal issues, public sentiment |

Industry Risk Multipliers:
- Technology: 1.1x (higher volatility)
- Manufacturing: 1.0x (baseline)
- Energy: 1.2x (regulatory risks)
- Financial: 1.3x (market sensitivity)
- Logistics: 0.9x (generally stable)

Risk Levels & Actions (a minimal scoring sketch follows at the end of this section):
- Critical Risk: score ≥ 75 (CEO/CFO escalation, immediate transition planning)
- High Risk: score ≥ 55 (procurement director escalation, backup activation)
- Medium Risk: score ≥ 35 (manager review, increased monitoring)
- Low Risk: score < 35 (standard monitoring)

Supplier Management Features

| Feature | Critical Suppliers | High Priority | Medium Priority |
|---------|-------------------|---------------|-----------------|
| Monitoring Frequency | Weekly | Bi-weekly | Monthly |
| Risk Threshold | 35+ points | 40+ points | 50+ points |
| Alert Recipients | C-Level + Directors | Directors + Managers | Managers only |
| Alternative Suppliers | 3+ pre-qualified | 2+ identified | 1+ researched |
| Transition Timeline | 24-48 hours | 1-2 weeks | 1-3 months |

Setup Instructions
Estimated setup time: 25-30 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Gmail account for email alerts (or alternative email service)
- Slack workspace with webhook or bot token
- Supplier database or CRM system API access
- Basic understanding of procurement processes

Step-by-Step Configuration
1. Configure ScrapeGraphAI Credentials: sign up for a ScrapeGraphAI API account, navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials with your API key, and test the connection to ensure proper functionality.
2. Set up Email Integration: add Gmail OAuth2 credentials in n8n, configure the sender email and authentication, test email delivery with a sample message, and set up email templates for the different risk levels.
3. Configure Slack Integration: create a Slack webhook URL or bot token, add Slack credentials to n8n, configure target channels for the different alert types, and customize Slack message formatting and buttons.
4. Load Supplier Database: update the "Supplier Database Loader" node with your supplier data, configure supplier categories, contract values, and criticality levels, set monitoring frequencies based on supplier importance, and add supplier website URLs and contact information.
5. Customize Risk Parameters: adjust industry risk multipliers for your business context, modify risk scoring thresholds based on risk tolerance, configure economic factor adjustments, and set failure probability calculation parameters.
6. Configure Alternative Supplier Database: populate the alternative supplier database in the "Alternative Supplier Finder" node, add supplier ratings, capacities, and specialties, configure geographic coverage and certification requirements, and set suitability scoring parameters.
7. Set up Procurement System Integration: configure the procurement system webhook endpoint, add API authentication credentials, test webhook payload delivery, and set up automated data synchronization.
8. Test and Validate: run test scenarios with sample supplier data, verify ScrapeGraphAI extraction accuracy, check risk scoring calculations and thresholds, confirm all alert channels are working properly, and test alternative supplier recommendations.

Workflow Customization Options
- Modify Risk Analysis: add custom risk indicators specific to your industry, implement sector-specific economic adjustments, configure contract-specific risk factors, add ESG (Environmental, Social, Governance) scoring.
- Extend Data Sources: integrate credit rating agency APIs (Dun & Bradstreet, Experian), add financial database connections (Bloomberg, Reuters), include social media sentiment analysis, connect to government regulatory databases.
- Enhance Alternative Supplier Management: add automated supplier qualification workflows, implement dynamic pricing comparison, create supplier performance scorecards, add geographic risk assessment.
- Advanced Analytics: implement predictive failure modeling, add supplier portfolio optimization, create supply chain risk heatmaps, generate automated compliance reports.

Use Cases
- Supply Chain Risk Management: proactive monitoring of supplier financial stability
- Procurement Optimization: data-driven supplier selection and management
- Business Continuity Planning: automated backup supplier identification
- Financial Risk Assessment: early warning system for supplier defaults
- Contract Management: risk-based contract renewal and negotiation
- Vendor Diversification: strategic supplier portfolio management

Important Notes
- Respect ScrapeGraphAI API rate limits and terms of service
- Implement appropriate delays between supplier assessments
- Keep all API credentials secure and rotate them regularly
- Monitor API usage to manage costs effectively
- Ensure compliance with data privacy regulations (GDPR, CCPA)
- Regularly update supplier databases and contact information
- Review and adjust risk parameters based on market conditions
- Maintain confidentiality of supplier financial information

Troubleshooting
Common Issues:
- ScrapeGraphAI extraction errors: check API key validity and rate limits
- Email delivery failures: verify Gmail credentials and permissions
- Slack notification failures: check webhook URL and channel permissions
- False positive alerts: adjust risk scoring thresholds and industry multipliers
- Missing supplier data: verify website URLs and accessibility
- Alternative supplier errors: check supplier database completeness

Monitoring Best Practices:
- Set up workflow execution monitoring and error alerts
- Regularly review and update supplier information
- Monitor API usage and costs across all integrations
- Validate risk scoring accuracy with historical data
- Test disaster recovery and backup procedures

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Procurement best practices and industry standards
- Financial risk assessment methodologies
- Supply chain management resources and tools
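To make the scoring model above concrete, here is a minimal sketch of how the Advanced Risk Scorer's factor weighting, industry multipliers, and escalation thresholds could be implemented in an n8n Code node. It assumes factor scores arrive normalized to 0-1; the object shape and names are illustrative rather than the template's exact fields.

```javascript
// Illustrative sketch of the risk scoring described above (n8n Code node style).
// Factor scores are assumed to arrive normalized to 0..1; names are hypothetical.
const MAX_POINTS = { financial: 24, operational: 18, market: 12, reputational: 6 };
const INDUSTRY_MULTIPLIER = {
  technology: 1.1, manufacturing: 1.0, energy: 1.2, financial: 1.3, logistics: 0.9,
};

function scoreSupplier(supplier) {
  // Weighted points per factor, capped at the maximums from the table above
  let score = 0;
  for (const [factor, max] of Object.entries(MAX_POINTS)) {
    const normalized = Math.min(Math.max(supplier.factors[factor] ?? 0, 0), 1);
    score += normalized * max;
  }
  // Industry adjustment (defaults to the manufacturing baseline)
  score *= INDUSTRY_MULTIPLIER[supplier.industry] ?? 1.0;

  // Map to the risk levels used for escalation
  const level = score >= 75 ? 'critical' : score >= 55 ? 'high' : score >= 35 ? 'medium' : 'low';
  return { supplier: supplier.name, score: Math.round(score), level };
}

// Example: a technology supplier with elevated financial risk
console.log(scoreSupplier({
  name: 'Acme Components',
  industry: 'technology',
  factors: { financial: 0.8, operational: 0.4, market: 0.3, reputational: 0.2 },
}));
```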
by Nima Salimi
Description
This n8n workflow is a complete marketing automation system that connects to your CDP (Customer Data Platform), selects which flows to send, and delivers personalized emails using Brevo. It's modular and extensible: you can also add SMS, push notifications, Telegram messages, or other channels. To build a full marketing automation system, you need four key components:
- Workflow Automation - using n8n (this workflow)
- CDP - store and manage user data (e.g., NocoDB, Metabase, Power BI, etc.)
- Database - track transactions, templates, and send statuses (e.g., NocoDB)
- BI / Analytics - monitor performance by flows, journeys, and sent events

This workflow represents the Workflow Automation layer. You can connect it to your own data stack or use the included example databases (cdp-ecrm, n8n-templates-ecrm, and n8n-transaction-ecrm) to get started quickly.

Who's it for?
- Growth & CRM teams managing user engagement flows
- Ecommerce marketers running time-sensitive email journeys
- Marketing automation pros using low-code CRM stacks
- Data teams building custom campaign triggers from CDPs

Features
- Two modular flows: "Insert user_id" and "Sending Email"
- Select flow using flow_id from templates in NocoDB
- Insert user data into n8n-transaction-ecrm with processing status
- Filter duplicate users by user_id to avoid over-sending
- Validate email fields and flag disposables
- Send personalized emails using Brevo template parameters
- Track delivery with sent_result, sent_at, and status updates
- Runs every 30 minutes via schedule trigger

How to Use
1. Set your flow: in the Setup Flow node, change the flow_id to match a row in your n8n-templates-ecrm table.
2. Prepare your tables in NocoDB:
   - cdp-ecrm: contains users (user_id, email, first_name, phone_number)
   - n8n-templates-ecrm: contains flows with metadata
   - n8n-transaction-ecrm: stores and updates user send status
3. Configure credentials: NocoDB API Token and Brevo (Sendinblue) API Key.
4. Trigger the flows: run "Insert user_id" manually or on a schedule to prepare users; "Sending Email" runs automatically every 30 minutes.

Notes
- Disposable email domains are filtered using regex (see the sketch after this list)
- Status values: 0-processing (just inserted), 1-sending (ready to send), 2-sent (email sent successfully), 3-no-email (missing email address), 4-disposal-email (disposable or banned email)
- Easily duplicate the "Insert user_id" flow to add more campaigns
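A minimal sketch of the email validation step described in the Notes, assuming it runs as a Code node before the Brevo send; the disposable-domain pattern and field names are illustrative and the template's own regex may differ.

```javascript
// Sketch of the email validation step, assuming a Code node before the Brevo send.
// The disposable-domain list and field names are illustrative, not the template's exact ones.
const DISPOSABLE_REGEX = /@(mailinator|tempmail|guerrillamail|10minutemail)\./i;

function classify(user) {
  if (!user.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(user.email)) {
    return { ...user, status: '3-no-email' };
  }
  if (DISPOSABLE_REGEX.test(user.email)) {
    return { ...user, status: '4-disposal-email' };
  }
  return { ...user, status: '1-sending' }; // ready for the Brevo send node
}

// Rows inserted by "Insert user_id" start as 0-processing and are updated here
console.log(classify({ user_id: 42, email: 'jane@tempmail.net' })); // → 4-disposal-email
```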
by MRJ
:car: Business Value Proposition
Accelerates ISO 26262 compliance for automotive/industrial systems by automating safety analysis while maintaining rigorous audit standards.

:gear: How It Works
graph TD
  A[Engineer uploads system description] --> B(LLM identifies hazards)
  B --> C(LLM scores risks per ISO 26262)
  C --> D(Generates mitigation strategies)
  D --> E(Produces audit-ready reports)

:chart_with_upwards_trend: Key Benefits
Time
- 50-70% faster than manual HAZOP/FMEA sessions
- Instant report generation vs. weeks of documentation
Risk Mitigation
- Pre-validated templates reduce human error
- Auto-generated traceability simplifies audits

:warning: Governance Controls
- Human-in-the-loop: all LLM outputs require engineer sign-off
- Version tracking: full history of modifications
- Audit mode: export all decision rationales

:computer: Technical Requirements
- Runs on existing n8n instances
- Docker deployment (<1 hr setup)
- Integrates with JAMA/DOORS (optional)

:wrench: Setup and Usage
Prerequisites
- Docker (Install Guide)
- Docker Compose (Install Guide)
- n8n instance (Free Self-Hosted or Cloud - Paid)
- OpenAI API key (Get Key)

Enterprise-ready deployment: when supported by IT infrastructure teams, this solution transforms into a scalable AI safety assistant, providing real-time HARA guidance akin to engineering co-pilot tools.

:arrow_down: Installation and :play_or_pause_button: Running the Workflow
For installation procedures and usage of the workflow, refer to the repository.

:warning: Validation & Limitations
AI-Assisted Analysis Considerations

| Advantage | Mitigation Strategy | Implementation Example |
|-----------|---------------------|------------------------|
| Rapid hazard identification | Human validation layer | Manual review nodes in workflow |
| Consistent S/E/C scoring | Rule-based validation | ASIL-D requires redundancy check |
| Edge case coverage | Cross-reference with historical data | Integration with incident databases |

Critical Validation Steps
AI Output Review node in n8n, for example (by code):
{ "type": "function", "parameters": { "functionCode": "if ($input.item.json.ASIL === 'D' && !$input.item.json.redundancy) throw new Error('ASIL D requires redundancy');" } }

Version Control
- Prompt versions tied to ISO standard editions (e.g., ISO26262:2018-v1.2)
- Git-tracked changes to ai_models/training_data/

Audit trails
Log structure for audit trails:
/logs/
└── YYYY-MM-DD/
    ├── hazards_approved.log
    └── hazards_rejected.log
by Inderjeet Bhambra
Who is this for? This workflow is designed for travel bloggers, content creators, social media managers, and anyone who wants to transform their travel photos into engaging written narratives. It's perfect for travelers looking to create compelling stories from their photo collections without spending hours crafting content manually, families wanting to document memorable trips, and digital nomads who need to produce travel content efficiently. What problem is this workflow solving? Converting travel photos into engaging stories is time-consuming and requires both creative writing skills and the ability to analyze visual content meaningfully. This workflow solves the challenge of: Transforming visual memories into compelling written narratives Organizing photos chronologically to create logical story flow Generating professional-quality travel content without writing expertise Analyzing photo content to extract meaningful themes and emotions Creating day-by-day structured narratives from unorganized photo collections Reducing the time spent on manual content creation for travel documentation What this workflow does This AI-powered photo storyteller takes your travel photos and automatically generates immersive, first-person travel narratives. The workflow: Accepts multiple photos through a webhook endpoint Uses OpenAI Vision API (GPT-4o) to analyze each photo's content, emotions, and themes Automatically organizes photos chronologically by date and timestamp Groups photos by travel days and extracts daily themes Leverages GPT-4.1 (minimum required) to craft engaging, first-person travel stories with creative day titles Generates structured narratives with sensory details, cultural observations, and emotional insights Outputs JSON formatted content ready for formatting Creates day-by-day story structure with memorable moments and reflective conclusions Setup Required Credentials: OpenAI API key configured in n8n for both Vision Analysis and Story Generation nodes Ensure you have sufficient OpenAI credits for image analysis and text generation Webhook Configuration: The workflow creates a webhook endpoint at /tripteller-upload Configure your photo upload interface to POST photos array to this endpoint Photos should be sent as base64 encoded data with filename and metadata Photo Requirements: Supported formats: Standard image formats (JPEG, PNG, etc.) Photos should include timestamp metadata for chronological organization Caution Do not upload all photos at once. Start with a small number of photos, like 5 at a time. How to customize this workflow to your needs Story Style Customization: Modify the system prompt in the "Generate Travel Story" node to adjust writing tone (nostalgic, adventurous, poetic, etc.) Customize the story structure by editing the output format requirements Add specific cultural or geographical context prompts for location-specific storytelling Photo Analysis Enhancement: Adjust the Vision Analysis node prompt to focus on specific elements (architecture, food, people, landscapes) Modify the grouping logic in the "Group Photos by Day" node for different time-based organization Add location extraction from EXIF data for geographical context Output Format Adjustment: Customize the final response structure in the "Format Final Response" node Add integration with publishing platforms (blog APIs, social media, etc.) 
Include additional metadata like location tags, travel duration, or trip statistics Performance Optimization: Adjust the execution timeout based on your typical photo volume Modify the parallel processing approach for large photo collections Add progress tracking for longer processing workflows
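To exercise the /tripteller-upload endpoint described above, here is a hedged example of the POST body and call; the exact field names expected by the workflow's parser are assumptions and should be checked against the webhook node.

```javascript
// Hedged example of posting a small batch to the webhook; exact field names may
// differ from the template's parser and should be checked against the workflow.
const payload = {
  photos: [
    {
      filename: 'IMG_0412.jpg',
      timestamp: '2024-05-03T09:41:00Z',   // used for chronological grouping
      data: '<base64-encoded JPEG bytes>',
    },
    // ...start with ~5 photos per request, as advised above
  ],
};

await fetch('https://your-n8n-instance.com/webhook/tripteller-upload', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload),
});
```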
by Autonomous Work
This workflow exports every table in a base as its own CSV, saves the files in a time-stamped folder in Amazon S3, pings you on Slack, and optionally prunes older copies. You get an automated weekly backup that is easy to inspect or re-import as needed. You can easily swap the S3 node for the storage provider of your choice.

How it works
Weekly Backup
- Schedule trigger fires weekly
- Sets and formats the week label, e.g. [2025-W12] (see the sketch after this list)
- Creates a folder in the S3 bucket named for the week
- Loops through all tables in the Airtable base, creating CSVs and uploading them to the new path
- Slack message is sent on completion
Monthly Prune
- Schedule trigger fires weekly
- Sets a cut-off date 4 weeks in the past
- Lists folders in S3
- Deletes all folders older than 4 weeks

Setup Steps
- Clone the workflow
- Swap credentials for Airtable, AWS, and Slack
- Ensure the AWS credential has an appropriate IAM policy to manage the bucket & objects
- Set the workflow to "Active"
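A small sketch, assuming a Code node, of how the week label and prune cut-off mentioned above could be computed; the folder prefix is illustrative.

```javascript
// Sketch of the two Set/Code steps: building the week label for the S3 prefix
// and the prune cut-off. The folder naming follows the example above.
function isoWeekLabel(date = new Date()) {
  // ISO-8601 week number (Thursday-based), e.g. "2025-W12"
  const d = new Date(Date.UTC(date.getFullYear(), date.getMonth(), date.getDate()));
  d.setUTCDate(d.getUTCDate() + 4 - (d.getUTCDay() || 7));
  const yearStart = new Date(Date.UTC(d.getUTCFullYear(), 0, 1));
  const week = Math.ceil(((d - yearStart) / 86400000 + 1) / 7);
  return `${d.getUTCFullYear()}-W${String(week).padStart(2, '0')}`;
}

const prefix = `backups/${isoWeekLabel()}/`;          // e.g. backups/2025-W12/
const cutoff = new Date(Date.now() - 28 * 86400000);  // prune folders older than 4 weeks
```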
by DataMinex
π Real-Time Flight Data Analytics Bot with Dynamic Chart Generation via Telegram π Template Overview This advanced n8n workflow creates an intelligent Telegram bot that transforms raw CSV flight data into stunning, interactive visualizations. Users can generate professional charts on-demand through a conversational interface, making data analytics accessible to anyone via messaging. Key Innovation: Combines real-time data processing, Chart.js visualization engine, and Telegram's messaging platform to deliver instant business intelligence insights. π― What This Template Does Transform your flight booking data into actionable insights with four powerful visualization types: π Bar Charts**: Top 10 busiest airlines by flight volume π₯§ Pie Charts**: Flight duration distribution (Short/Medium/Long-haul) π© Doughnut Charts**: Price range segmentation with average pricing π Line Charts**: Price trend analysis across flight durations Each chart includes auto-generated insights, percentages, and key business metrics delivered instantly to users' phones. ποΈ Technical Architecture Core Components Telegram Webhook Trigger: Captures user interactions and button clicks Smart Routing Engine: Conditional logic for command detection and chart selection CSV Data Pipeline: File reading β parsing β JSON transformation Chart Generation Engine: JavaScript-powered data processing with Chart.js Image Rendering Service: QuickChart API for high-quality PNG generation Response Delivery: Binary image transmission back to Telegram Data Flow Architecture User Input β Command Detection β CSV Processing β Data Aggregation β Chart Configuration β Image Generation β Telegram Delivery π οΈ Setup Requirements Prerequisites n8n instance** (self-hosted or cloud) Telegram Bot Token** from @BotFather CSV dataset** with flight information Internet connectivity** for QuickChart API Dataset Source This template uses the Airlines Flights Data dataset from GitHub: π Dataset: Airlines Flights Data by Rohit Grewal Required Data Schema Your CSV file should contain these columns: airline,flight,source_city,departure_time,arrival_time,duration,price,class,destination_city,stops File Structure /data/ βββ flights.csv (download from GitHub dataset above) βοΈ Configuration Steps 1. Telegram Bot Setup Create a new bot via @BotFather on Telegram Copy your bot token Configure the Telegram Trigger node with your token Set webhook URL in your n8n instance 2. Data Preparation Download the dataset from Airlines Flights Data Upload the CSV file to /data/flights.csv in your n8n instance Ensure UTF-8 encoding Verify column headers match the dataset schema Test file accessibility from n8n 3. 
Workflow Activation Import the workflow JSON Configure all Telegram nodes with your bot token Test the /start command Activate the workflow π§ Technical Implementation Details Chart Generation Process Bar Chart Logic: // Aggregate airline counts const airlineCounts = {}; flights.forEach(flight => { const airline = flight.airline || 'Unknown'; airlineCounts[airline] = (airlineCounts[airline] || 0) + 1; }); // Generate Chart.js configuration const chartConfig = { type: 'bar', data: { labels, datasets }, options: { responsive: true, plugins: {...} } }; Dynamic Color Schemes: Bar Charts: Professional blue gradient palette Pie Charts: Duration-based color coding (lightβdark blue) Doughnut Charts: Price-tier specific colors (greenβpurple) Line Charts: Trend-focused red gradient with smooth curves Performance Optimizations Efficient Data Processing: Single-pass aggregations with O(n) complexity Smart Caching: QuickChart handles image caching automatically Minimal Memory Usage: Stream processing for large datasets Error Handling: Graceful fallbacks for missing data fields Advanced Features Auto-Generated Insights: Statistical calculations (percentages, averages, totals) Trend analysis and pattern detection Business intelligence summaries Contextual recommendations User Experience Enhancements: Reply keyboards for easy navigation Visual progress indicators Error recovery mechanisms Mobile-optimized chart dimensions (800x600px) π Use Cases & Business Applications Airlines & Travel Companies Fleet Analysis**: Monitor airline performance and market share Pricing Strategy**: Analyze competitor pricing across routes Operational Insights**: Track duration patterns and efficiency Data Analytics Teams Self-Service BI**: Enable non-technical users to generate reports Mobile Dashboards**: Access insights anywhere via Telegram Rapid Prototyping**: Quick data exploration without complex tools Business Intelligence Executive Reporting**: Instant charts for presentations Market Research**: Compare industry trends and benchmarks Performance Monitoring**: Track KPIs in real-time π¨ Customization Options Adding New Chart Types Create new Switch condition Add corresponding data processing node Configure Chart.js options Update user interface menu Data Source Extensions Replace CSV with database connections Add real-time API integrations Implement data refresh mechanisms Support multiple file formats Visual Customizations // Custom color palette backgroundColor: ['#your-colors'], // Advanced styling borderRadius: 8, borderSkipped: false, // Animation effects animation: { duration: 2000, easing: 'easeInOutQuart' } π Security & Best Practices Data Protection Validate CSV input format Sanitize user inputs Implement rate limiting Secure file access permissions Error Handling Graceful degradation for API failures User-friendly error messages Automatic retry mechanisms Comprehensive logging π Expected Outputs Sample Generated Insights "βοΈ Vistara leads with 350+ flights, capturing 23.4% market share" "π Long-haul flights dominate at 61.1% of total bookings" "π° Budget category (βΉ0-10K) represents 47.5% of all bookings" "π Average prices peak at βΉ14K for 6-8 hour duration flights" Performance Metrics Response Time**: <3 seconds for chart generation Image Quality**: 800x600px high-resolution PNG Data Capacity**: Handles 10K+ records efficiently Concurrent Users**: Scales with n8n instance capacity π Getting Started Download the workflow JSON Import into your n8n instance Configure Telegram bot credentials Upload your 
flight data CSV Test with the /start command Deploy and share with your team Pro Tips: Data Quality: clean data produces better insights; Mobile First: charts are optimized for mobile viewing; Batch Processing: handles large datasets efficiently; Extensible Design: easy to add new visualization types. Ready to transform your data into actionable insights? Import this template and start generating professional charts in minutes!
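For reference, the Chart.js-to-PNG step described under the technical implementation above can be sketched as a single QuickChart call; the labels and data below are illustrative only.

```javascript
// Sketch of turning a Chart.js config (like the bar-chart logic above) into a PNG
// via QuickChart, sized for the 800x600 Telegram images mentioned earlier.
const chartConfig = {
  type: 'bar',
  data: { labels: ['Vistara', 'Air India'], datasets: [{ label: 'Flights', data: [350, 310] }] },
};

const url = 'https://quickchart.io/chart'
  + '?width=800&height=600'
  + '&c=' + encodeURIComponent(JSON.stringify(chartConfig));

// In n8n this would typically be an HTTP Request node downloading the PNG as binary
const png = await fetch(url).then(r => r.arrayBuffer());
```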
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automate Content Publishing to TikTok, YouTube, Instagram, Facebook via Blotato

Who is this for?
This workflow is perfect for:
- Content creators who post daily to multiple platforms
- Marketing teams managing brand presence across channels
- Solo entrepreneurs and social media managers looking to scale their output
- Anyone tired of uploading content manually across apps

What problem is this solving?
Managing content across platforms is time-consuming. You need to track posts per platform, upload videos manually, adapt captions and posting times, and avoid repetitive mistakes. This workflow solves all of that by centralizing everything in one place (Google Sheets) and automating it via Blotato.

What this workflow does
Every hour, this workflow will:
1. Check your Google Sheet for any post marked as "TO GO"
2. Select one item at a time (avoids spam and overposting)
3. Extract media from a shared Google Drive link
4. Upload the media to Blotato
5. Publish it automatically to TikTok, YouTube Shorts, Instagram, and Facebook
6. Update the post status in your Sheet to "Posted"

Setup
Before running this template, make sure you have:
- A Blotato account (Pro plan required for API key)
- Generated your Blotato API key (Settings > API > Generate)
- Enabled Verified Community Nodes in the n8n Admin Panel
- Installed the Blotato node via the community nodes list
- Created a Blotato credential in n8n using your API key
- Made sure your media folder in Google Drive is set to "Anyone with the link can view"
- Followed the 3 setup steps in the brown sticky notes inside the workflow

How to customize this workflow
- Add new platform nodes (LinkedIn, Threads, Pinterest, etc.) using Blotato
- Adjust the scheduling frequency from hourly to daily or weekly
- Add an approval layer (Slack/Telegram) before publishing
- Customize your captions dynamically using GPT or formulas in Sheets
- Use tags, categories, or campaign tracking for analytics

Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
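As a rough illustration of the Google Sheet the workflow reads, one row might look like the object below; the column names are assumptions and must match whatever your Sheets nodes actually reference.

```javascript
// Illustrative shape of one Google Sheet row the workflow expects; column names
// are assumptions, not the template's exact schema.
const exampleRow = {
  status: 'TO GO',                 // switched to 'Posted' after publishing
  caption: 'Behind the scenes of today\'s shoot',
  media_url: 'https://drive.google.com/file/d/<file-id>/view', // "anyone with the link"
  platforms: 'tiktok,youtube,instagram,facebook',
};
```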
by Mihai Farcas
Chat with local LLMs using n8n and Ollama

This n8n workflow allows you to seamlessly interact with your self-hosted Large Language Models (LLMs) through a user-friendly chat interface. By connecting to Ollama, a powerful tool for managing local LLMs, you can send prompts and receive AI-generated responses directly within n8n.

Use cases
- Private AI Interactions: ideal for scenarios where data privacy and confidentiality are important.
- Cost-Effective LLM Usage: avoid ongoing cloud API costs by running models on your own hardware.
- Experimentation & Learning: a great way to explore and experiment with different LLMs in a local, controlled environment.
- Prototyping & Development: build and test AI-powered applications without relying on external services.

How it works
- When chat message received: captures the user's input from the chat interface.
- Chat LLM Chain: sends the input to the Ollama server, receives the AI-generated response, and delivers it back to the chat interface.

Set up steps
- Make sure Ollama is installed and running on your machine before executing this workflow.
- Edit the Ollama address if different from the default.
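A quick way to verify the Ollama server before pointing the Chat LLM Chain at it is a direct call to its HTTP API; http://localhost:11434 is Ollama's default address, and the model name below is only an example of one you have already pulled.

```javascript
// Quick check that the Ollama server the Chat LLM Chain points at is reachable.
// http://localhost:11434 is Ollama's default address; adjust it if yours differs.
const response = await fetch('http://localhost:11434/api/generate', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    model: 'llama3',          // any model you have pulled with `ollama pull`
    prompt: 'Say hello in one sentence.',
    stream: false,            // return a single JSON object instead of a stream
  }),
});
console.log((await response.json()).response);
```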
by Alexey from Mingles.ai
AI Image Generator & Editor with GPT-4 Vision - Complete Workflow Template Description Transform text prompts into stunning images or edit existing visuals using OpenAI's latest GPT-4 Vision model through an intuitive web form interface. This comprehensive n8n automation provides three powerful image generation modes: π¨ Text-to-Image Generation Simply enter a descriptive prompt and generate high-quality images from scratch using OpenAI's gpt-image-1 model. Perfect for creating original artwork, concepts, or visual content. πΌοΈ Image-to-Image Editing Upload an existing image file and transform it based on your text prompt. The AI analyzes your input image and applies modifications while maintaining the original structure and context. π URL-Based Image Editing Provide a direct URL to any online image and edit it with AI. Great for quick modifications of web images or collaborative workflows. Key Features Smart Input Processing Flexible Form Interface**: User-friendly web form with authentication Multiple Input Methods**: File upload, URL input, or text-only generation Quality Control**: Selectable quality levels (low, medium, high) Format Support**: Accepts PNG, JPG, and JPEG formats Advanced AI Integration Latest GPT-4 Vision Model**: Uses gpt-image-1 for superior results Intelligent Switching**: Automatically detects input type and routes accordingly Context-Aware Editing**: Maintains image coherence during modifications Customizable Parameters**: Control size (1024x1024), quality, and generation settings Dual Storage Options Google Drive Integration**: Automatic upload with public sharing permissions ImgBB Hosting**: Alternative cloud storage for instant public URLs File Management**: Organized storage with timestamp-based naming Instant Telegram Delivery Real-time Notifications**: Results sent directly to your Telegram chat Rich Media Messages**: Includes generated image with prompt details Quick Access Links**: Direct links to view and download results Markdown Formatting**: Clean, professional message presentation Technical Workflow Form Submission β User submits prompt and optional image Smart Routing β System detects input type (text/file/URL) AI Processing β OpenAI generates or edits image based on mode Binary Conversion β Converts base64 response to downloadable file Cloud Upload β Stores in Google Drive or ImgBB with public access Telegram Delivery β Sends result with viewing links and metadata Perfect For Content Creators**: Generate unique visuals for social media and marketing Designers**: Quick concept development and image variations Developers**: Automated image processing for applications Teams**: Collaborative image editing and sharing workflows Personal Use**: Transform ideas into visual content effortlessly Setup Requirements OpenAI API Key**: Access to GPT-4 Vision model Google Drive API** (optional): For Google Drive storage ImgBB API Key** (optional): For alternative image hosting Telegram Bot**: For result delivery Basic Auth Credentials**: For form security What You Get β Complete image generation and editing pipeline β Secure web form with authentication β Dual cloud storage options β Instant Telegram notifications β Professional result formatting β Flexible input methods β Quality control settings β Automated file management Start creating AI-powered images in minutes with this production-ready template! Tags: #AI #ImageGeneration #OpenAI #GPT4 #ImageEditing #Telegram #GoogleDrive #Automation #ComputerVision #ContentCreation
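For orientation, the text-to-image mode described above roughly corresponds to a call like the following against the OpenAI Images API; the exact parameters the workflow sends (and its error handling) may differ.

```javascript
// Sketch of the text-to-image call behind the workflow, assuming the OpenAI
// Images API; gpt-image-1 returns base64 data that the workflow converts to a file.
const res = await fetch('https://api.openai.com/v1/images/generations', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-image-1',
    prompt: 'A watercolor lighthouse at dawn',
    size: '1024x1024',
    quality: 'medium',        // low | medium | high, as exposed in the form
  }),
});
const { data } = await res.json();
const base64Png = data[0].b64_json; // converted to binary before the Drive/ImgBB upload
```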
by Jitesh Dugar
Transform chaotic employee departures into secure, insightful offboarding experiences - achieving zero security breaches, 100% equipment recovery, and actionable retention insights from every exit interview. What This Workflow Does Revolutionizes employee offboarding with AI-driven exit interview analysis and automated task orchestration: π Exit Interview Capture - Jotform collects resignation details, ratings, feedback, and equipment inventory π€ AI Sentiment Analysis - Advanced AI analyzes exit interviews for retention insights, red flags, and patterns β οΈ Red Flag Detection - Automatically identifies serious issues (harassment, discrimination, ethics) for immediate escalation π€ Manager Intelligence - Flags management issues and provides coaching recommendations π Access Revocation - Schedules automatic system access removal on last working day π¦ Equipment Tracking - Generates personalized equipment return checklist for each employee π Knowledge Transfer - Assesses knowledge transfer risk and creates handover plan π° Retention Analytics - Identifies preventable departures and competitive intelligence π§ Automated Notifications - Sends checklists to employees, action items to managers, IT requests π Boomerang Prediction - Calculates likelihood of rehire and maintains alumni relationships Key Features AI Exit Interview Analysis: GPT-4 provides 2+ analytical dimensions including sentiment, preventability, and red flags Preventability Scoring: AI calculates 0-100% score on whether departure was preventable Red Flag Escalation: Automatic detection of harassment, discrimination, ethics, or legal concerns Manager Performance Insights: Identifies management issues requiring coaching or intervention Sentiment Analysis: Analyzes tone, emotions, and overall sentiment from qualitative feedback Competitive Intelligence: Tracks where employees go and what competitors offer Knowledge Transfer Risk Assessment: Evaluates complexity and criticality of knowledge handover Boomerang Probability: Predicts likelihood (0-100%) of employee returning in future Department Trend Analysis: Identifies systemic issues in specific teams or departments Compensation Benchmarking: Flags compensation competitiveness issues Retention Recommendations: AI-generated actionable improvements prioritized by impact Equipment Tracking: Automatic inventory of laptops, phones, cards, and other company property Perfect For Growing Companies: 50-5,000 employees with monthly turnover requiring structured offboarding Tech Companies: Protecting IP and system access with departing engineers and developers Healthcare Organizations: Compliance-critical offboarding with HIPAA and patient data access Financial Services: Regulated industries requiring audit trails and secure access revocation Professional Services: Knowledge-intensive businesses where brain drain is costly Retail & Hospitality: High-turnover environments needing efficient, consistent offboarding Remote-First Companies: Distributed teams requiring coordinated equipment recovery What You'll Need Required Integrations Jotform - Exit interview and resignation form (free tier works) Create your form for free on Jotform using this link OpenAI API - GPT-4 for AI exit interview analysis (~$0.20-0.50 per exit interview) Gmail - Automated notifications to employees, managers, IT, and HR Google Sheets - Exit interview database and retention analytics Quick Start Import Template - Copy JSON and import into n8n Add OpenAI Credentials - Set up OpenAI API key (GPT-4 for best insights) 
Create Jotform Exit Interview - Build comprehensive form with these sections: Configure Gmail - Add Gmail OAuth2 credentials Setup Google Sheets: Create spreadsheet with "Exit_Interviews" sheet Replace YOUR_GOOGLE_SHEET_ID in workflow Columns will auto-populate on first submission Customization Options AI Prompt Refinement: Tailor analysis for your industry, company culture, and specific concerns Red Flag Categories: Customize what constitutes a red flag for your organization Equipment Types: Add specialized equipment (tools, uniforms, parking passes) Access Systems: Integrate with your specific IT systems for automated revocation Knowledge Transfer Templates: Create role-specific handover checklists Manager Notifications: Add more details based on department or seniority Exit Interview Questions: Add industry-specific or company-specific questions Retention Focus Areas: Adjust AI to focus on specific retention priorities Rehire Workflows: Add automatic alumni network invitations for boomerang candidates Severance Processing: Add nodes for severance agreement generation and tracking Reference Check Process: Include reference policy notifications Benefits COBRA: Automate COBRA benefits notification workflows Expected Results Zero security breaches from lingering access - automated revocation on last day 100% equipment recovery - automated tracking and follow-up 3x faster offboarding - 30 minutes vs 2 hours of manual coordination 85% actionable insights from exit interviews vs 20% with manual reviews 60% improvement in identifying preventable turnover 90% manager compliance with knowledge transfer (vs 40% manual) 50% reduction in repeat management issues through coaching identification 40% increase in boomerang rehires through positive offboarding experience Complete audit trail for legal compliance and investigations Department-level insights identifying systemic retention issues Use Cases Tech Startup (100 Employees, High Growth) Engineer resigns to join competitor. AI detects compensation issue (40% below market), flags manager micromanagement concerns, and identifies preventable departure (preventability: 85%). HR immediately initiates compensation review for engineering team, schedules manager coaching, and retains 3 other engineers considering leaving. Access to codebase revoked automatically on last day. Boomerang probability: 70% - maintains relationship for future recruiting. Healthcare System (500 Nurses) Nurse leaves citing burnout. AI identifies systemic staffing issues in ER department affecting 15% of departures. Flags potential HIPAA violation concern requiring investigation. Automatically revokes EHR access on final day. Equipment recovery (badge, pager, scrubs) tracked with 100% success. Exit insights lead to ER staffing model changes, reducing nurse turnover by 30%. Financial Services Firm Compliance officer departs. AI red flags potential ethics concern requiring immediate investigation. Legal team notified within minutes. Knowledge transfer flagged as "critical risk" due to regulatory expertise. Detailed 30-day handover plan auto-generated. All system access revoked immediately. Complete audit trail maintained for regulatory review. Investigation reveals process gap, not ethical issue, preventing regulatory exposure. Retail Chain (2,000 Employees) Store manager exits. AI aggregates insights across 50 recent retail departures, identifying district manager as common thread (manager rating consistently 2/5). Regional HR intervenes with district manager coaching. 
Equipment return (keys, registers codes, uniforms) automated via checklist. 95% equipment recovery vs previous 60%. Sentiment trends show seasonal staff prefer flexible scheduling - policy updated chain-wide. Remote Software Company Developer in different timezone resigns. Automated offboarding coordinates across 3 time zones: access revoked at EOD local time, equipment return label emailed internationally, knowledge transfer scheduled with overlap hours. AI detects "career growth" as preventable issue - company implements career ladder framework, reducing senior developer attrition by 45%. Pro Tips Timing Matters: Send Jotform link 1 week before last day for honest feedback (not on exit day) Anonymity Option: Consider anonymous feedback for more candid responses (separate form) Benchmark Scoring: After 50+ exits, calculate your company's average preventability score Manager Patterns: Track exits by manager to identify coaching needs early Department Trends: Monthly reviews of AI insights by department for systemic issues Compensation Data: Cross-reference "compensation issue" flags with market data Boomerang Program: Create formal alumni network for high-probability boomerang candidates Equipment Deposits: Consider requiring deposits for easier equipment recovery Exit Interview Training: Train managers on how to act on AI insights constructively Legal Review: Have legal team review red flag escalation categories quarterly Continuous Improvement: Use AI recommendations to create quarterly retention action plans Stay Interviews: Use exit interview insights to inform "stay interview" questions for current employees Learning Resources This workflow demonstrates advanced n8n automation patterns: AI Agents with complex structured output for multi-dimensional analysis, Sentiment analysis and natural language processing Conditional escalation based on severity and red flags Multi-stakeholder notifications with role-specific messaging Risk assessment algorithms for knowledge transfer and preventability Pattern recognition across qualitative feedback Equipment inventory management with dynamic list generation Compliance automation for access revocation scheduling Predictive analytics for boomerang probability Perfect for learning AI-powered HR automation and organizational analytics! π Workflow Architecture π Jotform Exit Interview Submission β π§Ύ Parse Offboarding Data β π€ AI Exit Interview Analysis (GPT-4) β ββ Retention analysis (preventability scoring) β ββ Sentiment analysis (tone, emotions) β ββ Manager performance evaluation β ββ Department insights β ββ Compensation benchmarking β ββ Knowledge transfer risk assessment β ββ Competitor intelligence β ββ Red flag detection β ββ Boomerang probability β ββ Action item generation β π Extract AI Analysis (JSON) β π§© Merge Exit Analysis with Data β ββ Calculate days until last day β ββ Build equipment checklist β ββ Assess urgency levels β β οΈ Has Red Flags? ββ TRUE β π¨ Send Red Flag Alert (HR Director/Legal) β β ββ FALSE β π§ Send Manager Action Items β βοΈ Send Employee Checklist β π Send IT Offboarding Request β π Log to Google Sheets Ready to transform employee offboarding? Import this template and turn departures into retention insights while maintaining security and professionalism. Every exit becomes a learning opportunity! πͺβ¨ Questions or customization needs? The workflow includes detailed sticky notes explaining each AI analysis component and routing decision.
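To make the structured-output pattern mentioned under Learning Resources concrete, the AI analysis step could be asked to return JSON shaped like the sketch below; the field names are assumptions drawn from the dimensions listed above, not the template's actual schema.

```javascript
// Illustrative shape of the structured output the AI exit-interview analysis could
// return; field names are assumptions based on the dimensions listed above.
const exampleExitAnalysis = {
  sentiment: 'negative',
  preventability_score: 85,        // 0-100, how preventable the departure was
  red_flags: [],                   // e.g. ['harassment', 'ethics'] triggers escalation
  manager_issues: ['micromanagement'],
  knowledge_transfer_risk: 'high',
  compensation_gap_flag: true,
  boomerang_probability: 70,       // 0-100 likelihood of rehire
  recommended_actions: ['Compensation review for engineering', 'Manager coaching'],
};
```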
by JKingma
PDF-to-Order Automation for Magento 2 (Adobe Commerce / Open Source)

Description
This n8n template demonstrates how to automatically process PDF purchase orders received via email and convert them into sales orders in Adobe Commerce (Magento 2) using Company Credit as the payment method. This is especially useful for B2B companies receiving structured orders from clients by email.

Use cases include:
- Automating incoming B2B orders
- Reducing manual entry for sales teams
- Ensuring fast order creation in Adobe Commerce
- Reliable error handling and customer validation

Good to know
- This workflow is tailored for Adobe Commerce, using the Company Credit payment method.
- It requires that the customer already has an account in Adobe Commerce and is authorized to use Company Credit.
- The same flow can be easily adapted for other payment methods (e.g. Purchase Order, Bank Transfer).
- No third-party services are required aside from n8n and access to Adobe Commerce with API credentials.

How it works
1. Trigger - monitors an email inbox for incoming emails with PDF attachments.
2. Extract PDF - downloads the attached PDF and parses order data (e.g. SKU, quantity, customer reference).
3. Validate Customer - checks if the sender matches an existing customer in Adobe Commerce and verifies Company Credit eligibility.
4. Create Order - generates a new order in Magento using the extracted product data and Company Credit.
5. Handle Errors - logs issues and can notify a designated channel (email, Slack, etc.) if something goes wrong.
6. Optional Enhancements - add logging to Airtable, send confirmations to customers, or attach parsed order data to CRM entries.

How to use
- A manual trigger is included as an example, but you can replace it with an IMAP Email Trigger, Gmail Trigger, or Webhook, depending on your setup.
- Customize the PDF parser node to fit your specific document layout and field structure (a parsing sketch follows at the end of this section).
- Configure Adobe Commerce API credentials in the HTTP nodes (or use environment variables).
- Optionally connect error steps to Slack, Email, or dashboards for monitoring.

Requirements
- n8n instance (self-hosted or cloud)
- Adobe Commerce (Magento 2) instance with API access and Company Credit enabled
- Structured PDF templates used by your customers
- (Optional) Slack/Email/Airtable for notifications and logs

Customising this workflow
This workflow can be adapted for:
- Other payment methods (e.g. Purchase Orders, Online Payment); it is Magento Open Source ready, just use your own payment method
- Alternate order sources (e.g. uploading PDFs via a portal instead of email)
- Parsing other document formats (e.g. CSV, Excel)
- Direct integration into ERP systems
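As a sketch of the Extract PDF step, assuming line items appear in the extracted text as simple "SKU quantity" pairs, a parser could look like this; adapt the pattern to your customers' actual PDF layout.

```javascript
// Sketch of the "Extract PDF" parsing step, assuming line items appear in the
// extracted text as "SKU  QTY" pairs; adapt the pattern to your PDF layout.
function parseOrderLines(pdfText) {
  const items = [];
  for (const match of pdfText.matchAll(/^([A-Z0-9-]+)\s+(\d+)\s*$/gm)) {
    items.push({ sku: match[1], qty: Number(match[2]) });
  }
  return items;
}

console.log(parseOrderLines('WIDGET-100  5\nBRACKET-20  12'));
// → [ { sku: 'WIDGET-100', qty: 5 }, { sku: 'BRACKET-20', qty: 12 } ]
```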
by Dariusz Koryto
Google Drive to FTP Transfer Workflow - Setup Guide Overview This n8n workflow automatically transfers files from Google Drive to an FTP server on a scheduled basis. It includes comprehensive logging, email notifications, and error handling. Features Automated Scheduling**: Runs every 6 hours (customizable) Manual Trigger**: Webhook endpoint for on-demand transfers File Filtering**: Supports specific file types and size limits Comprehensive Logging**: Detailed transfer reports saved to Google Drive Email Notifications**: HTML reports sent after each run Error Handling**: Graceful handling of failed transfers Batch Processing**: Files processed individually to prevent rate limits Prerequisites Before setting up this workflow, ensure you have: n8n instance running (self-hosted or cloud) Google Drive account with files to transfer FTP server with upload permissions Email service for sending reports (SMTP) Step-by-Step Setup Instructions 1. Google Drive API Setup 1.1 Create Google Cloud Project Go to Google Cloud Console Create a new project or select existing one Enable the Google Drive API: Navigate to "APIs & Services" β "Library" Search for "Google Drive API" Click "Enable" 1.2 Create OAuth2 Credentials Go to "APIs & Services" β "Credentials" Click "Create Credentials" β "OAuth client ID" Configure consent screen if prompted Choose "Web application" as application type Add your n8n instance URL to authorized redirect URIs: https://your-n8n-instance.com/rest/oauth2-credential/callback Note down the Client ID and Client Secret 1.3 Configure n8n Credential In n8n, go to "Credentials" β "Add Credential" Select "Google Drive OAuth2 API" Enter your Client ID and Client Secret Complete OAuth flow by clicking "Connect my account" Set credential ID as: your-google-drive-credentials-id 2. FTP Server Setup 2.1 FTP Server Requirements Ensure FTP server is accessible from your n8n instance Verify you have upload permissions Note the server details: Host/IP address Port (usually 21 for FTP) Username and password Destination directory path 2.2 Configure n8n FTP Credential In n8n, go to "Credentials" β "Add Credential" Select "FTP" Enter your FTP server details: Host: your-ftp-server.com Port: 21 (or your custom port) Username: your-ftp-username Password: your-ftp-password Set credential ID as: your-ftp-credentials-id 3. Email Setup (SMTP) 3.1 Choose Email Provider Configure SMTP settings for one of these providers: Gmail**: smtp.gmail.com, port 587, use App Password Outlook**: smtp-mail.outlook.com, port 587 Custom SMTP**: Your organization's SMTP server 3.2 Configure n8n Email Credential In n8n, go to "Credentials" β "Add Credential" Select "SMTP" Enter your SMTP details: Host: smtp.gmail.com (or your provider) Port: 587 Security: STARTTLS Username: your-email@example.com Password: your-app-password Set credential ID as: your-email-credentials-id 4. 
Workflow Configuration 4.1 Import Workflow Copy the workflow JSON from the artifact above In n8n, click "Import from JSON" Paste the workflow JSON and import 4.2 Update Credential References Google Drive nodes: Verify credential ID matches your-google-drive-credentials-id FTP node: Verify credential ID matches your-ftp-credentials-id Email node: Verify credential ID matches your-email-credentials-id 4.3 Customize Parameters FTP Server Settings (Upload to FTP node) { "host": "your-ftp-server.com", // Replace with your FTP host "username": "your-ftp-username", // Replace with your FTP username "password": "your-ftp-password", // Replace with your FTP password "path": "/remote/directory/{{ $json.validFiles[$json.batchIndex].name }}", // Update destination path "port": 21 // Change if using different port } Email Settings (Send Report Email node) { "sendTo": "admin@yourcompany.com", // Replace with your email address "subject": "Google Drive to FTP File Transfer - Report" } File Filter Settings (Filter & Validate Files node) In the JavaScript code, update these settings: const transferNotes = { settings: { maxFileSizeMB: 50, // Change maximum file size allowedExtensions: [ // Add/remove allowed file types '.pdf', '.doc', '.docx', '.txt', '.jpg', '.png', '.zip', '.xlsx' ], autoDeleteAfterTransfer: false, // Set to true to delete from Drive after transfer verifyTransfer: true // Keep true for verification } }; Google Drive Notes Storage (Upload Notes to Drive node) { "parents": { "parentId": "your-notes-folder-id" // Replace with actual folder ID from Google Drive } } 5. Schedule Configuration 5.1 Modify Schedule Trigger In the "Schedule Trigger" node, adjust the interval: { "rule": { "interval": [ { "field": "hours", "hoursInterval": 6 // Change to desired interval (hours) } ] } } Alternative schedule options: Daily**: "field": "days", "daysInterval": 1 Weekly**: "field": "weeks", "weeksInterval": 1 Custom cron**: Use cron expression for complex schedules 5.2 Webhook Configuration The webhook trigger is available at: POST https://your-n8n-instance.com/webhook/webhook-transfer-status Use this for manual triggers or external integrations. 6. Testing and Validation 6.1 Test Connections Test Google Drive: Run "Get Drive Files" node manually Test FTP: Upload a test file using "Upload to FTP" node Test Email: Send a test email using "Send Report Email" node 6.2 Run Test Transfer Activate the workflow Click "Execute Workflow" to run manually Monitor execution in the workflow editor Check for any error messages or failed nodes 6.3 Verify Results FTP Server**: Confirm files appear in destination directory Email**: Check you receive the transfer report Google Drive**: Verify transfer notes are saved to specified folder 7. Monitoring and Maintenance 7.1 Workflow Monitoring Execution History**: Review past runs in n8n interface Error Logs**: Check failed executions for issues Performance**: Monitor execution times and resource usage 7.2 Regular Maintenance Credential Renewal**: Google OAuth tokens may need periodic renewal Storage Cleanup**: Consider archiving old transfer notes Performance Tuning**: Adjust batch sizes or schedules based on usage 8. 
Troubleshooting 8.1 Common Issues Google Drive Authentication Errors: Verify OAuth2 credentials are correctly configured Check if Google Drive API is enabled Ensure redirect URI matches n8n instance URL FTP Connection Failures: Verify FTP server credentials and connectivity Check firewall settings allow FTP connections Confirm destination directory exists and has write permissions Email Delivery Issues: Verify SMTP credentials and server settings Check if email provider requires app-specific passwords Ensure sender email is authorized File Transfer Failures: Check file size limits in filter settings Verify allowed file extensions include your file types Monitor FTP server disk space 8.2 Debug Mode Enable debug mode by: Adding console.log statements in code nodes Using "Execute Workflow" with step-by-step execution Checking node outputs for data validation 9. Advanced Customizations 9.1 Additional File Filters Add custom filtering logic in the "Filter & Validate Files" node: // Example: Filter by modification date const isRecentFile = new Date(file.modifiedTime) > new Date(Date.now() - 7 * 24 * 60 * 60 * 1000); // Last 7 days // Example: Filter by folder location const isInSpecificFolder = file.parents && file.parents.includes('specific-folder-id'); 9.2 Enhanced Reporting Customize the email report template in "Send Report Email" node: π File Transfer Report Summary Date: {{ new Date().toLocaleString('en-US') }} Success Rate: {{ Math.round((successfulTransfers / totalFiles) * 100) }}% 9.3 Integration with Other Services Add nodes to integrate with: Slack**: Send notifications to team channels Discord**: Post updates to Discord servers Webhook**: Trigger other workflows or systems Database**: Log transfers to MySQL, PostgreSQL, etc. 10. Security Considerations 10.1 Credential Security Use environment variables for sensitive data Regularly rotate FTP and email passwords Implement least-privilege access for service accounts 10.2 Network Security Use SFTP instead of FTP when possible Implement VPN connections for sensitive transfers Monitor network traffic for unusual patterns 10.3 Data Privacy Ensure compliance with data protection regulations Implement data retention policies for transfer logs Consider encryption for sensitive file transfers Support and Resources Documentation Links n8n Documentation Google Drive API Documentation n8n Community Forum Getting Help If you encounter issues: Check the troubleshooting section above Review n8n execution logs for error details Search the n8n community forum for similar issues Create a support ticket with detailed error information Note: Replace all placeholder values (URLs, credentials, IDs) with your actual configuration before running the workflow.