by WeblineIndia
# Incident Reporting & Management Workflow (Form + Google Sheets + Email)

This workflow automates incident reporting and management for operations teams by connecting a public reporting form with real-time logging in Google Sheets and instant alert emails to your support team. Whenever an incident is reported via the n8n form/webhook, all details are saved securely and immediately, and the right people are notified the moment issues occur. It's a fast, scalable solution for reliable incident handling.

## Who's It For

- Renewable energy operators (solar/wind/green energy).
- Facility and plant managers.
- Environmental, EHS and safety teams.
- Technical support and incident response crews.
- Maintenance & field operations teams.
- Anyone aiming for compliance-ready, audit-friendly digital issue reporting.

## How It Works

1. **Form Submission**: An n8n-powered form (or webhook endpoint) receives incident reports, capturing all key details: reporter info, contact, location, date/time, type, severity, actions taken, photos and more.
2. **Log to Google Sheets**: Each report is instantly appended as a new row in a secure Google Sheet, creating a searchable, timestamped audit trail.
3. **Email Alert (Gmail)**: An automatic email with the incident summary and critical details lands in the support team's inbox seconds after submission, ensuring your response is always prompt.
4. **Workflow Automation**: These nodes are linked in n8n, enabling no-code/low-code back-end automation for complete visibility and control.

## How to Set Up

1. **Import Workflow**: In n8n, use "Import from File" to upload the workflow JSON provided.
2. **Edit Configuration**:
   - Update form fields as needed (label, validations, options for severity/category).
   - Enter your Google Sheet ID and sharing settings.
   - Configure Gmail/SMTP credentials and the recipient address (example: supportteam@mailinator.com or your own team).
3. **Deploy Webhook**: Copy your n8n webhook URL and connect it to your reporting interface (form, app, device, etc.).
4. **Activate**: Enable the workflow in n8n. Submissions are now handled in real time.
5. **Test**: Submit a sample incident to make sure data logs in Google Sheets and the alert email arrives as expected.

## Requirements

| Tool | Purpose |
|-----------------|-----------------------------------|
| n8n Instance | Orchestrates the workflow |
| Google Account | To access/use Google Sheets |
| Gmail/SMTP | For sending incident alerts |
| Incident Source | n8n Form, webhook, app or device |

## How to Customize

- **Form Fields**: Add/remove fields or validations in the n8n form for organization-specific needs (e.g., add photos, custom categories).
- **Alert Routing**: Use IF nodes to send critical alerts via Slack, SMS or escalate to on-call teams based on severity/type (see the sketch below).
- **Backend**: Replace Google Sheets with Notion, Airtable, PostgreSQL or other databases.
- **Reporting**: Add PDF nodes to auto-generate and send report summaries.
- **Integrations**: Push incidents to ticketing, asset tracking or calendar scheduling workflows.
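For illustration, here is a Function-node-style sketch of the severity check that could drive the alert-routing IF node. The field names (severity, type) and the critical values are assumptions based on the form fields above, not the template's exact logic.

```javascript
// Hypothetical Code-node sketch: flag incidents that should take the
// critical-alert branch. Field names and values are assumptions.
const CRITICAL_SEVERITIES = ['critical', 'high'];
const CRITICAL_TYPES = ['safety', 'outage'];

return items.map((item) => {
  const report = item.json;
  const severity = String(report.severity || '').toLowerCase();
  const type = String(report.type || '').toLowerCase();

  // The downstream IF node can route on this boolean.
  report.isCritical =
    CRITICAL_SEVERITIES.includes(severity) || CRITICAL_TYPES.includes(type);

  return item;
});
```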
## Add-Ons (Optional Extensions)

| Feature | Description |
|------------------|------------------------------------------------------|
| Slack Alerts | Instantly notify Slack channels on critical issues |
| Database Logging | Store reports in SQL/NoSQL systems |
| PDF Generation | Email ready-to-use incident reports as PDFs |
| Calendar Events | Schedule follow-ups or deadline reminders |
| AI Categorization | Auto-classify incidents by severity/type |
| Task Creation | Open tickets in Jira, Trello, ClickUp or Asana |

## Use Case Examples

- Field engineers report solar inverter faults with mobile forms.
- Security personnel log site intrusions, with photos and severity.
- IoT sensors auto-post equipment failures as incidents.
- Compliance or EHS teams capture safety observations in real time.
- Technicians submit maintenance or post-repair issues instantly.

## Common Troubleshooting

| Issue | Possible Cause | Solution |
|--------------------------|---------------------------------------|----------------------------------------------------------|
| Form not triggering | Webhook URL incorrect | Double-check webhook endpoint and method (POST) |
| Email not delivered | Wrong SMTP/Gmail credentials | Re-enter credentials or verify SMTP access |
| Google Sheets not updated | Sheets ID wrong/missing permissions | Use correct ID, share sheet with service account or make accessible |
| Missing report fields in log | Form field names or types mismatched | Align JSON/form data keys with workflow node mappings |
| Attachments not saved | Field not set for file type | Review form field definitions and adjust as needed |

## Need Help?

Want a tailored setup, advanced automations or powerful add-ons (AI, SLAs, PDF logs, ticketing integration)? Our n8n workflow experts at WeblineIndia are ready to help you engineer seamless incident management for any industry.

👉 Contact WeblineIndia, your automation partner for smart preventive maintenance and calendar-driven ops!
by vinci-king-01
# AI Conference Intelligence & Networking Optimizer with ScrapeGraphAI

> ⚠️ IMPORTANT: This template requires a self-hosted n8n instance with ScrapeGraphAI integration. It cannot be used with n8n Cloud due to web scraping capabilities.

## How it works

This workflow automatically discovers industry conferences and provides AI-powered networking intelligence to maximize your event ROI.

### Key Steps

1. **Scheduled Discovery** - Runs weekly to find new industry conferences from Eventbrite and other sources.
2. **AI-Powered Scraping** - Uses ScrapeGraphAI to extract comprehensive conference information including speakers, agenda, and networking opportunities.
3. **Speaker Intelligence** - Analyzes speakers to identify high-priority networking targets based on their role, company, and expertise.
4. **Agenda Analysis** - Extracts and maps the complete conference schedule to optimize your time and networking strategy.
5. **Networking Strategy** - Generates AI-powered recommendations for maximizing networking ROI with prioritized contact lists and approach strategies.

## Set up steps

Setup time: 10-15 minutes

1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for web scraping capabilities.
2. **Customize conference sources** - Update the Eventbrite URL to target specific industries or locations.
3. **Adjust monitoring frequency** - Modify the weekly trigger to match your conference discovery needs.
4. **Review networking priorities** - The system automatically prioritizes speakers, but you can customize the criteria.

## Technical Configuration

### Prerequisites

- Self-hosted n8n instance (version 1.0+)
- ScrapeGraphAI API credentials
- Eventbrite API access (optional, for enhanced data)

### API Configuration

ScrapeGraphAI setup:

1. Sign up at https://scrapegraph.ai
2. Generate an API key from the dashboard
3. Add credentials in n8n: Settings > Credentials > Add Credential > ScrapeGraphAI

## Customization Examples

Modify conference sources:

```javascript
// In Eventbrite Scraper node, update the URL:
const targetUrl = "https://www.eventbrite.com/d/united-states/technology/";
const industryFilter = "?q=artificial+intelligence";
```

Adjust networking priorities:

```javascript
// In Speaker Intelligence node, modify scoring weights:
const priorityWeights = {
  executive_level: 0.4,
  company_size: 0.3,
  industry_relevance: 0.2,
  speaking_topic: 0.1
};
```

Customize the output format:

```javascript
// In Networking Strategy node, modify output structure:
const outputFormat = {
  high_priority: speakers.filter(s => s.score > 8),
  medium_priority: speakers.filter(s => s.score > 6 && s.score <= 8),
  networking_plan: generateApproachStrategy(speakers)
};
```

## Data Storage & Output Formats

### Storage Options

- **Local JSON files** - Default storage for conference data
- **Google Drive** - For sharing reports with team
- **Database** - PostgreSQL/MySQL for enterprise deployments
- **Cloud Storage** - AWS S3, Google Cloud Storage

### Output Formats

- **JSON** - Raw data for API integration
- **CSV** - For spreadsheet analysis
- **PDF Reports** - Executive summaries
- **Markdown** - Documentation and sharing
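For illustration, the weights above could feed a composite score like the sketch below; the speaker fields and the 0-10 sub-score scales are assumptions, not the template's actual node code.

```javascript
// Illustrative sketch of a composite networking score using the
// priorityWeights shown above; speaker fields are assumed.
function networkingScore(speaker, weights) {
  const subScores = {
    executive_level: speaker.isExecutive ? 10 : 4,
    company_size: Math.min(10, Math.log10(speaker.companyHeadcount || 1) * 2.5),
    industry_relevance: speaker.industryMatch ? 9 : 3,
    speaking_topic: speaker.topicMatch ? 8 : 5,
  };
  // Weighted sum: each sub-score contributes in proportion to its weight.
  return Object.entries(weights).reduce(
    (score, [key, w]) => score + subScores[key] * w,
    0
  );
}

const weights = { executive_level: 0.4, company_size: 0.3, industry_relevance: 0.2, speaking_topic: 0.1 };
console.log(networkingScore(
  { isExecutive: true, companyHeadcount: 5000, industryMatch: true, topicMatch: false },
  weights
)); // ≈ 9.1 -> a "high" priority speaker
```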
## Sample Output Structure

```json
{
  "conference_data": {
    "event_name": "AI Summit 2024",
    "date": "2024-06-15",
    "location": "San Francisco, CA",
    "speakers": [
      {
        "name": "Dr. Sarah Chen",
        "title": "CTO, TechCorp",
        "company": "TechCorp Inc",
        "networking_score": 9.2,
        "priority": "high",
        "approach_strategy": "Connect via LinkedIn, mention shared AI interests"
      }
    ],
    "networking_plan": {
      "high_priority_targets": 5,
      "recommended_approach": "Focus on AI ethics panel speakers",
      "schedule_optimization": "Attend morning keynotes, network during breaks"
    }
  }
}
```

## Key Features

- **Automated Conference Discovery** - Finds relevant industry events from multiple sources
- **Speaker Intelligence Analysis** - Identifies high-value networking targets with contact priority scoring
- **Strategic Agenda Mapping** - Optimizes your conference schedule for maximum networking impact
- **AI-Powered Recommendations** - Provides personalized networking strategies and approach methods
- **Priority Contact Lists** - Ranks speakers by business value and networking potential

## Troubleshooting

### Common Issues

- **ScrapeGraphAI Rate Limits** - Implement delays between requests
- **Website Structure Changes** - Update scraping prompts in ScrapeGraphAI nodes
- **API Authentication** - Verify credentials and permissions

### Performance Optimization

- Adjust trigger frequency based on conference season
- Implement caching for repeated data
- Use batch processing for large conference lists

## Support & Customization

For advanced customization or enterprise deployments, consider:

- Custom speaker scoring algorithms
- Integration with CRM systems (Salesforce, HubSpot)
- Advanced analytics and reporting dashboards
- Multi-language support for international conferences
by PDF Vector
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Unified Academic Search Across Major Research Databases

This powerful workflow enables researchers to search multiple academic databases simultaneously, automatically deduplicate results, and export formatted bibliographies. By leveraging PDF Vector's multi-database search capabilities, researchers can save hours of manual searching and ensure comprehensive literature coverage across PubMed, ArXiv, Google Scholar, Semantic Scholar, and ERIC databases.

## Target Audience & Problem Solved

This template is designed for:

- **Graduate students** conducting systematic literature reviews
- **Researchers** ensuring comprehensive coverage of their field
- **Librarians** helping patrons with complex searches
- **Academic teams** building shared bibliographies

It solves the critical problem of fragmented academic search by providing a single interface to query all major databases, eliminating duplicate results, and standardizing output formats.

## Prerequisites

- n8n instance with the PDF Vector node installed
- PDF Vector API credentials with search permissions
- Basic understanding of academic search syntax
- Optional: PostgreSQL for search history logging
- Minimum 50 API credits for comprehensive searches

## Step-by-Step Setup Instructions

1. **Configure PDF Vector Credentials**
   - Go to the n8n Credentials section
   - Create new PDF Vector credentials
   - Enter your API key from pdfvector.io
   - Test the connection to verify setup
2. **Import the Workflow Template**
   - Copy the template JSON code
   - In n8n, click "Import Workflow"
   - Paste the JSON and save
   - Review all nodes for any configuration needs
3. **Customize Search Parameters**
   - Open the "Set Search Parameters" node
   - Modify the default search query for your field
   - Adjust the year range (default: 2020-present)
   - Set the results-per-source limit (default: 25)
4. **Configure Export Options**
   - Choose your preferred export formats (BibTeX, CSV, JSON)
   - Set the output directory for files
   - Configure file naming conventions
   - Enable/disable specific export types
5. **Test Your Configuration**
   - Run the workflow with a sample query
   - Check that all databases return results
   - Verify deduplication is working correctly
   - Confirm export files are created properly

## Implementation Details

The workflow implements a sophisticated search pipeline:

1. **Parallel Database Queries**: Searches all configured databases simultaneously for efficiency
2. **Smart Deduplication**: Uses DOI matching and fuzzy title comparison to remove duplicates (see the sketch at the end of this description)
3. **Relevance Scoring**: Combines citation count, title relevance, and recency for ranking
4. **Format Generation**: Creates properly formatted citations in multiple styles
5. **Batch Processing**: Handles large result sets without memory issues

## Customization Guide

**Adding Custom Databases:**

```javascript
// In the PDF Vector search node, add to the providers array:
"providers": ["pubmed", "semantic_scholar", "arxiv", "google_scholar", "eric", "your_custom_db"]
```

**Modifying the Relevance Algorithm:** Edit the "Rank by Relevance" node to adjust scoring weights:

```javascript
// Adjust these weights for your needs:
const titleWeight = 10;    // Title match importance
const citationWeight = 5;  // Citation count importance
const recencyWeight = 10;  // Recent publication bonus
const fulltextWeight = 15; // Full-text availability bonus
```

**Custom Export Formats:** Add new format generators in the workflow:

```javascript
// Example: Add APA format export
const apaFormat = papers.map(p => {
  const authors = p.authors.slice(0, 3).join(', ');
  return `${authors} (${p.year}). ${p.title}. ${p.journal || 'Preprint'}.`;
});
```
**Advanced Filtering:** Implement additional filters:

- Journal impact factor thresholds
- Open access only options
- Language restrictions
- Methodology filters for systematic reviews

## Search Features

- Query multiple databases in parallel
- Advanced filtering and deduplication
- Citation format export (BibTeX, RIS, etc.)
- Relevance ranking across sources
- Full-text availability checking

## Workflow Process

1. **Input**: Search query and parameters
2. **Parallel Search**: Query all databases
3. **Merge & Deduplicate**: Combine results
4. **Rank**: Sort by relevance/citations
5. **Enrich**: Add full-text links
6. **Export**: Multiple format options
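To make the Merge & Deduplicate step concrete, here is a hedged sketch of DOI-plus-fuzzy-title deduplication; the normalization and containment check are illustrative assumptions, not the template's exact algorithm.

```javascript
// Sketch: exact DOI match first, then a cheap fuzzy title comparison.
function normalizeTitle(title) {
  return title.toLowerCase().replace(/[^a-z0-9 ]/g, '').replace(/\s+/g, ' ').trim();
}

function dedupe(papers) {
  const seenDois = new Set();
  const seenTitles = [];
  const unique = [];

  for (const paper of papers) {
    const doi = (paper.doi || '').toLowerCase();
    if (doi && seenDois.has(doi)) continue; // exact DOI duplicate

    const title = normalizeTitle(paper.title || '');
    // Identical normalized titles, or one containing the other
    // (covers subtitle/punctuation variants across databases).
    const fuzzyDup =
      title.length > 0 &&
      seenTitles.some(t => t === title || t.includes(title) || title.includes(t));
    if (fuzzyDup) continue;

    if (doi) seenDois.add(doi);
    seenTitles.push(title);
    unique.push(paper);
  }
  return unique;
}
```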
by Jitesh Dugar
# Validated RSVP Confirmation with Automated Badge Generation

Overview: This comprehensive workflow automates the entire event RSVP process from form submission to attendee confirmation, including real-time email validation and personalized digital badge generation.

✨ KEY FEATURES:
• Real-time Email Validation - Verify attendee emails using the VerifiEmail API to prevent fake registrations
• Automated Badge Generation - Create beautiful, personalized event badges with attendee details
• Smart Email Routing - Send confirmation emails with badges for valid emails, rejection notices for invalid ones
• Comprehensive Logging - Track all RSVPs (both valid and invalid) in Google Sheets for analytics
• Dual Path Logic - Handle valid and invalid submissions differently with conditional branching
• Anti-Fraud Protection - Detect disposable emails and invalid domains automatically

🔧 WORKFLOW COMPONENTS:
1. Webhook Trigger - Receives RSVP submissions
2. Email Validation - Verifies email authenticity using the VerifiEmail API
3. Conditional Logic - Separates valid from invalid submissions
4. Badge Creator - Generates HTML-based personalized event badges
5. Image Converter - Converts HTML badges to shareable PNG images using HTMLCssToImage
6. Email Sender - Delivers confirmation with badge or rejection notice via Gmail
7. Data Logger - Records all attempts in Google Sheets for tracking and analytics

🎯 PERFECT FOR:
• Conference organizers managing hundreds of RSVPs
• Corporate event planners requiring verified attendee lists
• Webinar hosts preventing fake registrations
• Workshop coordinators issuing digital badges
• Community event managers tracking attendance

💡 BENEFITS:
• Reduces manual verification time by 95%
• Eliminates fake email registrations
• Creates professional branded badges automatically
• Provides real-time RSVP tracking and analytics
• Improves attendee experience with instant confirmations
• Maintains clean, verified contact lists

🛠️ REQUIRED SERVICES:
• n8n (cloud or self-hosted)
• VerifiEmail API (https://verifi.email)
• HTMLCssToImage API (https://htmlcsstoimg.com)
• Gmail account (OAuth2)
• Google Sheets

📈 USE CASE SCENARIO:
When someone submits your event RSVP form, this workflow instantly validates their email, generates a personalized badge with their details, and emails them a confirmation, all within seconds. Invalid emails receive a helpful rejection notice, and every submission is logged for your records. No manual work required!

🎨 BADGE CUSTOMIZATION:
The workflow includes a fully customizable HTML badge template featuring:
• Gradient background with modern design
• Attendee name, designation, and organization
• Event name and date
• Email address and validation timestamp
• Google Fonts (Poppins) for professional typography

📊 ANALYTICS INCLUDED:
Track metrics like:
• Total RSVPs received
• Valid vs invalid email ratio
• Event-wise registration breakdown
• Temporal patterns
• Organization/company distribution

⚡ PERFORMANCE:
• Processing time: ~3-5 seconds per RSVP
• Scales to handle 100+ concurrent submissions
• Email delivery within 10 seconds
• Real-time Google Sheets updates

🔄 EASY SETUP:
1. Import the workflow JSON
2. Configure your credentials (detailed instructions included)
3. Create your form with required fields (name, email, event, designation, organization)
4. Connect the webhook
5. Activate and start receiving validated RSVPs!
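For illustration, a minimal sketch of the conditional-logic step, assuming a VerifiEmail-style response shape; the deliverable/disposable field names are assumptions, not taken from the API docs.

```javascript
// Hedged sketch: derive the branch from an assumed validation response.
function routeRsvp(validation) {
  const isValid =
    validation.deliverable === true && validation.disposable === false;
  // Valid emails continue to badge generation + confirmation email;
  // invalid ones get the rejection notice. Both paths are logged to Sheets.
  return isValid ? 'send-badge' : 'send-rejection';
}

console.log(routeRsvp({ deliverable: true, disposable: false })); // send-badge
console.log(routeRsvp({ deliverable: true, disposable: true }));  // send-rejection
```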
🎓 LEARNING VALUE:
This workflow demonstrates:
• Webhook integration patterns
• API authentication methods
• Conditional workflow branching
• HTML-to-image conversion
• Email automation best practices
• Data logging strategies
• Error handling techniques
by Samuel Heredia
This n8n workflow securely processes contact form submissions by validating user input, formatting the data, and storing it in a MongoDB database. The flow ensures data consistency, prevents unsafe entries, and provides a confirmation response back to the user.

## Workflow

### 1. Form Submission Node

- Purpose: Serves as the workflow's entry point.
- Functionality: Captures user input from the contact form, which typically includes:
  - name
  - last name
  - email
  - phone number

### 2. Code Node (Validation Layer)

- Purpose: Ensures that collected data is valid and secure.
- Validations performed:
  - Removes suspicious characters to mitigate risks like SQL injection or script injection.
  - Validates the phone_number field format (numeric, correct length, etc.).
  - If any field fails validation, the entry is marked as "is_not_valid" to block it from database insertion.

### 3. Edit Fields Node (Data Formatting)

- Purpose: Normalizes data before database insertion.
- Transformations applied:
  - Converts field names to snake_case (first_name, last_name, phone_number).
  - Standardizes the field naming convention for consistency in MongoDB storage.

### 4. MongoDB Node (Insert Documents)

- Purpose: Persists validated data in MongoDB Atlas.
- Process:
  - Inserts documents into the target collection with the cleaned and formatted fields.
  - The connection is established securely using a MongoDB Atlas connection string (URI).

🔧 How to Set Up the MongoDB Atlas Connection URL

a. Create a cluster: log in to MongoDB Atlas and create a new cluster.
b. Configure database access: add a database user with a secure username and password, and assign appropriate roles (e.g., Atlas Admin for full access or Read/Write for limited).
c. Obtain the connection string (URI): from Atlas, go to Clusters → Connect → Drivers and copy the provided connection string, which looks like:
   mongodb+srv://<username>:<password>@cluster0.abcd123.mongodb.net/myDatabase?retryWrites=true&w=majority
d. Configure in n8n: in the MongoDB node, paste the URI, then replace <username>, <password>, and myDatabase with your actual credentials and database name. Test the connection to ensure it is successful.

### 5. Form Ending Node

- Purpose: Provides closure to the workflow.
- Functionality: Sends a confirmation response back to the user, indicating that their contact details were successfully submitted and stored.

✅ Result: With this workflow, all contact form submissions are safely validated, normalized, and stored in MongoDB Atlas, ensuring basic data integrity and security.
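For illustration, a minimal sketch of what the validation Code node (step 2) might do; the character blacklist and phone-length rule are assumptions, not the template's exact checks.

```javascript
// Hedged sketch of the validation layer; rules are assumed, adapt as needed.
return items.map((item) => {
  const data = item.json;

  // Strip characters commonly abused in injection payloads.
  const sanitize = (value) =>
    String(value ?? '').replace(/[<>$;{}]/g, '').trim();

  const first_name = sanitize(data.name);
  const last_name = sanitize(data['last name']);
  const email = sanitize(data.email);
  const phone_number = sanitize(data['phone number']);

  // Numeric-only phone of a plausible length (assumed 7-15 digits).
  const phoneOk = /^\d{7,15}$/.test(phone_number.replace(/[\s()+-]/g, ''));
  const emailOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);

  item.json = {
    first_name,
    last_name,
    email,
    phone_number,
    is_not_valid: !(phoneOk && emailOk && first_name && last_name),
  };
  return item;
});
```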
by Davide
This workflow automates the process of monitoring Amazon product prices and sending alerts when a product's price drops below a defined threshold. It integrates ScrapeGraphAI, Google Sheets, and Telegram to provide a complete end-to-end price tracking system.

## Key Advantages

- 💡 **Intelligent Scraping**: Uses ScrapeGraphAI to extract structured data (product prices) from complex Amazon pages, even those with dynamic JavaScript rendering.
- 📊 **Centralized Tracking**: All products and price history are managed in a Google Sheet, making it easy to review and update data.
- ⚡ **Real-Time Alerts**: Sends instant Telegram notifications when a product's price drops below its previous minimum, helping users take quick advantage of deals.
- 🔁 **Fully Automated**: Once set up, it runs on a schedule with no manual input required, automatically updating and alerting users.
- 🧩 **Modular & Extensible**: Built entirely with n8n nodes, making it easy to customize, for example by adding new alert channels (email, Slack) or additional data checks.
- 🕒 **Time-Efficient**: Eliminates the need for manual price checking, saving significant time for users monitoring multiple products.

## How it Works

This automated workflow tracks Amazon product prices and sends an alert via Telegram when a product hits a new lowest price. Here's the process:

1. **Trigger & Data Fetch**: The workflow is initiated either on a scheduled basis (every 10 minutes) or manually. It first connects to a designated Google Sheet, which acts as a database, to fetch a list of products to monitor. Each product's details (Name, URL, and current "MIN PRICE") are read.
2. **Price Scraping & Comparison**: The workflow loops through each product from the sheet. For each product, it uses ScrapeGraphAI to navigate to the Amazon product page, render JavaScript-heavy content, and extract the current price. This newly scraped price is then compared to the "MIN PRICE" value stored in the Google Sheet for that product.
3. **Conditional Alert & Update**: If the new price is lower, two actions are triggered:
   a. **Sheet Update**: The Google Sheet is updated with the new, lower "MIN PRICE" and the current date.
   b. **Telegram Notification**: A message is sent to a specified Telegram chat, announcing that the product has hit a new lowest price, including the product name and a link.
   If the price is not lower, no action is taken for that product, and the workflow moves on to the next one in the loop.

## Set up Steps

To implement this workflow yourself, follow these steps:

1. **Prepare the Google Sheet**:
   - Create a copy of the provided template spreadsheet.
   - In the sheet, fill in the columns for PRODUCT (name), URL (the full Amazon product link), and MIN PRICE.
   - When adding a new product, set the MIN PRICE to a very high value (e.g., 9999) to ensure the first real price triggers an alert.
2. **Configure n8n Credentials**:
   - **Google Sheets**: Set up a "Google Sheets account" credential in n8n using OAuth2 to grant the workflow access to your copied spreadsheet.
   - **ScrapeGraphAI**: Configure the "ScrapegraphAI account" credential with your API key from the ScrapeGraphAI service.
   - **Telegram**: Set up a "Telegram account" credential with a Bot Token obtained from the BotFather in Telegram. You will also need your specific chatId for the node.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
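To illustrate the comparison in steps 2-3, here is a hedged sketch of the decision logic; the column names follow the sheet template above, while the message format and helper shape are assumptions.

```javascript
// Hedged sketch of the price-comparison branch.
function checkProduct(row, scrapedPrice) {
  const minPrice = parseFloat(row['MIN PRICE']);
  const current = parseFloat(scrapedPrice);

  if (Number.isFinite(current) && current < minPrice) {
    return {
      action: 'update-and-notify',
      newMinPrice: current,
      date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD for the sheet
      message: `🔥 ${row.PRODUCT} hit a new lowest price: ${current} - ${row.URL}`,
    };
  }
  return { action: 'skip' }; // price not lower: move to the next product
}
```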
by Oneclick AI Squad
This n8n workflow automates the prioritization and scheduling of sales orders based on customer SLAs, urgency, and profitability. It ensures that high-priority and SLA-critical orders are picked, packed, and dispatched first, improving fulfillment speed, customer satisfaction, and operational efficiency across warehouses and logistics.

## Key Features

- **Automated Order Fetching**: Periodically retrieves all pending or confirmed sales orders from ERP systems.
- **Dynamic SLA-Based Prioritization**: Calculates order priority scores using urgency, customer tier, order value, and profit margin.
- **Intelligent SLA Monitoring**: Identifies SLA breaches and automatically flags them for immediate handling.
- **Automated Task Creation**: Generates urgent picking tasks and alerts warehouse managers in real-time.
- **Smart Scheduling**: Optimizes picking and dispatch timelines based on urgency and capacity.
- **Seamless ERP & TMS Integration**: Updates order statuses and schedules dispatches automatically.
- **Operational Transparency**: Sends end-of-cycle summary reports via email to operations teams.

## Workflow Process

1. **Schedule Trigger**
   - Runs every 15 minutes to ensure orders are frequently evaluated.
   - Maintains real-time responsiveness without overloading systems.
2. **Fetch Pending Orders**
   - Retrieves all orders in pending or confirmed state from the ERP API.
   - Configurable limit (e.g., 100 orders per run) for controlled processing.
3. **Fetch Customer SLA Data**
   - Collects SLA tiers, delivery timeframes, and customer-specific priorities from the ERP or CRM API.
   - Supports dynamic customer segmentation (Gold, Silver, Bronze tiers).
4. **Calculate Priority Scores**
   - Assigns weighted priority scores based on multiple criteria: urgency 40%, tier 30%, order value 20%, profit margin 10%.
   - Produces a composite score used for sorting and scheduling (see the sketch below).
5. **Check if SLA Critical**
   - Detects if any order is close to or past SLA deadlines.
   - Routes SLA-breached orders for immediate escalation.
6. **Create Urgent Picking Task**
   - Generates warehouse picking tasks for critical orders.
   - Assigns them to senior pickers or rapid response teams.
7. **Alert Warehouse Manager**
   - Sends instant SMS and email alerts for SLA-critical or high-priority orders.
   - Ensures immediate managerial attention.
8. **Batch Normal Orders**
   - Groups non-critical orders into batches of 10 for optimized processing.
   - Reduces load on the WMS while maintaining steady throughput.
9. **Generate Picking Schedule**
   - Creates smart picking schedules based on urgency: high priority same-day, normal within 1 day, low within 2-3 days.
10. **Create Bulk Picking Tasks**
    - Pushes picking tasks into the WMS (Warehouse Management System).
    - Uses auto-assignment and route optimization logic.
11. **Generate Dispatch Schedule**
    - Builds dispatch timelines according to delivery method: Express, Priority, or Standard.
    - Syncs with transport capacity data.
12. **Schedule Dispatches in TMS**
    - Sends dispatch requests to the TMS (Transport Management System).
    - Books carriers and reserves capacity for each batch.
13. **Update Order Statuses**
    - Updates the ERP with new order statuses, schedules, and expected delivery dates.
    - Maintains a unified view across systems.
14. **Generate Summary Report**
    - Creates a summary JSON report including total orders processed, SLA-critical cases, dispatch breakdowns, and next deadlines.
15. **Send Summary Notification**
    - Emails the operations team with the execution summary and performance metrics.
    - Ensures team alignment and SLA visibility.
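For illustration, the 40/30/20/10 weighting from step 4 could be computed like this; normalizing each factor to a 0-1 range (72-hour urgency window, 10k value cap) is an assumption, not the template's exact formula.

```javascript
// Hedged sketch of the composite priority score described in step 4.
function priorityScore(order) {
  const urgency = order.hoursToSlaDeadline <= 0
    ? 1                                                   // breached: max urgency
    : Math.max(0, 1 - order.hoursToSlaDeadline / 72);     // closer deadline -> higher
  const tier = { gold: 1, silver: 0.6, bronze: 0.3 }[order.customerTier] ?? 0.3;
  const value = Math.min(1, order.orderValue / 10000);    // cap at an assumed 10k
  const margin = Math.min(1, Math.max(0, order.profitMargin)); // fraction 0-1

  return urgency * 0.4 + tier * 0.3 + value * 0.2 + margin * 0.1;
}

// Example: an SLA-breached gold-tier order scores near the top.
console.log(priorityScore({
  hoursToSlaDeadline: -2, customerTier: 'gold',
  orderValue: 8000, profitMargin: 0.25,
})); // 1*0.4 + 1*0.3 + 0.8*0.2 + 0.25*0.1 = 0.885
```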
## Industries That Benefit

This automation is especially valuable for:

- **E-commerce & Retail**: Prioritize same-day or express deliveries for VIP customers.
- **Logistics & 3PL Providers**: Meet tight SLAs across multiple clients and delivery tiers.
- **Manufacturing & B2B Distribution**: Ensure high-value or contractual orders are prioritized.
- **Pharma & Healthcare**: Critical for time-sensitive and compliance-driven deliveries.
- **Consumer Goods & FMCG**: Manage high-volume dispatch with smart scheduling.

## Prerequisites

- ERP system with API access (e.g., SAP, Odoo, NetSuite).
- WMS and TMS integrations with order/task APIs.
- SMTP and SMS gateway credentials for notifications.
- n8n instance with HTTP, Function, Email, and Scheduler nodes installed.

## Modification Options

- Customize priority scoring weights per business type.
- Integrate AI for predictive SLA breach forecasting.
- Add Slack/Teams channels for real-time operational alerts.
- Implement escalation routing for unassigned urgent tasks.
- Extend reports to include OTIF (On-Time-In-Full) metrics.

Explore More AI-Powered Workflows: Contact us for customized supply chain and order management automation.
by Mohammed Abid
# Shopify Order Data to Airtable

This n8n template demonstrates how to capture incoming Shopify order webhooks, transform the data into a structured format, and insert each product line item as a separate record in an Airtable sheet. It provides both high-level order information and detailed product-level metrics, making it ideal for analytics, reporting, inventory management, and customer insights.

## Good to Know

- **Airtable API Rate Limits**: By default, Airtable allows 5 requests per second per base. Consider batching or adding delays if you process high volumes of orders.
- **Shopify Webhook Configuration**: Ensure you have configured the orders/create webhook in your Shopify Admin to point to the n8n webhook node.
- **Field Mapping**: The template maps standard Shopify fields; if your store uses custom order or line item properties, update the Function nodes accordingly.

## How It Works

1. **Webhook Trigger**: A Shopify orders/create webhook fires when a new order is placed.
2. **Normalize Order Data**: The Function node extracts core order, customer, shipping, and billing details and computes financial totals (subtotal, tax, shipping, discounts).
3. **Line Item Breakdown**: A second Function node builds an array of objects, one per line item, calculating per-item totals, tax/shipping allocation, and product attributes (color, size, material). A sketch of this step appears below.
4. **Check Customer Record**: Optionally check against an Airtable "Customers" sheet to flag new vs existing customers.
5. **Auto-Increment Record ID**: A Function node generates a running serial number for each Airtable record.
6. **Insert Records**: The Airtable node writes each line item object into the target base and table, creating rich records with both order-level and product-level details.

## How to Use

1. **Clone the Template**: Click "Use Template" in your n8n instance to import this workflow.
2. **Configure Credentials**:
   - Shopify Trigger: Add your Shopify store domain and webhook secret.
   - Airtable Node: Set up your Airtable API key and select the base and table.
3. **Review Field Names**: Match the field names in the Function nodes to the columns in your Airtable table.
4. **Activate Workflow**: Turn on the workflow and place a test order in your Shopify store.
5. **Verify Records**: Check your Airtable sheet to see the new order and its line items.

## Requirements

- n8n@latest
- Shopify store with the orders/create webhook configured
- Airtable account with a base and table ready to receive records

## Customizing This Workflow

- **Add Custom Fields**: Extend the Functions to include additional Shopify metafields, discounts, or customer tags.
- **Alternative Destinations**: Replace the Airtable node with Google Sheets, Supabase, or another database by swapping in the corresponding node.
- **Error Handling**: Insert If/Wait nodes to retry on API failures or send notifications on errors.
- **Multi-Currency Support**: Adapt the currency logic to convert totals based on dynamic exchange rates.
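As a sketch of the line-item breakdown (step 3): line_items, title, quantity, and price are standard Shopify order-webhook fields, but the proportional tax allocation and the output column names are assumptions, not the template's exact node code.

```javascript
// Hedged Function-node sketch: one output item per Shopify line item.
const order = items[0].json;
const subtotal = parseFloat(order.subtotal_price || 0);
const totalTax = parseFloat(order.total_tax || 0);

return (order.line_items || []).map((li) => {
  const lineTotal = parseFloat(li.price) * li.quantity;
  // Allocate order-level tax to each line in proportion to its value.
  const taxShare = subtotal > 0 ? (lineTotal / subtotal) * totalTax : 0;

  return {
    json: {
      order_number: order.order_number,
      customer_email: order.email,
      product: li.title,
      quantity: li.quantity,
      unit_price: parseFloat(li.price),
      line_total: lineTotal,
      tax_allocated: Number(taxShare.toFixed(2)),
    },
  };
});
```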
by Oneclick AI Squad
This n8n workflow automates email blasts with follow-ups and response tracking by reading contact data from a Google Sheet daily, looping through contacts to send personalized emails based on follow-up stages via Gmail, updating the sheet with status changes, and monitoring replies for logging.

## Why Use It

This workflow streamlines email marketing campaigns by automating personalized email distribution, managing follow-up sequences, and tracking responses without manual intervention, saving time, improving engagement, and providing actionable insights into contact interactions.

## How to Import It

1. **Download the Workflow JSON**: Obtain the workflow file from the n8n template or create it based on this document.
2. **Import into n8n**: In your n8n instance, go to "Workflows," click the three dots, select "Import from File," and upload the JSON.
3. **Configure Credentials**: Set up Gmail and Google Sheets credentials in n8n.
4. **Run the Workflow**: Activate the scheduled trigger and test with a sample Google Sheet.

## System Architecture

**Email Blast Pipeline**:
1. Daily Trigger - 9 AM: Initiates the workflow daily at 9 AM via Cron.
2. Read Contact Data from Google Sheet: Fetches contact details from the sheet.
3. Loop Through Contacts: Processes each contact individually.
4. Determine Follow-Up Stage: Identifies the current stage for each contact.
5. Send Main/Follow-Up Email: Delivers the appropriate email via Gmail.
6. Update Sheet Status: Updates the Google Sheet with the latest status.

**Response Tracking Flow**:
1. Check Gmail for Replies: Monitors Gmail for email responses.
2. Log Responses: Records responses in the Google Sheet.

## Google Sheet File Structure

- **Sheet Name**: EmailCampaign
- **Range**: A1:F10 (or adjust based on needs)

| A | B | C | D | E | F |
|------------|------------|---------------|---------------|---------------|---------------|
| name | email | stage | last_email_date | status | response |
| John Doe | john@example.com | Initial | 2025-08-07 | Pending | |
| Jane Smith | jane@example.com | Follow-Up 1 | 2025-08-06 | Sent | "Interested" |
| Bob Jones | bob@example.com | Follow-Up 2 | 2025-08-05 | Replied | "Follow up later" |

**Columns**:
- name: Contact's full name.
- email: Contact's email address for sending emails.
- stage: Current follow-up stage (e.g., Initial, Follow-Up 1, Follow-Up 2).
- last_email_date: Date of the last email sent.
- status: Current status (e.g., Pending, Sent, Replied).
- response: Logged response from the contact (updated after reply detection).

## Customization Ideas

- **Adjust Schedule**: Change the Cron trigger to hourly or weekly based on campaign needs.
- **Add Email Templates**: Customize email content for different stages or audiences.
- **Incorporate SMS**: Add WhatsApp or SMS follow-ups using additional nodes.
- **Enhance Tracking**: Integrate a dashboard (e.g., Google Data Studio) for real-time campaign analytics.
- **Automate Segmentation**: Add logic to segment contacts by industry or interest for targeted emails.

## Requirements to Run This Workflow

- **Google Sheets Account**: For storing and managing contact data and responses.
- **Gmail Account**: For sending emails and checking replies (requires IMAP enabled).
- **n8n Instance**: With Google Sheets and Gmail connectors configured.
- **Cron Service**: For scheduling the daily trigger.
- **Internet Connection**: To access Google Sheets and Gmail APIs.
- **API Credentials**: Gmail OAuth2 and Google Sheets API credentials set up in n8n.

## Notes

- Ensure the Google Sheet is shared with the n8n service account or has appropriate permissions.
- Test the workflow with a small contact list to verify email delivery and response logging.
- Adjust the stage logic in the "Determine Follow-Up Stage" node to match your campaign structure (a sketch of that logic follows below).
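For example, the stage logic might look like the following Function-node sketch; the column names follow the sheet structure above, while the 3-day wait between stages and the send_stage output field are assumptions.

```javascript
// Hedged sketch of the "Determine Follow-Up Stage" node.
const STAGES = ['Initial', 'Follow-Up 1', 'Follow-Up 2'];
const WAIT_DAYS = 3; // assumed gap between sequence steps

return items.map((item) => {
  const row = item.json;
  const daysSince =
    (Date.now() - new Date(row.last_email_date).getTime()) / 86400000;
  const idx = STAGES.indexOf(row.stage);

  if (row.status === 'Pending') {
    row.send_stage = row.stage;              // first send for this stage
  } else if (row.status === 'Sent' && idx < STAGES.length - 1 && daysSince >= WAIT_DAYS) {
    row.send_stage = STAGES[idx + 1];        // advance the sequence
  } else {
    row.send_stage = null;                   // replied, exhausted, or too soon
  }
  return item;
});
```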
by WeblineIndia
# Automate Video Upload → Auto-Thumbnail → Google Drive

This workflow accepts a video via HTTP upload, verifies it's a valid video, extracts a thumbnail frame at the 5-second mark using FFmpeg (auto-installs a static build if missing), uploads the image to a specified Google Drive folder and returns a structured JSON response containing the new file's details.

## Who's it for

- **Marketing / Social teams** who need ready-to-publish thumbnails from raw uploads.
- **Developers** who want an API-first thumbnail microservice without standing up extra infrastructure.
- **Agencies / Creators** standardizing assets in a shared Drive.

## How it works

1. **Accept Video Upload (Webhook)**: Receives multipart/form-data with the file in field media at /mediaUpload. The response is deferred until the final node.
2. **Validate Upload is Video (IF)**: Checks that {{$binary.media.mimeType}} contains video/. Non-video payloads can be rejected with HTTP 400.
3. **Persist Upload to /tmp (Write Binary File)**: Writes the uploaded file to /tmp/<originalFilename or input.mp4> for stable processing.
4. **Extract Thumbnail with FFmpeg (Execute Command)**: Uses the system ffmpeg if available; otherwise downloads a static binary to /tmp/ffmpeg. Runs: ffmpeg -y -ss 5 -i <input file> -frames:v 1 -q:v 2 /tmp/thumbnail.jpg
5. **Load Thumbnail from Disk (Read Binary File)**: Reads /tmp/thumbnail.jpg into the item's binary as thumbnail.
6. **Upload Thumbnail to Drive (Google Drive)**: Uploads to your target folder. The file name is <original>-thumb.jpg.
7. **Return API Response (Respond to Webhook)**: Sends JSON to the client including the Drive file id, name, links, size, and checksums (if available).

## How to set up

1. **Import** the workflow JSON into n8n.
2. **Google Drive**:
   - Create (or choose) a destination folder; copy its Folder ID.
   - Add Google Drive OAuth2 credentials in n8n and select them in the Drive node.
   - Set the folder in the Drive "Upload" node.
3. **Webhook**:
   - The endpoint is POST /webhook/mediaUpload.
   - Test: curl -X POST https://YOUR-N8N-URL/webhook/mediaUpload -F "media=@/path/to/video.mp4"
4. **FFmpeg**: Nothing to install manually: the Execute Command node auto-installs a static ffmpeg if it's not present. (Optional) If running n8n in Docker and you want permanence, use an image that includes ffmpeg.
5. **Response body**: The Respond node returns JSON with file metadata. You can customize the fields as needed.
6. **(Optional) Non-video branch**: On the IF node's false output, add a Respond node with HTTP 400 and a helpful message.

## Requirements

- n8n instance with the Execute Command node enabled (self-hosted/container/VM).
- **Outbound network** access (to fetch static FFmpeg if not installed).
- **Google Drive OAuth2** credential with permission to the destination folder.
- Adequate temp space in /tmp for the uploaded video and generated thumbnail.

## How to customize

- **Timestamp**: Change -ss 5 to another second, or parameterize it via query/body (e.g., timestamp=15).
- **Multiple thumbnails**: Duplicate the FFmpeg + Read steps with -ss 5, -ss 15, -ss 30, and suffix names -thumb-5.jpg, etc.
- **File naming**: Include the upload time or Drive file ID: {{ base + '-' + $now + '-thumb.jpg' }}.
- **Public sharing**: Add a **Drive → Permission: Create** node (Role: reader, Type: anyone) and return webViewLink.
- **Output target**: Replace the Drive node with **S3 Upload** or **Zoho WorkDrive** (HTTP Request) if needed.
- **Validation**: Enforce a max file size/MIME whitelist in a small Function node before writing to disk (see the sketch below).
- **Logging**: Append a row to Google Sheets/Notion with sourceFile, thumbId, size, duration, status.
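For example, the validation step could look like this Function-node sketch; the 200 MB cap and MIME whitelist are assumptions to adapt, and the size is approximated from the base64 payload since fileSize metadata varies across n8n versions.

```javascript
// Hedged sketch of the upload-validation Function node suggested above.
const MAX_BYTES = 200 * 1024 * 1024; // assumed cap
const ALLOWED = ['video/mp4', 'video/quicktime', 'video/webm'];

const media = items[0].binary && items[0].binary.media;
if (!media) {
  throw new Error('No file received in the "media" field');
}

if (!ALLOWED.includes(media.mimeType)) {
  throw new Error(`Unsupported type ${media.mimeType}; expected one of ${ALLOWED.join(', ')}`);
}

// Approximate decoded size from the base64 payload.
const approxBytes = media.data ? media.data.length * 0.75 : 0;
if (approxBytes > MAX_BYTES) {
  throw new Error(`File too large (~${Math.round(approxBytes)} bytes); limit is ${MAX_BYTES}`);
}

return items; // passes through untouched when valid
```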
## Add-ons

- **Slack / Teams notification** with the uploaded thumbnail link.
- **Image optimization** (e.g., convert to WebP or resize variants).
- **Retry & alerts** via an error trigger workflow.
- **Audit log** to a database (e.g., Postgres) for observability.

## Use Case Examples

- **CMS ingestion**: Editors upload videos; the workflow returns a thumbnail URL to store alongside the article.
- **Social scheduling**: Upload longform to generate a quick hero image for a post.
- **Client portals**: Clients drop raw footage; you keep thumbnails uniform in one Drive folder.

## Common troubleshooting

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| ffmpeg: not found | System lacks ffmpeg and static build couldn't download | Ensure outbound HTTPS is allowed; keep the auto-installer lines intact; or use a Docker image that includes ffmpeg. |
| Webhook returns 400 "not a video" | Wrong field name or non-video MIME | Send the file in the media field; ensure it's video/*. |
| Drive upload fails (403 / insufficient permissions) | OAuth scope or account lacks folder access | Reconnect the Drive credential; verify the destination Folder ID and sharing/ownership. |
| Response missing webViewLink / webContentLink | Drive node not returning link fields | Enable link fields in the Drive node or build URLs using the returned id. |
| 413 Payload Too Large at reverse proxy | Proxy limits on upload size | Increase body size limits in your proxy (e.g., Nginx client_max_body_size). |
| Disk full / ENOSPC | Large uploads filling /tmp | Increase temp storage; keep the Cleanup step; consider size caps and early rejection. |
| Corrupt thumbnail or black frame | Timestamp lands on a black frame | Change -ss or use -ss before -i vs. after; try different seconds (e.g., 1-3s). |
| Slow extraction | Large or remote files; cold FFmpeg download | Warm the container; host near the upload source; keep the static ffmpeg cached in the image. |
| Duplicate outputs | Repeat requests with the same video/name | Add a de-dup check (query Drive for an existing <base>-thumb.jpg before upload). |

## Need Help?

Want this wired to S3 or Zoho WorkDrive, or to generate multiple timestamps and public links out of the box? We're happy to help.
by Milan Vasarhelyi - SmoothWork
## What this workflow does

This workflow automates backend setup tasks for real estate client portals. When a new property transaction is added to your Google Sheets database with a buyer email but no document folder assigned, the workflow automatically creates a dedicated Google Drive folder, updates the spreadsheet with the folder URL, and adds an initial task prompting the client to upload documents.

This automation eliminates manual folder creation and task assignment, ensuring every new transaction has its documentation infrastructure ready from day one. Your clients can access their dedicated folder directly from the portal, keeping all property-related documents organized and accessible in one place.

## Key benefits

- **Eliminate manual setup**: No more creating folders and tasks individually for each transaction
- **Consistent client experience**: Every buyer gets the same professional onboarding process
- **Organized documentation**: Each transaction has its own Google Drive folder automatically shared with the client
- **Time savings**: Focus on closing deals instead of administrative setup

## Setup requirements

Important: You must make a copy of the reference Google Sheets spreadsheet to your own Google account before using this workflow. Your spreadsheet needs at minimum two tabs:

- **Transactions tab**: Columns for ID, Buyer Email, Documents URL, Property Address, and Status
- **Tasks tab**: Columns for Transaction ID, Task Name, Task Description, and Status

## Configuration steps

1. Authenticate your Google Sheets and Google Drive accounts in n8n
2. Update the Google Sheets trigger node to point to your copied spreadsheet
3. Set the parent folder ID in the "Create Client Documents Folder" node (where transaction folders should be created)
4. Customize the initial task name and description in the "Add Initial Upload Task" node
5. Verify all sheet names match your spreadsheet tabs

The workflow triggers every minute, checking for new transactions that meet the criteria (has a buyer email, missing documents URL).
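For illustration, the criteria check could be expressed as a small filter; the column names follow the Transactions tab above, and the filter shape is an assumption rather than the template's exact node.

```javascript
// Hedged sketch of the trigger-filter condition.
return items.filter((item) => {
  const row = item.json;
  const hasBuyerEmail = Boolean(row['Buyer Email'] && row['Buyer Email'].includes('@'));
  const missingFolder = !row['Documents URL'];
  return hasBuyerEmail && missingFolder; // only these rows get a new folder
});
```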
by Alysson Neves
# Canvas: Send students their pending assignments

## How it works

1. Trigger the workflow and set the Canvas base URL and target course name.
2. Fetch all instructor courses and locate the course ID that matches the name.
3. Retrieve enrolled students and their unsubmitted submissions for the course, handling paginated results.
4. Merge student records with submission data, convert due dates to local time, and build a per-student summary.
5. Send a Canvas conversation to each student with a personalized list of pending assignments and links.

## Setup

- [ ] Connect Canvas API credentials (Bearer and header auth used by the workflow).
- [ ] Enter your Canvas base URL (e.g. https://your_educational_institution.instructure.com).
- [ ] Set the exact course name to check for pending work.
- [ ] Confirm the teacher account can view students and send conversations.
- [ ] Run the workflow manually to verify output and delivery.
- [ ] Edit the message subject or body template if you need different wording.
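For reference, a hedged sketch of the pagination handling in step 3: Canvas paginates via RFC 5988 Link headers with rel="next". The host, course ID, and token below are placeholders, not values from the workflow.

```javascript
// Follow rel="next" Link headers until the last page is reached.
async function fetchAllPages(url, token) {
  const results = [];
  let next = url;
  while (next) {
    const res = await fetch(next, { headers: { Authorization: `Bearer ${token}` } });
    results.push(...(await res.json()));
    // Parse the Link header for the rel="next" URL, if any.
    const link = res.headers.get('link') || '';
    const match = link.match(/<([^>]+)>;\s*rel="next"/);
    next = match ? match[1] : null;
  }
  return results;
}

// Usage (placeholder host and course ID):
// const students = await fetchAllPages(
//   'https://your_educational_institution.instructure.com/api/v1/courses/123/users?enrollment_type[]=student&per_page=100',
//   process.env.CANVAS_TOKEN
// );
```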