by Iternal Technologies
Blockify® Technical Manual Data Optimization Workflow

Blockify optimizes data for technical-manual RAG and agents, giving structure to unstructured data for ~78X accuracy when pairing Blockify Ingest and Blockify Distill.

Learn more at https://iternal.ai/blockify
Get free demo API access here: https://console.blockify.ai/signup
Read the technical whitepaper here: https://iternal.ai/blockify-results
See an example accuracy comparison here: https://iternal.ai/case-studies/medical-accuracy/

Blockify is a data optimization tool that takes messy, unstructured text, like hundreds of sales-meeting transcripts or long proposals, and intelligently optimizes the data into small, easy-to-understand "IdeaBlocks." Each IdeaBlock is just a couple of sentences that capture one clear idea, plus a built-in contextualized question and answer. With this approach, Blockify improves the accuracy of LLMs (Large Language Models) by an aggregate average of 78X, while shrinking the original mountain of text to about 2.5% of its size and keeping (even improving) the important information.

When Blockify's IdeaBlocks are compared with the usual method of breaking text into equal-sized chunks, the results are dramatic: answers pulled from the distilled IdeaBlocks are roughly 40X more accurate, and user searches return the right information about 52% more often. In short, Blockify lets you store less data, spend less on computing, and still get better answers, turning huge documents into a concise, high-quality knowledge base that anyone can search quickly.

Blockify works by processing chunks of text to create structured data from an unstructured data source. Blockify® replaces the traditional "dump-and-chunk" approach with an end-to-end pipeline that cleans and organizes content before it ever hits a vector store. Admins first define who should see what; then the system ingests any file type (Word, PDF, slides, images) inside public cloud, private cloud, or on-prem.
A context-aware splitter finds natural breaks, and a series of specially developed Blockify LLM models turns each segment into a draft IdeaBlock. GenAI systems fed with this curated data return sharper answers, hallucinate far less, and comply with security policies out of the box. The result: higher trust, lower operating cost, and a clear path to enterprise-scale RAG without the cleanup headaches that stall most AI rollouts.
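To make the IdeaBlock concept concrete, here is a hypothetical example of what one distilled block could look like. The field names and contents are purely illustrative assumptions, not Blockify's actual schema:

```javascript
// A hypothetical IdeaBlock: one clear idea, a couple of sentences long,
// with a built-in contextualized question and answer. Field names and the
// "X-200 pump" example are illustrative, not Blockify's real schema.
const ideaBlock = {
  name: 'Pump Shutdown Procedure',
  criticalQuestion: 'How do I safely shut down the X-200 pump?',
  trustedAnswer:
    'Close the intake valve, then hold the stop button for three seconds. ' +
    'Wait for the pressure gauge to read zero before opening the housing.',
  tags: ['technical-manual', 'safety'],
};

// A RAG system retrieves the block by its question and answers from trustedAnswer.
console.log(ideaBlock.criticalQuestion);
```

Because each block pairs a question with a short trusted answer, retrieval can match user queries against questions rather than raw prose.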
by Mohamed Abdelwahab
1. Overview

The IngestionDocs workflow is a fully automated **document ingestion and knowledge management system** built with **n8n**. Its purpose is to continuously ingest organizational documents from Google Drive, transform them into vector embeddings using OpenAI, store them in Pinecone, and make them searchable and retrievable through an AI-powered Q&A interface. This ensures that employees always have access to the most up-to-date knowledge base without requiring manual intervention.

2. Key Objectives

- **Automated Ingestion** → Seamlessly process new and updated documents from Google Drive.
- **Change Detection** → Track and differentiate between new, updated, and previously processed documents.
- **Knowledge Base Construction** → Convert documents into embeddings for semantic search.
- **AI-Powered Assistance** → Provide an intelligent Q&A system for employees to query manuals.
- **Scalable & Maintainable** → Modular design using n8n, LangChain, and Pinecone.

3. Workflow Breakdown

A. Document Monitoring and Retrieval

The workflow begins with two Google Drive triggers:

- **File Created Trigger** → Fires when a new document is uploaded.
- **File Updated Trigger** → Fires when an existing document is modified.

A search operation lists the files in the designated Google Drive folder, and non-downloadable items (e.g., subfolders) are filtered out. For each valid file:

- The file is downloaded.
- A SHA256 hash is generated to uniquely identify the file's content.

B. Record Management (Google Sheets Integration)

To keep track of ingestion states, the workflow uses a **Google Sheets-based Record Manager**. Each file entry contains:

- **Id** (Google Drive file ID)
- **Name** (file name)
- **hashId** (SHA256 checksum)

The workflow compares the current file's hash with the stored one:

- **New Document** → File not found in records → Inserted into the Record Manager.
- **Already Processed** → File exists and hash matches → Skipped.
- **Updated Document** → File exists but hash differs → Record is updated.
This guarantees that only new or modified content is processed, avoiding duplication.

C. Document Processing and Vectorization

Once a document is marked as new or updated:

- The Default Data Loader extracts its content (binary files supported).
- Pages are split into individual chunks, with metadata such as file ID and name attached.
- A Recursive Character Text Splitter divides the content into manageable segments with overlap.
- OpenAI Embeddings (text-embedding-3-large) transform each text chunk into a semantic vector.
- The Pinecone Vector Store stores these vectors in the configured index:
  - For new documents, embeddings are inserted into a namespace based on the file name.
  - For updated documents, the namespace is cleared first, then re-ingested with fresh embeddings.

This process builds a scalable and queryable knowledge base.

D. Knowledge Base Q&A Interface

The workflow also provides an **interactive form-based user interface**:

- **Form Trigger** → Collects employee questions.
- **LangChain AI Agent** → Receives the question, retrieves relevant context from Pinecone using vector similarity search, and processes the response using the OpenAI Chat Model (gpt-4.1-mini).
- **Answer Formatting** → Responses are returned in HTML format for readability, a custom CSS theme ensures a modern, user-friendly design, and answers may include references to page numbers when available.

This creates a self-service knowledge base assistant that employees can query in natural language.

4. Technologies Used

- **n8n** → Orchestration of the entire workflow.
- **Google Drive API** → File monitoring, listing, and downloading.
- **Google Sheets API** → Record manager for tracking file states.
- **OpenAI API** → text-embedding-3-large for semantic vector creation; gpt-4.1-mini for conversational Q&A.
- **Pinecone** → Vector database for embedding storage and retrieval.
- **LangChain** → Document loaders, text splitters, vector store connectors, and agent logic.
- **Crypto (SHA256)** → File hash generation for change detection.
- **Form Trigger + Form Node** → Employee-facing Q&A submission and answer display.
- **Custom CSS** → Provides a modern, responsive, styled UI for the knowledge base.

5. End-to-End Data Flow

1. An employee uploads or updates a document → Google Drive detects the change.
2. The workflow downloads and hashes the file → Ensures uniqueness and detects modifications.
3. The Record Manager (Google Sheets) → Decides whether to skip, insert, or update the record.
4. Document processing → Splitting + embedding + storing into Pinecone.
5. Knowledge base updated → The latest version of documents is indexed.
6. An employee asks a question via the web form.
7. The AI Agent retrieves embeddings from Pinecone and uses gpt-4.1-mini → Generates a contextual answer.
8. The answer is displayed in styled HTML → Delivered back to the employee through the form interface.

6. Benefits

- **Always Up-to-Date** → Automatically syncs documents when uploaded or changed.
- **No Duplicates** → Smart hashing ensures only relevant updates are reprocessed.
- **Searchable Knowledge Base** → Employees can query documents semantically, not just by keywords.
- **Enhanced Productivity** → Answers are immediate, reducing time spent browsing manuals.
- **Scalable** → New documents and users can be added without workflow redesign.

✅ In summary, IngestionDocs is a **robust AI-driven document ingestion and retrieval system** that integrates **Google Drive, Google Sheets, OpenAI, and Pinecone** within **n8n**.
It continuously builds and maintains a knowledge base of manuals while offering employees an intelligent, user-friendly Q&A assistant for fast and accurate knowledge retrieval.
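The splitting step in section C can be illustrated with a minimal sketch. This is a simplified fixed-size stand-in for the Recursive Character Text Splitter, not the actual LangChain implementation, and the sizes are illustrative defaults:

```javascript
// Simplified splitter: fixed-size windows with overlap, so context is not
// lost at chunk boundaries. The real Recursive Character Text Splitter also
// prefers natural break points (paragraphs, sentences).
function splitText(text, chunkSize = 1000, overlap = 200) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step back by `overlap` characters
  }
  return chunks;
}
```

Each resulting chunk is then embedded with text-embedding-3-large and written to the file's Pinecone namespace.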
by Rahul Joshi
Description:

Keep your customer knowledge base up to date with this n8n automation template. The workflow connects Zendesk with Google Sheets, automatically fetching tickets tagged as "howto", enriching them with requester details, and saving them into a structured spreadsheet. This ensures your internal or public knowledge base reflects the latest customer how-to queries, without manual copy-pasting.

Perfect for customer support teams, SaaS companies, and service providers who want to streamline documentation workflows.

What This Template Does (Step-by-Step)

⚡ Manual Trigger or Scheduling: Run the workflow manually for testing/troubleshooting, or configure a schedule trigger for daily/weekly updates.
📥 Fetch All Zendesk Tickets: Connects to your Zendesk account and retrieves all available tickets.
🔍 Filter for "howto" Tickets Only: Processes only tickets that contain the "howto" tag, ensuring relevance.
👤 Enrich User Data: Fetches requester details (name, email, profile info) to provide context.
📊 Update Google Sheets Knowledge Base: Saves ticket data, including Ticket No., Description, Status, Tag, Owner Name, and Email. ✔️ Smart update prevents duplicates by matching on description.
🔁 Continuous Sync: Each new or updated "howto" ticket is synced automatically into your knowledge base sheet.

Key Features

🔍 Tag-based filtering for precise categorization
📊 Smart append-or-update logic in Google Sheets
⚡ Zendesk + Google Sheets integration with OAuth2
♻️ Keeps knowledge base fresh without manual effort
🔐 Secure API credential handling

Use Cases

📖 Maintain a live "how-to" guide from real customer queries
🎓 Build self-service documentation for support teams
📩 Monitor and track recurring help topics
💼 Equip knowledge managers with a ready-to-export dataset

Required Integrations

- Zendesk API (for ticket fetch + user info)
- Google Sheets (for storing/updating records)

Why Use This Template?

✅ Automates repetitive data entry
✅ Ensures knowledge base accuracy & freshness
✅ Reduces support team workload
✅ Easy to extend with more tags, filters, or sheet logic
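The tag filter and the description-match dedupe described above can be sketched as follows. The ticket and row shapes are assumptions for illustration, not Zendesk's or the template's exact fields:

```javascript
// Keep only tickets carrying the "howto" tag (the template's filter step).
function filterHowTo(tickets) {
  return tickets.filter(t => (t.tags || []).includes('howto'));
}

// Append-or-update: rows are matched on Description so reruns don't duplicate.
function upsertRow(rows, ticket) {
  const row = {
    'Ticket No.': ticket.id,
    Description: ticket.description,
    Status: ticket.status,
    Tag: 'howto',
  };
  const i = rows.findIndex(r => r.Description === ticket.description);
  if (i === -1) rows.push(row);
  else rows[i] = row; // refresh status/owner on re-sync
  return rows;
}
```

Matching on description means an edited ticket updates its existing sheet row instead of creating a second one.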
by Jitesh Dugar
Accelerate your real estate marketing by moving from "photo capture" to "published listing" in seconds. This workflow automates the entire listing process by hosting property photos via UploadToURL, using GPT-4o Vision to write professional MLS descriptions, and parallel-publishing the results to WordPress and Airtable.

🎯 What This Workflow Does

Turns on-site property photos into fully-enriched digital listings:

📝 **Captures Property Media**: Receives a photo (binary or URL) and basic address metadata via Webhook.
☁️ **Instant CDN Hosting**: UploadToURL converts the photo into a permanent, high-speed CDN link for your website.
👁️ **Intelligent Property Analysis**: GPT-4o Vision analyzes the image to detect room type, condition scores (1–10), professional feature tags, and lighting quality.
✍️ **Automated Copywriting**: Generates a 2-3 sentence, MLS-compliant description using professional real estate terminology.
⚡ **Parallel Publishing**: Simultaneously creates a draft post in WordPress (using Gutenberg blocks) and a new record in your Airtable MLS database.
📲 **Instant Agent Confirmation**: Sends a Telegram message to the agent with live links to the WordPress draft and Airtable record for immediate review.

✨ Key Features

- **UploadToURL Integration**: Native community node hosting ensures your property photos are web-ready instantly, without manual resizing or cloud storage management.
- **Vision-Powered Insights**: AI automatically detects "premium finishes" or "renovation age," providing pricing signals without manual entry.
- **Parallel Execution**: Uses split-branch logic to publish to multiple platforms at once, significantly reducing total execution time.
- **Unified Response**: A Merge node assembles the final IDs and URLs from all platforms into a single, clean JSON response.
- **Audit-Ready MLS**: Every Airtable record is timestamped and includes the original high-res CDN link for external syndication.

💼 Perfect For

- **Real Estate Agencies**: Managing high-volume listings across multiple agents and neighborhoods.
- **Property Managers**: Quickly documenting unit conditions and updates for internal tracking.
- **Independent Realtors**: Automating their personal website and CRM directly from their smartphone while on-site.
- **Property Photographers**: Delivering "ready-to-publish" assets to clients with AI-generated metadata already attached.

🔧 What You'll Need

Required Integrations

- **UploadToURL**: To host property photos and provide CDN links.
- **n8n Community Node**: n8n-nodes-uploadtourl must be installed.
- **OpenAI API**: GPT-4o Vision for professional image analysis and copywriting.
- **WordPress**: Basic Auth or Application Password to create draft posts.
- **Airtable**: Personal Access Token to manage your MLS database.

Optional Integrations

- **Telegram**: To receive real-time notifications with links to your new listings.

🚀 Quick Start

1. Import Template: Copy the JSON and import it into your n8n canvas.
2. Install Node: Ensure the UploadToURL community node is installed.
3. Set Credentials: Link your UploadToURL, OpenAI, WordPress, Airtable, and Telegram accounts.
4. Define Variables: Update the n8n variables WP_BASE_URL, AIRTABLE_BASE_ID, and TELEGRAM_CHAT_ID.
5. Prepare Airtable: Ensure your table has columns for Listing ID, Address, Price, and MLS Description.
6. Deploy: Activate the workflow and start publishing properties instantly from the field.

🎨 Customization Options

- **Watermarking**: Insert a node to add your agency logo to photos before they are uploaded to the CDN.
- **Zillow/MLS Sync**: Add a branch to push the AI-generated data to external listing services via API.
- **Virtual Staging**: Route photos through an AI staging service before hosting them on UploadToURL.
- **Price Formatting**: Update the code node to support different currencies or regional price display formats.

📈 Expected Results

- **Save 20-30 minutes per listing** by eliminating manual uploading, writing, and platform syncing.
- **Improved SEO**: Every property photo includes AI-generated alt text and descriptive filenames.
- **Better Accuracy**: AI consistently captures features (like "crown molding" or "natural light") that agents might miss in a rush.
- **Zero Friction**: Agents can go from taking a photo to having a draft live on the website before they leave the property.

🏆 Use Cases

Rapid Market Entry: A team of agents can document an entire apartment complex in one afternoon, with every room automatically categorized and described in WordPress by the time they get back to the office.

Internal Quality Audits: Property managers use the AI-generated condition score (1–10) to prioritize maintenance and renovations across a portfolio.

Social Media Teasers: The AI-generated "Marketing Blurb" can instantly trigger a second workflow that posts a property "sneak peek" to Instagram or LinkedIn.

💡 Pro Tips

- **Structured Filenames**: The workflow automatically renames files to {listingId}_{address}.jpg for better organization and SEO.
- **Draft Status**: Listings are created as "Drafts" in WordPress by default, allowing for a quick human-in-the-loop review before going live.
- **Lighting Analysis**: Use the AI-detected "Lighting Quality" field to identify photos that might need professional retouching.

Ready to automate your real estate pipeline? Import this template and connect UploadToURL to start publishing professional listings faster.

Questions about the Airtable schema? The workflow includes detailed sticky notes explaining the exact field types required for the MLS record sync.
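The {listingId}_{address}.jpg renaming mentioned in Pro Tips could look like the sketch below. The exact sanitization rules are an assumption; the template's code node may differ:

```javascript
// Builds an SEO-friendly filename of the form {listingId}_{address}.jpg.
// The slug rules here (lowercase, hyphens) are illustrative assumptions.
function listingFilename(listingId, address) {
  const slug = address
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-') // collapse spaces/punctuation to hyphens
    .replace(/^-+|-+$/g, '');    // trim leading/trailing hyphens
  return `${listingId}_${slug}.jpg`;
}
```

Descriptive filenames like this double as alt-text-friendly identifiers when the photo is syndicated from the CDN.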
by Jitesh Dugar
Transform your morning routine with an automated personal assistant that delivers everything you need to know directly to WhatsApp. This workflow aggregates live data from multiple sources and uses OpenAI to greet you with a context-aware, motivational message based on your specific day.

🎯 What This Workflow Does

This template creates a highly personalized "Daily Digest" that saves you from checking multiple apps every morning:

⚡ **Dual Entry Points**: The briefing can fire automatically every morning at 7 AM via a Schedule Trigger, or be requested on demand by texting the word "brief" to your WATI number.
🌤️ **Real-time Environment Data**: Fetches current weather conditions (temperature, humidity, and wind speed) for your specific city using the OpenWeatherMap API.
📰 **Custom News Feed**: Pulls the top 3 headlines based on your personal interests (e.g., technology, business) via NewsAPI.
📅 **Agenda Integration**: Syncs with Google Calendar and Google Tasks to list your upcoming meetings and high-priority to-do items for the day.
🤖 **AI-Powered Greetings**: OpenAI (GPT-4o) analyzes your weather and schedule to write a unique, 15-word opener that sets the tone for your day.

✨ Key Features

- **Self-Service Subscription**: Users can join or leave the service themselves by texting "subscribe" or "stop".
- **Subscriber-Specific Config**: Supports multiple users, each with their own city and news interest preferences.
- **Intelligent Assembly**: A central Code node formats all data into a clean, emoji-rich WhatsApp card for easy reading.
- **Reliability Fallbacks**: Designed with error handling to ensure the briefing still sends even if one data source (like weather) is temporarily unavailable.

💼 Perfect For

- **Busy Professionals**: Getting a snapshot of your day before your first coffee.
- **Remote Workers**: Staying connected to global news and local weather.
- **Productivity Enthusiasts**: Consolidating multiple task and calendar apps into one interface.
- **Community Groups**: Providing a daily value-add service to WhatsApp group members.

🔧 What You'll Need

Required Integrations

- **WATI**: For WhatsApp messaging and command triggers.
- **Google OAuth2**: For Calendar and Tasks access.
- **OpenAI API**: For generating the daily personalized opener.

Configuration Steps

1. API Keys: Obtain free keys for OpenWeatherMap and NewsAPI.
2. Subscriber List: Update the Load User Config node with your phone number and city.
3. Credentials: Connect your Google and OpenAI accounts in n8n.

Ready to wake up to a better morning? Import this template and connect your accounts to start receiving your daily briefings!
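The "Intelligent Assembly" Code node might assemble the WhatsApp card roughly like this. The field names and layout are illustrative assumptions, not the template's exact code:

```javascript
// Assemble the emoji-rich WhatsApp briefing card from the fetched data.
// Input shape ({ name, weather, headlines, events }) is an assumption.
function buildBriefing({ name, weather, headlines, events }) {
  const lines = [
    `☀️ Good morning, ${name}!`,
    `🌤️ ${weather.city}: ${weather.temp}°C, ${weather.desc}`,
    '📰 Top headlines:',
    ...headlines.slice(0, 3).map((h, i) => `  ${i + 1}. ${h}`), // cap at 3
    '📅 Today:',
    ...(events.length ? events.map(e => `  • ${e}`) : ['  • No meetings 🎉']),
  ];
  return lines.join('\n');
}
```

The AI-generated 15-word opener would typically replace or precede the first line before the card is sent via WATI.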
by DIGITAL BIZ TECH
SharePoint → Supabase → Google Drive Sync Workflow

Overview

This workflow is a multi-system document synchronization pipeline built in n8n, designed to automatically sync and back up files between Microsoft SharePoint, Supabase/Postgres, and Google Drive. It runs on a scheduled trigger, compares SharePoint file metadata against your Supabase table, downloads new or updated files, uploads them to Google Drive, and marks records as completed, keeping your databases and storage systems perfectly in sync.

Workflow Structure

- **Data Source:** SharePoint REST API for recursive folder and file discovery.
- **Processing Layer:** n8n logic for filtering, comparison, and metadata normalization.
- **Destination Systems:** Supabase/Postgres for metadata, Google Drive for file backup.

SharePoint Sync Flow (Frontend Flow)

- **Trigger:** Schedule Trigger. Runs at fixed intervals (customizable) to start synchronization.
- **Fetch Files:** Microsoft SharePoint HTTP Request. Recursively retrieves folders and files using SharePoint's REST API: /GetFolderByServerRelativeUrl(...)?$expand=Files,Folders,Folders/Files,Folders/Folders/Folders/Files
- **Filter Files:** filter files. A Code node that flattens nested folders and filters unwanted file types: excludes system or temporary files (~$) and the extensions .db, .msg, .xlsx, .xlsm, .pptx.
- **Normalize Metadata:** normalize last modified date. Ensures a consistent Last_modified_date format for accurate comparison.
- **Fetch Existing Records:** Supabase (Get). Retrieves current entries from n8n_metadata to compare against SharePoint files.
- **Compare Datasets:** Compare Datasets. Detects new or modified files based on UniqueId, Last_modified_date, and Exists, and routes only changed entries forward for processing.

File Processing Engine (Backend Flow)

- **Loop:** Loop Over Items2. Iterates through each new or updated file detected.
- **Build Metadata:** get metadata and Set metadata. Constructs the final metadata fields (file_id, file_title, file_url, file_type, foldername, last_modified_date) and generates fileUrl from UniqueId and ServerRelativeUrl if missing.
- **Upsert Metadata:** Insert Document Metadata. Inserts or updates file records in the Supabase/Postgres n8n_metadata table (operation: upsert, with id as the primary matching key).
- **Download File:** Microsoft SharePoint HTTP Request1. Fetches the binary file directly from SharePoint using its ServerRelativeUrl.
- **Rename File:** rename files. Renames each downloaded binary file to its original file_title before upload.
- **Upload File:** Upload file. Uploads the renamed file to Google Drive (My Drive → root folder).
- **Mark Complete:** Postgres. Updates the Supabase/Postgres record, setting Loading Done = true.
- **Optional Cleanup:** Supabase1. Deletes obsolete or invalid metadata entries when required.

Integrations Used

| Service | Purpose | Credential |
|----------|----------|-------------|
| Microsoft SharePoint | File retrieval and download | microsoftSharePointOAuth2Api |
| Supabase / Postgres | Metadata storage and synchronization | Supabase account 6 ayan |
| Google Drive | File backup and redundancy | Google Drive account 6 rn dbt |
| n8n Core | Flow control, dataset comparison, batch looping | Native |

System Prompt Summary

> "You are a SharePoint document synchronization workflow. Fetch all files, compare them to database entries, and only process new or modified files. Download files, rename correctly, upload to Google Drive, and mark as completed in Supabase."

Workflow rule summary:

> "Maintain data integrity, prevent duplicates, handle retries gracefully, and continue on errors. Skip excluded file types and ensure reliable backups between all connected systems."

Key Features

- Scheduled automatic sync across SharePoint, Supabase, and Google Drive
- Intelligent comparison to detect only new or modified files
- Idempotent upsert for consistent metadata updates
- Configurable file exclusion filters
- Safe rename + upload pipeline for clean backups
- Error-tolerant and fully automated operation

Summary

> A reliable SharePoint-to-Google Drive synchronization workflow built with n8n, integrating Supabase/Postgres for metadata management. It automates file fetching, filtering, downloading, uploading, and marking as completed, ensuring your data stays mirrored across platforms. Perfect for enterprises managing document automation, backup systems, or cross-cloud data synchronization.

Need Help or More Workflows?

Want to customize this workflow for your organization? Our team at Digital Biz Tech can extend it for enterprise-scale document automation, RAG, and social media automation. We can help you set it up for free, from connecting credentials to deploying it live.

Contact: rajeet.nair@digitalbiz.tech
Website: https://www.digitalbiz.tech
LinkedIn: https://www.linkedin.com/company/digital-biz-tech/

You can also DM us on LinkedIn for any help.
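For reference, the exclusion rules described in the "filter files" step can be sketched as a small predicate. This is a minimal illustration of the documented rules, not the workflow's actual Code node:

```javascript
// Exclusion rules from the "filter files" step: drop SharePoint temp files
// (the ~$ prefix) and the listed extensions.
const EXCLUDED_EXTENSIONS = ['.db', '.msg', '.xlsx', '.xlsm', '.pptx'];

function keepFile(fileName) {
  if (fileName.startsWith('~$')) return false; // system/temporary file
  const lower = fileName.toLowerCase();
  return !EXCLUDED_EXTENSIONS.some(ext => lower.endsWith(ext));
}
```

Files that pass this predicate continue into metadata normalization and the dataset comparison.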
by Robert Breen
This n8n workflow template automatically monitors your Google Sheets for new entries and uses AI to generate detailed descriptions for each topic. Perfect for content creators, researchers, project managers, or anyone who needs automatic content generation based on simple topic inputs.

What This Workflow Does

This automated workflow:

1. Monitors a Google Sheet for new rows added to the "data" tab.
2. Takes the topic from each new row.
3. Uses OpenAI GPT to generate a detailed description of that topic.
4. Updates the same row with the AI-generated description.
5. Logs all activity in a separate "actions" tab for tracking.

The workflow runs every minute, checking for new entries and processing them automatically.

Tools & Services Used

- **n8n**: Workflow automation platform
- **OpenAI API**: AI-powered description generation (GPT-4.1-mini)
- **Google Sheets**: Data input, storage, and activity logging
- **Google Sheets Trigger**: Real-time monitoring for new rows

Prerequisites

Before implementing this workflow, you'll need:

- An n8n instance (self-hosted or cloud)
- An OpenAI API account for AI description generation
- A Google account for Google Sheets integration
- Google Sheets API access for both reading and writing to sheets

Step-by-Step Setup Instructions

Step 1: Set Up OpenAI API Access

1. Visit OpenAI's API platform.
2. Create an account or log in.
3. Navigate to the API Keys section.
4. Generate a new API key.
5. Copy and securely store your API key.

Step 2: Set Up Your Google Sheets

Option 1: Use Our Pre-Made Template (Recommended)

1. Copy our template: AI Description Generator Template.
2. Click "File" → "Make a copy" to create your own version.
3. Rename it as desired (e.g., "My AI Content Generator").
4. Note your new sheet's URL; you'll need this for the workflow.

Option 2: Create From Scratch

1. Go to Google Sheets and create a new spreadsheet.
2. Set up the main "data" tab: rename "Sheet1" to "data" and set up column headers in row 1 (A1: topic, B1: description).
3. Create an "actions" tab: add a new sheet, name it "actions", and set up the header A1: Update.
4. Copy your sheet's URL.

Step 3: Configure Google API Access

Enable the Google Sheets API:

1. Go to the Google Cloud Console.
2. Create a new project or select an existing one.
3. Enable the "Google Sheets API" and the "Google Drive API".

Create a Service Account (for n8n):

1. In the Google Cloud Console, go to "IAM & Admin" → "Service Accounts".
2. Create a new service account.
3. Download the JSON credentials file.
4. Share your Google Sheet with the service account email address.

Step 4: Import and Configure the n8n Workflow

Import the workflow:

1. Copy the workflow JSON from the template.
2. In your n8n instance, go to Workflows → Import from JSON.
3. Paste the JSON and import.

Configure OpenAI credentials:

1. Click on the "OpenAI Chat Model" node.
2. Set up credentials using your OpenAI API key.
3. Test the connection to ensure it works.

Configure the Google Sheets integration:

For the trigger node:

1. Click on the "Row added - Google Sheet" node.
2. Set up Google Sheets Trigger OAuth2 credentials.
3. Select your spreadsheet from the dropdown and choose the "data" sheet.
4. Set polling to "Every Minute" (already configured).

For the update node:

1. Click on the "Update row in sheet" node.
2. Use the same Google Sheets credentials.
3. Select your spreadsheet and the "data" sheet.
4. Verify the column mapping (topic → topic, description → AI output).

For the actions log node:

1. Click on the "Append row in sheet" node.
2. Use the same Google Sheets credentials.
3. Select your spreadsheet and the "actions" sheet.

Step 5: Customize the AI Description Generator

The workflow uses a simple prompt that can be customized. Click on the "Description Writer" node and modify the system message to change the AI behavior:

write a description of the topic. output like this. { "description": "description" }

Need Help with Implementation?
For professional setup, customization, or troubleshooting of this workflow, contact:

Robert - Ynteractive Solutions
- **Email**: robert@ynteractive.com
- **Website**: www.ynteractive.com
- **LinkedIn**: linkedin.com/in/robert-breen-29429625/

Specializing in AI-powered workflow automation, business process optimization, and custom integration solutions.
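Since the Step 5 prompt asks the model for JSON of the form { "description": "..." }, a downstream node needs to parse that reply. Here is a minimal sketch of such a parser; it is an illustration, not part of the template:

```javascript
// Parses the Description Writer's reply. The regex fallback tolerates extra
// prose around the JSON object (a common LLM quirk).
function parseDescription(reply) {
  const match = reply.match(/\{[\s\S]*\}/); // grab the outermost {...}
  if (!match) return null;
  try {
    return JSON.parse(match[0]).description ?? null;
  } catch {
    return null; // malformed JSON: let the workflow handle the failure
  }
}
```

Returning null instead of throwing lets the workflow retry or log the row in the "actions" tab rather than failing outright.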
by Muhammad Asadullah
Short Description (for listing)

Import products from Google Sheets to Shopify with automatic handling of single products and multi-variant products (sizes, colors, etc.). Includes SKU management, inventory tracking, and image uploads via the GraphQL API.

Category

E-commerce, Productivity, Data Import/Export

Full Description

Overview

This workflow automates the process of importing products from a Google Sheet into your Shopify store. It intelligently detects and handles both simple products and products with multiple variants (like different sizes or colors), creating them with proper SKU management, pricing, inventory levels, and images.

Key Features

✅ Dual Product Support: Handles single products and multi-variant products automatically
✅ Smart SKU Parsing: Automatically groups variants by parsing the SKU format (e.g., 12345-SM, 12345-MD)
✅ Inventory Management: Sets stock levels for each variant at your default location
✅ Image Upload: Attaches product images from URLs
✅ GraphQL API: Uses Shopify's modern GraphQL API for reliable product creation
✅ Batch Processing: Processes multiple products in one workflow run

Use Cases

- Initial store setup with bulk product import
- Regular inventory updates from a spreadsheet
- Migrating products from another platform
- Managing seasonal product catalogs
- Synchronizing products with external systems

Requirements

- Shopify store with Admin API access
- Google Sheets API credentials
- n8n version 1.0+
- Basic understanding of GraphQL (helpful but not required)

What You'll Need to Configure

- Shopify Admin API token
- Your Shopify store URL (in the 'set store url' node)
- Google Sheets connection
- (Optional) Vendor name and product type defaults

Input Format

Your Google Sheet should contain these columns:

- Product Name
- SKU (format: BASESKU-VARIANT for variants)
- Size (or other variant option)
- Price
- On hand Inventory
- Product Image (URL)

Products with the same name are automatically grouped as variants.
How It Works

1. Reads product data from your Google Sheet.
2. Groups products by name and detects whether they have variants.
3. Switches to the appropriate creation path (single or variant).
4. Creates the product in Shopify with options and variants.
5. Updates each variant with SKU and pricing.
6. Sets inventory levels at your location.
7. Uploads product images.

Technical Details

- Uses the Shopify GraphQL Admin API (2025-04)
- Handles up to 100 variants per product
- Processes variants individually for accurate data mapping
- Includes error handling for missing data
- Supports one inventory location per run

Common Modifications

- Change the vendor name and product type
- Add more variant options (color, material, etc.)
- Customize product status (draft vs. active)
- Modify inventory location selection
- Add product descriptions

Perfect For

- Shopify store owners managing large catalogs
- E-commerce managers doing bulk imports
- Agencies setting up client stores
- Developers building automated product workflows

Difficulty: Intermediate
Estimated Setup Time: 15-30 minutes
Nodes Used: 16
External Services: Shopify, Google Sheets
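The grouping step (rows sharing a Product Name become variants, with the base SKU taken from before the last hyphen, e.g. 12345-SM and 12345-MD share base 12345) can be sketched as follows. The row field names are assumptions for illustration:

```javascript
// Split a SKU like "12345-SM" into base "12345" and variant "SM".
function splitSku(sku) {
  const i = sku.lastIndexOf('-');
  return i === -1
    ? { base: sku, variant: null }              // simple product
    : { base: sku.slice(0, i), variant: sku.slice(i + 1) };
}

// Group sheet rows into products: same name → same product, one variant per row.
function groupProducts(rows) {
  const byName = new Map();
  for (const row of rows) {
    if (!byName.has(row.name)) byName.set(row.name, { name: row.name, variants: [] });
    const { base, variant } = splitSku(row.sku);
    byName.get(row.name).variants.push({
      sku: row.sku,
      baseSku: base,
      option: variant ?? row.size,
      price: row.price,
    });
  }
  return [...byName.values()];
}
```

Products with a single variant and no suffix would then take the "single product" creation path; multi-variant groups take the variant path.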
by Beex
Summary

This workflow detects ticket classification events in Beex where the communication channel was WhatsApp, extracts the messages from the interaction, and logs them as an activity on the corresponding HubSpot contact.

How it works

1. **Beex Trigger**: Receives the ticket classification event (On Management Create) via a pre-configured callback.
2. **Filter by Channel**: The automation only considers classification events where the communication channel was a WhatsApp message.
3. **Get Phone**: The phone number is used to find the contact to whom the activity should be assigned in HubSpot. The country code must be configured manually.
4. **Search Contact**: Finds the contact in HubSpot using the phone number.
5. **Get Messages**: When a ticket is categorized, its ID and all messages from the interaction can be retrieved from the trigger node.
6. **Routing, Formatting, and Consolidation**: Messages are routed based on their content (text, image, or audio), and each message is formatted in HTML, compatible with HubSpot activities.
7. **Sort Messages**: Messages are sorted by their created_at field.
8. **Consolidate Chats**: Individual messages are consolidated into a single record (all in HTML format), and the full chat content is sent to hs_communication_body via the HubSpot API.

Setup Instructions

1. Install Beex nodes: Before importing the template, install the Beex trigger and node packages using the package name n8n-beex-nodes.
2. Configure HubSpot credentials: Configure your HubSpot connection with an access token (typically from a private application) and read/write permissions for the Contacts objects.
3. Configure Beex credentials: For Beex users with platform access (for testing requests, contact frank@beexcc.com), go to Platform Settings → API Key and Callback, copy your API key, and paste it into the Beex node (Get Messages) in n8n.
4. Activate the Typing Registry option under Callback Integration.
5. Configure the Webhook URL: Copy the Webhook URL (Test/Production) from the Beex activation node and paste it into the Callback Integration section in Beex. Save your changes.

Requirements

- **HubSpot**: An account with a private application token and read/write permissions for the Contacts objects.
- **Beex**: An account with permissions to receive Typing Registry events in Callback Integration.

Customization Options

You can customize the HTML format applied to text, audio, or image messages.
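The sort-and-consolidate steps described above can be sketched as follows. The message shape and HTML markup are illustrative assumptions, not the workflow's exact formatting:

```javascript
// Order messages by created_at, format each as HubSpot-compatible HTML,
// then join them into one body for the hs_communication_body field.
function consolidateChat(messages) {
  return messages
    .slice() // don't mutate the input
    .sort((a, b) => new Date(a.created_at) - new Date(b.created_at))
    .map(m => {
      if (m.type === 'image' || m.type === 'audio') {
        return `<p><i>[${m.type}]</i> <a href="${m.url}">${m.url}</a></p>`;
      }
      return `<p><b>${m.from}:</b> ${m.text}</p>`;
    })
    .join('\n');
}
```

Consolidating into a single HTML record keeps the whole WhatsApp conversation readable as one activity on the HubSpot contact timeline.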
by Yatharth Chauhan
# Feedback Sentiment Workflow (Typeform → GCP → Notion/Slack/Trello)

This template ingests feedback from Typeform, runs Google Cloud Natural Language sentiment analysis, routes based on sentiment, and then creates a Notion database page and posts a Slack notification for positive items, or creates a Trello card for negative items. The flow is designed for quick setup and safe sharing, using placeholders for IDs and credentials.

## How it Works

1. **Typeform Trigger:** Captures each new submission and exposes answers such as Name and the long-text Feedback field.
2. **Google Cloud Natural Language:** Analyzes the feedback text and returns a sentiment score in `documentSentiment.score`.
3. **Check Sentiment Score (IF):** True branch: score > 0 → positive. False branch: score ≤ 0 → non-positive.
4. **Add Feedback to Notion (true branch):** Creates a new page in a Notion database with mapped properties.
5. **Notify Slack (after Notion):** Posts the feedback, author, and score to a Slack channel for visibility.
6. **Create Trello Card (false branch):** Logs non-positive items to a Trello list for follow-up.

## Required Accounts

- **Google Cloud Natural Language API** enabled (OAuth2 or service credentials).
- **Notion integration** with database access to create pages.
- **Slack app/bot token** with permission to post to the target channel.
- **Typeform account** with a form including a long-text feedback question and a name field.

## Notion Database Columns

- **Name (title):** Person name or responder label
- **Feedback (rich_text):** Full feedback text
- **Sentiment Score (number):** Numeric score from GCP, ∈ [-1, 1]
- **Source (select/text):** "Typeform" for provenance
- **Submitted At (date):** Timestamp from the trigger

## Customization Options

- **Sentiment Threshold:** Adjust the IF condition (e.g., ≥ 0.25) for stricter positivity.
- **Slack Routing:** Change the channel, or add blocks/attachments for richer summaries.
- **Trello Path:** Point to a triage list and include labels for priority.
- **Field Mapping:** Update the feedback-question expression to match the Typeform label.
- **Database Schema:** Add tags, product area, or customer tier for reporting.

## Setup Steps

1. Connect credentials: Typeform, GCP Natural Language, Notion, Slack, Trello.
2. Replace placeholders in the workflow JSON: form ID, database ID, Slack channel, Trello list ID.
3. Map fields: set the Feedback and Name expressions from the Typeform Trigger output into Notion and Slack.
4. Adjust the IF threshold for your definition of "positive".
5. Test with a sample response and confirm Notion page creation, the Slack notification, and Trello card logging.
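The IF-node routing and the Notion property mapping described above can be sketched in Python. The Notion property names are assumed to match the database columns listed earlier; adjust them to your actual schema:

```python
def route_feedback(score, threshold=0.0):
    """Mirror the IF node: a score above the threshold takes the positive
    branch (Notion page + Slack post); anything else goes to Trello.
    documentSentiment.score ranges over [-1, 1] in the GCP NL API."""
    return "notion+slack" if score > threshold else "trello"


def notion_properties(name, feedback, score):
    """Build the Notion page-properties payload for the columns described
    above; the property names must match your database exactly."""
    return {
        "Name": {"title": [{"text": {"content": name}}]},
        "Feedback": {"rich_text": [{"text": {"content": feedback}}]},
        "Sentiment Score": {"number": score},
        "Source": {"select": {"name": "Typeform"}},
    }
```

Raising `threshold` (e.g., to 0.25) implements the stricter-positivity customization without touching the rest of the flow.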
by Margo Rey
# Generate and send MadKudu Account Brief into Outreach

This workflow generates an account brief tailored to your company using MadKudu MCP and OpenAI, and syncs it to a custom field in Outreach. It's for sales teams who want to give reps rich account context right inside Outreach, and to draft Outreach emails with the Outreach Revenue Agent based on the MadKudu account brief.

## ✨ Who it's for

- RevOps or GTM teams using MadKudu + Salesforce + Outreach
- Sales teams needing dynamic, AI-generated context for target accounts

## 🔧 How it works

1. **Select Accounts:** Use a Salesforce node to define which accounts to brief. The filter logic can be updated to match ICP or scoring rules (e.g., MadKudu Fit + LTB).
2. **Generate Brief with MadKudu MCP & AI:** MadKudu MCP provides the account brief instructions, researches recent company news online, and provides structured account context from the integrations connected to MadKudu plus external signals (firmographics, past opportunities, active contacts, job openings, and more). The AI agent (an OpenAI model) turns this into a readable account brief.
3. **Send to Outreach:** Match the account in Outreach via domain, then update a custom field (e.g., `custom49`) with the brief text.

## 📋 How to set up

1. **Connect your Salesforce account:** Used to pull the accounts that need a brief.
2. **Set your OpenAI credentials:** Required for the AI agent to generate the brief.
3. **Create an n8n variable named `madkudu_api_key`** to store your MadKudu API key, used by the MadKudu MCP tool. The AI agent pulls the account brief instructions and all the context necessary to generate the briefs.
4. **Create an OAuth2 API credential** to connect your Outreach account: Used to sync the brief to Outreach.
5. **Customize the Salesforce filter:** In the "Get accounts" node, define which accounts should get a brief (e.g., Fit > 90).
6. **Map your Outreach custom field:** Update the JSON body of the request with your actual custom field ID (e.g., `custom49`).

## 🔑 How to connect Outreach

1. In n8n, add a new OAuth2 API credential and copy the callback URL.
2. Go to the Outreach developer portal.
3. Click "Add" to create a new app.
4. In Feature selection, add Outreach API (OAuth).
5. In API Access (OAuth), set the redirect URI to the n8n callback.
6. Select the following scopes: `accounts.read`, `accounts.write`.
7. Save in Outreach.
8. Enter the Outreach Application ID into the n8n Client ID and the Outreach Application Secret into the n8n Client Secret.
9. Save in n8n and connect your Outreach account via OAuth.

## ✅ Requirements

- MadKudu account with access to an API key
- Salesforce OAuth
- Outreach admin permissions to create an app
- OpenAI API key

## 🛠 How to customize the workflow

- **Change the targeting logic:** Edit the Salesforce filter to control which accounts are eligible.
- **Rewrite the prompt:** Tweak the prompt in the AI Agent node to adjust the format, tone, or insights included in the brief.
- **Change the Outreach account field:** Update the Outreach field where the brief is synced if you're using a different custom field (e.g., `custom48`, `custom32`, etc.).
- **Use a different trigger:** Swap the manual trigger for a Schedule or Webhook trigger to automate the flow end-to-end.
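The "match by domain, then update a custom field" step can be sketched outside n8n as well. This is a minimal sketch, assuming the Outreach v2 JSON:API conventions and a `filter[domain]` query parameter on the accounts endpoint; the token placeholder and `custom49` field are examples from the template, not fixed values:

```python
import json
import urllib.parse
import urllib.request

OUTREACH_API = "https://api.outreach.io/api/v2"
TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"  # placeholder: token from the OAuth2 credential


def brief_patch_body(account_id, brief, field="custom49"):
    """JSON:API body for updating one custom field on an Outreach account.
    custom49 is the example field from the template; substitute your own."""
    return {"data": {"type": "account", "id": account_id,
                     "attributes": {field: brief}}}


def sync_brief(domain, brief):
    """Match the Outreach account by domain, then PATCH the brief onto it."""
    headers = {"Authorization": f"Bearer {TOKEN}",
               "Content-Type": "application/vnd.api+json"}
    query = urllib.parse.urlencode({"filter[domain]": domain})
    with urllib.request.urlopen(urllib.request.Request(
            f"{OUTREACH_API}/accounts?{query}", headers=headers)) as resp:
        accounts = json.load(resp)["data"]
    if not accounts:
        return None  # no matching account for this domain
    account_id = accounts[0]["id"]
    body = json.dumps(brief_patch_body(account_id, brief)).encode()
    req = urllib.request.Request(f"{OUTREACH_API}/accounts/{account_id}",
                                 data=body, headers=headers, method="PATCH")
    with urllib.request.urlopen(req):
        pass
    return account_id
```

In the workflow itself this PATCH body is what you edit when mapping a different custom field (step 6 of the setup).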
by Mantaka Mahir
# Automate Google Classroom: Topics, Assignments & Student Tracking

Automate Google Classroom via the Google Classroom API to efficiently manage courses, topics, teachers, students, announcements, and coursework.

## Use Cases

- **Educational Institution Management:** Sync rosters, post weekly announcements, and generate submission reports automatically.
- **Remote Learning Coordination:** Batch-create assignments, track engagement, and auto-notify teachers of new submissions.
- **Training Program Automation:** Automate training modules, manage enrollments, and generate completion/compliance reports.

## Prerequisites

- n8n (cloud or self-hosted)
- Google Cloud Console access for OAuth setup
- Google Classroom API enabled
- **Google Gemini API key** (free) for the agent brain, or swap in any other LLM if preferred

## Setup Instructions

### Step 1: Google Cloud Project

1. Create a new project in the Google Cloud Console.
2. Enable the Google Classroom API.
3. Create OAuth 2.0 Client ID credentials.
4. Add your n8n OAuth callback URL as a redirect URI.
5. Note down the Client ID and Client Secret.

### Step 2: OAuth Setup in n8n

1. In n8n, open HTTP Request Node → Authentication → Predefined Credential Type.
2. Select Google OAuth2 API.
3. Enter your Client ID and Client Secret.
4. Click "Connect my account" to complete authorization.
5. Test the connection.

### Step 3: Import & Configure Workflow

1. Import this workflow template into n8n.
2. Link all Google Classroom nodes to your OAuth credential.
3. Configure the webhook if using external triggers.
4. Test each agent for API connectivity.

### Step 4: Customization

You can customize each agent's prompt to your liking for optimal results, or copy and modify node code to expand functionality. All operations use HTTP Request nodes, so you can integrate more tools via the Google Classroom API documentation. This workflow provides a strong starting point for deeper automation and integration.

## Features

- **Course Topics:** List, create, update, or delete topics within a course.
- **Teacher & Student Management:** List, retrieve, and manage teachers and students programmatically.
- **Course Posts:** List posts, retrieve details and attachments, and access submission data.
- **Announcements:** List, create, update, or delete announcements across courses.
- **Courses:** List all courses, get detailed information, and view grading periods.
- **Coursework:** List, retrieve, or analyze coursework within any course.

## Notes

Once OAuth and the LLM connection are configured, this workflow automates all Google Classroom operations. Its modular structure lets you activate only what you need, saving API quota and improving performance.
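Because every operation in this template is a plain HTTP Request node, the same calls can be sketched directly against the Classroom REST API. A minimal sketch for the Course Topics feature, assuming a valid OAuth token from the setup above (the token value and course ID are placeholders):

```python
import json
import urllib.request

CLASSROOM_API = "https://classroom.googleapis.com/v1"
TOKEN = "YOUR_OAUTH_ACCESS_TOKEN"  # placeholder from the OAuth setup above


def topic_endpoint(course_id):
    """Path used by the Course Topics agent; list (GET) and create (POST)
    share this endpoint in the Classroom API."""
    return f"/courses/{course_id}/topics"


def classroom_request(method, path, body=None):
    """Minimal helper mirroring the template's HTTP Request nodes."""
    req = urllib.request.Request(
        f"{CLASSROOM_API}{path}",
        data=None if body is None else json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method=method,
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example usage (requires a valid token and real course ID):
# classroom_request("GET", topic_endpoint("12345"))                    # list topics
# classroom_request("POST", topic_endpoint("12345"), {"name": "Week 1"})  # create topic
```

The other features (announcements, coursework, rosters) follow the same pattern with their respective `/v1/courses/{courseId}/...` paths from the Google Classroom API documentation.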