by Iternal Technologies
Blockify® Technical Manual Data Optimization Workflow

Blockify optimizes data for technical manual RAG and agents, giving structure to unstructured data for ~78X accuracy when pairing Blockify Ingest and Blockify Distill.

Learn more: https://iternal.ai/blockify
Get free demo API access: https://console.blockify.ai/signup
Read the technical whitepaper: https://iternal.ai/blockify-results
See an example accuracy comparison: https://iternal.ai/case-studies/medical-accuracy/

Blockify is a data optimization tool that takes messy, unstructured text, like hundreds of sales-meeting transcripts or long proposals, and intelligently optimizes the data into small, easy-to-understand "IdeaBlocks." Each IdeaBlock is just a couple of sentences in length that captures one clear idea, plus a built-in contextualized question and answer. With this approach, Blockify improves the accuracy of LLMs (Large Language Models) by an average aggregate of 78X while shrinking the original mountain of text to about 2.5% of its size and keeping (and even improving) the important information.

When Blockify's IdeaBlocks are compared with the usual method of breaking text into equal-sized chunks, the results are dramatic. Answers pulled from the distilled IdeaBlocks are roughly 40X more accurate, and user searches return the right information about 52% more often. In short, Blockify lets you store less data, spend less on computing, and still get better answers, turning huge documents into a concise, high-quality knowledge base that anyone can search quickly.

Blockify works by processing chunks of text to create structured data from an unstructured data source. Blockify® replaces the traditional "dump-and-chunk" approach with an end-to-end pipeline that cleans and organizes content before it ever hits a vector store. Admins first define who should see what, then the system ingests any file type (Word, PDF, slides, images) inside public cloud, private cloud, or on-prem. A context-aware splitter finds natural breaks, and a series of specially developed Blockify LLM models turns each segment into a draft IdeaBlock. GenAI systems fed with this curated data return sharper answers, hallucinate far less, and comply with security policies out of the box. The result: higher trust, lower operating cost, and a clear path to enterprise-scale RAG without the cleanup headaches that stall most AI rollouts.
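To make the IdeaBlock concept concrete, a single block pairing one idea with its contextualized question and answer could look like the following sketch. The field names are illustrative only and are not Blockify's published schema; the content simply restates facts from the description above.

```json
{
  "name": "Blockify data distillation ratio",
  "critical_question": "How much does Blockify shrink a source corpus during distillation?",
  "trusted_answer": "Blockify distills unstructured source text to roughly 2.5% of its original size while keeping, and often improving, the key information.",
  "tags": ["blockify", "distillation", "rag"]
}
```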
by Rahul Joshi
Description: Keep your customer knowledge base up to date with this n8n automation template. The workflow connects Zendesk with Google Sheets, automatically fetching tickets tagged as "howto," enriching them with requester details, and saving them into a structured spreadsheet. This ensures your internal or public knowledge base reflects the latest customer how-to queries without manual copy-pasting. Perfect for customer support teams, SaaS companies, and service providers who want to streamline documentation workflows.

What This Template Does (Step-by-Step)
⚡ Manual Trigger or Scheduling: run the workflow manually for testing/troubleshooting, or configure a schedule trigger for daily/weekly updates.
📥 Fetch All Zendesk Tickets: connects to your Zendesk account and retrieves all available tickets.
🔍 Filter for "howto" Tickets Only: processes only tickets that contain the "howto" tag, ensuring relevance.
👤 Enrich User Data: fetches requester details (name, email, profile info) to provide context.
📊 Update Google Sheets Knowledge Base: saves ticket data including Ticket No., Description, Status, Tag, Owner Name, and Email. ✔️ Smart update prevents duplicates by matching on description.
🔁 Continuous Sync: each new or updated "howto" ticket is synced automatically into your knowledge base sheet.

Key Features
🔍 Tag-based filtering for precise categorization
📊 Smart append-or-update logic in Google Sheets
⚡ Zendesk + Google Sheets integration with OAuth2
♻️ Keeps knowledge base fresh without manual effort
🔐 Secure API credential handling

Use Cases
📖 Maintain a live "how-to" guide from real customer queries
🎓 Build self-service documentation for support teams
📩 Monitor and track recurring help topics
💼 Equip knowledge managers with a ready-to-export dataset

Required Integrations
Zendesk API (for ticket fetch + user info)
Google Sheets (for storing/updating records)

Why Use This Template?
✅ Automates repetitive data entry
✅ Ensures knowledge base accuracy & freshness
✅ Reduces support team workload
✅ Easy to extend with more tags, filters, or sheet logic
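For the tag-filtering step, one way to keep only "howto" tickets is an IF or Filter node condition on the Zendesk ticket's tags array. A minimal expression sketch, assuming the Zendesk node outputs the standard ticket JSON with a `tags` field:

```javascript
// IF node condition (boolean): keep the item only when the ticket carries the "howto" tag
{{ Array.isArray($json.tags) && $json.tags.includes("howto") }}
```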
by Mohamed Abdelwahab
1. Overview

The IngestionDocs workflow is a fully automated document ingestion and knowledge management system built with n8n. Its purpose is to continuously ingest organizational documents from Google Drive, transform them into vector embeddings using OpenAI, store them in Pinecone, and make them searchable and retrievable through an AI-powered Q&A interface. This ensures that employees always have access to the most up-to-date knowledge base without requiring manual intervention.

2. Key Objectives

- **Automated Ingestion** → Seamlessly process new and updated documents from Google Drive.
- **Change Detection** → Track and differentiate between new, updated, and previously processed documents.
- **Knowledge Base Construction** → Convert documents into embeddings for semantic search.
- **AI-Powered Assistance** → Provide an intelligent Q&A system for employees to query manuals.
- **Scalable & Maintainable** → Modular design using n8n, LangChain, and Pinecone.

3. Workflow Breakdown

A. Document Monitoring and Retrieval
The workflow begins with two Google Drive triggers:
- File Created Trigger → fires when a new document is uploaded.
- File Updated Trigger → fires when an existing document is modified.
A search operation lists the files in the designated Google Drive folder, and non-downloadable items (e.g., subfolders) are filtered out. Each valid file is downloaded and a SHA256 hash is generated to uniquely identify its content.

B. Record Management (Google Sheets Integration)
To keep track of ingestion states, the workflow uses a Google Sheets-based Record Manager. Each file entry contains:
- **Id** (Google Drive file ID)
- **Name** (file name)
- **hashId** (SHA256 checksum)
The workflow compares the current file's hash with the stored one:
- **New Document** → file not found in records → inserted into the Record Manager.
- **Already Processed** → file exists and hash matches → skipped.
- **Updated Document** → file exists but hash differs → record is updated.
This guarantees that only new or modified content is processed, avoiding duplication. A minimal sketch of this decision logic appears below.

C. Document Processing and Vectorization
Once a document is marked as new or updated:
- The Default Data Loader extracts its content (binary files supported), pages are split into individual chunks, and metadata such as file ID and name are attached.
- The Recursive Character Text Splitter divides the content into manageable segments with overlap.
- OpenAI Embeddings (text-embedding-3-large) transform each text chunk into a semantic vector.
- The Pinecone Vector Store stores these vectors in the configured index: for new documents, embeddings are inserted into a namespace based on the file name; for updated documents, the namespace is cleared first, then re-ingested with fresh embeddings.
This process builds a scalable and queryable knowledge base.

D. Knowledge Base Q&A Interface
The workflow also provides an interactive form-based user interface:
- **Form Trigger** → collects employee questions.
- **LangChain AI Agent** → receives the question, retrieves relevant context from Pinecone using vector similarity search, and processes the response using the OpenAI Chat Model (gpt-4.1-mini).
- **Answer Formatting** → responses are returned in HTML for readability, a custom CSS theme ensures a modern, user-friendly design, and answers may include references to page numbers when available.
This creates a self-service knowledge base assistant that employees can query in natural language.
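As referenced in section B, the insert/update/skip decision can be expressed as a small piece of logic. The following sketch is illustrative only: the template implements this with Google Sheets and IF nodes, and the field names here simply mirror the Record Manager columns described above.

```javascript
// Illustrative only: the insert/update/skip decision the Record Manager makes for each file.
// `records` stands for the rows already stored in Google Sheets; `file` for the freshly
// downloaded Drive file together with its SHA256 hash (names are assumptions, not node names).
function decideAction(records, file) {
  const existing = records.find(r => r.Id === file.id);
  if (!existing) return 'insert';                   // New Document → add to the Record Manager and ingest
  if (existing.hashId === file.hash) return 'skip'; // Already Processed → nothing to do
  return 'update';                                  // Updated Document → refresh the record, clear the namespace, re-ingest
}
```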
4. Technologies Used

- **n8n** → orchestration of the entire workflow.
- **Google Drive API** → file monitoring, listing, and downloading.
- **Google Sheets API** → record manager for tracking file states.
- **OpenAI API** → text-embedding-3-large for semantic vector creation; gpt-4.1-mini for conversational Q&A.
- **Pinecone** → vector database for embedding storage and retrieval.
- **LangChain** → document loaders, text splitters, vector store connectors, and agent logic.
- **Crypto (SHA256)** → file hash generation for change detection.
- **Form Trigger + Form Node** → employee-facing Q&A submission and answer display.
- **Custom CSS** → provides a modern, responsive, styled UI for the knowledge base.

5. End-to-End Data Flow

1. Employee uploads or updates a document → Google Drive detects the change.
2. Workflow downloads and hashes the file → ensures uniqueness and detects modifications.
3. Record Manager (Google Sheets) → decides whether to skip, insert, or update the record.
4. Document Processing → splitting + embedding + storing into Pinecone.
5. Knowledge Base Updated → the latest version of documents is indexed.
6. Employee asks a question via the web form.
7. AI Agent retrieves embeddings from Pinecone and uses GPT-4.1-mini → generates a contextual answer.
8. Answer displayed in styled HTML → delivered back to the employee through the form interface.

6. Benefits

- **Always Up-to-Date** → automatically syncs documents when uploaded or changed.
- **No Duplicates** → smart hashing ensures only relevant updates are reprocessed.
- **Searchable Knowledge Base** → employees can query documents semantically, not just by keywords.
- **Enhanced Productivity** → answers are immediate, reducing time spent browsing manuals.
- **Scalable** → new documents and users can be added without workflow redesign.

✅ In summary, IngestionDocs is a robust AI-driven document ingestion and retrieval system that integrates Google Drive, Google Sheets, OpenAI, and Pinecone within n8n. It continuously builds and maintains a knowledge base of manuals while offering employees an intelligent, user-friendly Q&A assistant for fast and accurate knowledge retrieval.
by SpaGreen Creative
Bulk WhatsApp Campaign Automation with Rapiwa API (Unofficial Integration)

Who's it for
This n8n workflow lets you send bulk WhatsApp messages using your own number through the Rapiwa API, avoiding the high cost and limitations of the official WhatsApp API. It integrates seamlessly with Google Sheets, where you can manage your contacts and messages with ease. It is ideal for anyone who wants an easy-to-maintain bulk messaging solution built on their own personal or business WhatsApp number: small businesses, marketers, or teams looking for a cost-effective way to manage WhatsApp communication at scale.

How it Works / What It Does
1. Reads data from a Google Sheet where the Status column is marked as "pending".
2. Cleans each phone number (removes special characters, spaces, etc.); a minimal cleaning sketch appears at the end of this listing.
3. Verifies whether the number is a valid WhatsApp user using the Rapiwa API.
4. If valid: sends the message via Rapiwa and updates Status = sent and Verification = verified.
5. If invalid: skips sending and updates Status = not sent and Verification = unverified.
6. Waits for a few seconds (rate limiting), then loops through the next item.
The entire process is triggered automatically every 5 minutes.

How to Set Up
1. Duplicate the sample sheet: use the linked format.
2. Fill contacts: add columns like WhatsApp No, Name, Message, Image URL, and set Status = pending.
3. Connect Google Sheets: authenticate and link the Google Sheets node inside n8n.
4. Subscribe to Rapiwa: go to Rapiwa.com and get your API key.
5. Paste the API key: use the HTTP Bearer token credential in n8n.
6. Activate the workflow: let n8n take care of the automation.

Requirements
- Google Sheets API credentials
- Configured Google Sheet (template linked above)
- WhatsApp (personal or business)
- n8n instance with credentials set up

How to Customize the Workflow
- **Add delay between messages**: use the Wait node to introduce pauses (e.g., 5–10 seconds).
- **Change message format**: modify the HTTP Request node to send media or templates.
- **Personalize content**: include dynamic fields like Name, Image URL, etc.
- **Error handling**: add IF or SET nodes to capture failed attempts, retry, or log errors.

Workflow Highlights
- **Triggered every 5 minutes** using the Schedule Trigger node.
- **Filters messages** with Status = pending.
- **Cleans numbers and verifies WhatsApp existence** before sending.
- **Sends WhatsApp messages** via Rapiwa (unofficial API).
- **Updates Google Sheets** to mark Status = sent or not sent and Verification = verified/unverified.
- **Wait node** prevents rapid-fire sending that could lead to being flagged by WhatsApp.

Setup in n8n

1. Connect Google Sheets
Add a Google Sheets node, authenticate using your Google account, select the document and worksheet, and use the filter Status = pending.

2. Loop Through Rows
Use SplitInBatches or a Code node to process rows in small chunks (e.g., 5 rows), and add a Wait node to delay 5 seconds between messages.

3. Send Message via HTTP Node
The "Send Message Using Rapiwa" node makes an HTTP POST request to the Rapiwa API endpoint https://app.rapiwa.com/api/send-message, using Bearer Token authentication with your Rapiwa API key. When this node runs, it sends a WhatsApp message to the specified number with the given text and optional image. The Rapiwa API handles message delivery using your own WhatsApp number connected to their service.
**JSON Body**: { "number": "{{ $json['WhatsApp No'] }}", "message": "{{ $json['Message'] }}" }

Sample Google Sheet Structure
A Google Sheet formatted like this sample:

| SL | WhatsApp No   | Name                | Message              | Image URL                                            | Verification | Status |
|----|---------------|---------------------|----------------------|------------------------------------------------------|--------------|--------|
| 1  | 8801322827799 | SpaGreen Creative   | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |
| 2  | 8801725402187 | Abdul Mannan Zinnat | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |

Tips
- Modify the Limit node to increase/decrease messages per cycle.
- Adjust the Wait node to control how fast messages are sent (e.g., 5–10 s delay).
- Make sure WhatsApp numbers are properly formatted (e.g., 8801XXXXXXXXX, no +, no spaces).
- Store your Rapiwa API key securely using n8n credentials.
- Use publicly accessible image URLs if sending images.
- Always mark processed messages as "sent" to avoid duplicates.
- Use the Error workflow in n8n to catch failed sends for retry.
- Test with a small batch before going full-scale.
- Schedule the Trigger node for every 5 minutes to keep the automation running.

Useful Links
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

Support & Community
Need help setting up or customizing the workflow? Reach out here:
- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
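For the number-cleaning step mentioned in "How it Works", a minimal n8n Code node sketch might look like the following. The column name `WhatsApp No` matches the sample sheet; the cleaning rule is an assumption based on the tip about formatting numbers as 8801XXXXXXXXX with no + or spaces.

```javascript
// Illustrative Code node (Run Once for All Items): strip everything except digits
// so numbers match the 8801XXXXXXXXX format expected by Rapiwa.
return $input.all().map(item => {
  const raw = String(item.json['WhatsApp No'] ?? '');
  const cleaned = raw.replace(/\D/g, ''); // removes +, spaces, dashes, and other non-digits
  return { json: { ...item.json, 'WhatsApp No': cleaned } };
});
```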
by Robert Breen
This n8n workflow template automatically monitors your Google Sheets for new entries and uses AI to generate detailed descriptions for each topic. Perfect for content creators, researchers, project managers, or anyone who needs automatic content generation based on simple topic inputs.

What This Workflow Does
This automated workflow:
1. Monitors a Google Sheet for new rows added to the "data" tab.
2. Takes the topic from each new row.
3. Uses OpenAI GPT to generate a detailed description of that topic.
4. Updates the same row with the AI-generated description.
5. Logs all activity in a separate "actions" tab for tracking.
The workflow runs every minute, checking for new entries and processing them automatically.

Tools & Services Used
- **n8n** - workflow automation platform
- **OpenAI API** - AI-powered description generation (GPT-4.1-mini)
- **Google Sheets** - data input, storage, and activity logging
- **Google Sheets Trigger** - real-time monitoring for new rows

Prerequisites
Before implementing this workflow, you'll need:
- n8n instance - self-hosted or cloud version
- OpenAI API account - for AI description generation
- Google account - for Google Sheets integration
- Google Sheets API access - for both reading and writing to sheets

Step-by-Step Setup Instructions

Step 1: Set Up OpenAI API Access
1. Visit OpenAI's API platform.
2. Create an account or log in.
3. Navigate to the API Keys section.
4. Generate a new API key.
5. Copy and securely store your API key.

Step 2: Set Up Your Google Sheets
Option 1: Use Our Pre-Made Template (Recommended)
1. Copy our template: AI Description Generator Template.
2. Click "File" → "Make a copy" to create your own version.
3. Rename it as desired (e.g., "My AI Content Generator").
4. Note your new sheet's URL - you'll need this for the workflow.
Option 2: Create From Scratch
1. Go to Google Sheets and create a new spreadsheet.
2. Set up the main "data" tab: rename "Sheet1" to "data" and add column headers in row 1 (A1: topic, B1: description).
3. Create an "actions" tab: add a new sheet named "actions" with the column header A1: Update.
4. Copy your sheet's URL.

Step 3: Configure Google API Access
Enable Google Sheets API
1. Go to Google Cloud Console.
2. Create a new project or select an existing one.
3. Enable "Google Sheets API".
4. Enable "Google Drive API".
Create Service Account (for n8n)
1. In Google Cloud Console, go to "IAM & Admin" → "Service Accounts".
2. Create a new service account.
3. Download the JSON credentials file.
4. Share your Google Sheet with the service account email address.

Step 4: Import and Configure the n8n Workflow
Import the Workflow
1. Copy the workflow JSON from the template.
2. In your n8n instance, go to Workflows → Import from JSON.
3. Paste the JSON and import.
Configure OpenAI Credentials
1. Click on the "OpenAI Chat Model" node.
2. Set up credentials using your OpenAI API key.
3. Test the connection to ensure it works.
Configure Google Sheets Integration
For the Trigger Node:
1. Click on the "Row added - Google Sheet" node.
2. Set up Google Sheets Trigger OAuth2 credentials.
3. Select your spreadsheet from the dropdown and choose the "data" sheet.
4. Set polling to "Every Minute" (already configured).
For the Update Node:
1. Click on the "Update row in sheet" node.
2. Use the same Google Sheets credentials.
3. Select your spreadsheet and "data" sheet.
4. Verify the column mapping (topic → topic, description → AI output).
For the Actions Log Node:
1. Click on the "Append row in sheet" node.
2. Use the same Google Sheets credentials.
3. Select your spreadsheet and "actions" sheet.

Step 5: Customize the AI Description Generator
The workflow uses a simple prompt that can be customized:
1. Click on the "Description Writer" node.
2. Modify the system message to change the AI behavior. The default prompt is:
write a description of the topic. output like this.
{ "description": "description" }

Need Help with Implementation?
For professional setup, customization, or troubleshooting of this workflow, contact:

Robert - Ynteractive Solutions
- **Email**: robert@ynteractive.com
- **Website**: www.ynteractive.com
- **LinkedIn**: linkedin.com/in/robert-breen-29429625/

Specializing in AI-powered workflow automation, business process optimization, and custom integration solutions.
by Mantaka Mahir
Automate Google Classroom: Topics, Assignments & Student Tracking

Automate Google Classroom via the Google Classroom API to efficiently manage courses, topics, teachers, students, announcements, and coursework.

Use Cases
- Educational Institution Management: sync rosters, post weekly announcements, and generate submission reports automatically.
- Remote Learning Coordination: batch-create assignments, track engagement, and auto-notify teachers on new submissions.
- Training Program Automation: automate training modules, manage enrollments, and generate completion/compliance reports.

Prerequisites
- n8n (cloud or self-hosted)
- Google Cloud Console access for OAuth setup
- Google Classroom API enabled
- **Google Gemini API key** (free) for the agent brain, or swap in any other LLM if preferred

Setup Instructions

Step 1: Google Cloud Project
1. Create a new project in Google Cloud Console.
2. Enable the Google Classroom API.
3. Create OAuth 2.0 Client ID credentials.
4. Add your n8n OAuth callback URL as a redirect URI.
5. Note down the Client ID and Client Secret.

Step 2: OAuth Setup in n8n
1. In n8n, open HTTP Request Node → Authentication → Predefined Credential Type.
2. Select Google OAuth2 API.
3. Enter your Client ID and Client Secret.
4. Click Connect my account to complete authorization.
5. Test the connection.

Step 3: Import & Configure Workflow
1. Import this workflow template into n8n.
2. Link all Google Classroom nodes to your OAuth credential.
3. Configure the webhook if using external triggers.
4. Test each agent for API connectivity.

Step 4: Customization
You can customize each agent's prompt to your liking for optimal results, or copy and modify node code to expand functionality. All operations use HTTP Request nodes, so you can integrate more tools via the Google Classroom API documentation. This workflow provides a strong starting point for deeper automation and integration.

Features
- Course Topics: list, create, update, or delete topics within a course.
- Teacher & Student Management: list, retrieve, and manage teachers and students programmatically.
- Course Posts: list posts, retrieve details and attachments, and access submission data.
- Announcements: list, create, update, or delete announcements across courses.
- Courses: list all courses, get detailed information, and view grading periods.
- Coursework: list, retrieve, or analyze coursework within any course.

Notes
Once OAuth and the LLM connection are configured, this workflow automates all Google Classroom operations. Its modular structure lets you activate only what you need, saving API quota and improving performance.
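Because every operation goes through HTTP Request nodes, each feature maps to a Classroom REST call. As a rough sketch of how the "Course Topics" operations might be configured (COURSE_ID is a placeholder; verify the exact endpoints and payloads against the Google Classroom API documentation):

```javascript
// Illustrative HTTP Request node settings for the "Course Topics" feature (courses.topics):
//   List topics:   GET  https://classroom.googleapis.com/v1/courses/{{ $json.courseId }}/topics
//   Create topic:  POST https://classroom.googleapis.com/v1/courses/{{ $json.courseId }}/topics
//                  with a JSON body such as: { "name": "Week 1 - Introduction" }
//   Auth: the Google OAuth2 credential configured in Step 2
```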
by Margo Rey
Generate and send MadKudu Account Brief into Outreach

This workflow generates an account brief tailored to your company using MadKudu MCP and OpenAI and syncs it to a custom field in Outreach. It's for sales teams that want to give reps rich account context right inside Outreach, and to draft Outreach emails with the Outreach Revenue Agent based on the MadKudu account brief.

✨ Who it's for
- RevOps or GTM teams using MadKudu + Salesforce + Outreach
- Sales teams needing dynamic, AI-generated context for target accounts

🔧 How it works
1. Select Accounts: use a Salesforce node to define which accounts to brief. Filter logic can be updated to match ICP or scoring rules (e.g., MadKudu Fit + LTB).
2. Generate Brief with MadKudu MCP & AI: MadKudu MCP provides the account brief instructions, researches online for recent company news, and provides structured account context from the integrations connected to MadKudu plus external signals (firmographics, past opportunities, active contacts, job openings, etc.). The AI agent (OpenAI model) turns this into a readable account brief.
3. Send to Outreach: match the account in Outreach via domain, then update a custom field (e.g., custom49) with the brief text.

📋 How to set up
1. Connect your Salesforce account: used to pull accounts that need a brief.
2. Set your OpenAI credentials: required for the AI Agent to generate the brief.
3. Create an n8n Variable named madkudu_api_key to store your MadKudu API key, used for the MadKudu MCP tool. The AI Agent pulls the account brief instructions and all the context necessary to generate the briefs.
4. Create an OAuth2 API credential to connect your Outreach account: used to sync the brief to Outreach.
5. Customize the Salesforce filter: in the "Get accounts" node, define which accounts should get a brief (e.g., Fit > 90).
6. Map your Outreach custom field: update the JSON body of the request with your actual custom field ID (e.g., custom49); see the sketch at the end of this listing.

🔑 How to connect Outreach
1. In n8n, add a new OAuth2 API credential and copy the callback URL.
2. Go to the Outreach developer portal.
3. Click "Add" to create a new app.
4. In Feature selection, add Outreach API (OAuth).
5. In API Access (OAuth), set the redirect URI to the n8n callback.
6. Select the following scopes: accounts.read, accounts.write.
7. Save in Outreach.
8. Enter the Outreach Application ID into the n8n Client ID and the Outreach Application Secret into the n8n Client Secret.
9. Save in n8n and connect your Outreach account via OAuth.

✅ Requirements
- MadKudu account with access to an API key
- Salesforce OAuth
- Outreach admin permissions to create an app
- OpenAI API key

🛠 How to customize the workflow
- **Change the targeting logic**: edit the Salesforce filter to control which accounts are eligible.
- **Rewrite the prompt**: tweak the prompt in the AI Agent node to adjust the format, tone, or insights included in the brief.
- **Change the Outreach account field**: update the Outreach field where the brief is synced if you're using a different custom field (e.g., custom48, custom32, etc.).
- **Use a different trigger**: swap the manual trigger for a Schedule or Webhook trigger to automate the flow end-to-end.
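For the custom-field mapping step above, the Outreach API follows the JSON:API convention, so the request body for updating an account's custom field might look roughly like the sketch below (sent as a PATCH to the matched account, e.g. https://api.outreach.io/api/v2/accounts/{id}). The account id, field name custom49, and brief text are placeholders; check the exact payload against the Outreach API docs and the template's HTTP Request node.

```json
{
  "data": {
    "type": "account",
    "id": 12345,
    "attributes": {
      "custom49": "AI-generated MadKudu account brief text goes here"
    }
  }
}
```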
by Muhammad Asadullah
Short Description (for listing)
Import products from Google Sheets to Shopify with automatic handling of single products and multi-variant products (sizes, colors, etc.). Includes SKU management, inventory tracking, and image uploads via the GraphQL API.

Category
E-commerce, Productivity, Data Import/Export

Full Description

Overview
This workflow automates the process of importing products from a Google Sheet into your Shopify store. It intelligently detects and handles both simple products and products with multiple variants (like different sizes or colors), creating them with proper SKU management, pricing, inventory levels, and images.

Key Features
✅ Dual Product Support: handles single products and multi-variant products automatically
✅ Smart SKU Parsing: automatically groups variants by parsing the SKU format (e.g., 12345-SM, 12345-MD)
✅ Inventory Management: sets stock levels for each variant at your default location
✅ Image Upload: attaches product images from URLs
✅ GraphQL API: uses Shopify's modern GraphQL API for reliable product creation
✅ Batch Processing: process multiple products in one workflow run

Use Cases
- Initial store setup with bulk product import
- Regular inventory updates from a spreadsheet
- Migrating products from another platform
- Managing seasonal product catalogs
- Synchronizing products with external systems

Requirements
- Shopify store with Admin API access
- Google Sheets API credentials
- n8n version 1.0+
- Basic understanding of GraphQL (helpful but not required)

What You'll Need to Configure
- Shopify Admin API token
- Your Shopify store URL (in the 'set store url' node)
- Google Sheets connection
- (Optional) Vendor name and product type defaults

Input Format
Your Google Sheet should contain the columns: Product Name, SKU (format: BASESKU-VARIANT for variants), Size (or another variant option), Price, On hand Inventory, Product Image (URL). Products with the same name are automatically grouped as variants; a minimal SKU-grouping sketch appears at the end of this listing.

How It Works
1. Reads product data from your Google Sheet.
2. Groups products by name and detects whether they have variants.
3. Switches to the appropriate creation path (single or variant).
4. Creates the product in Shopify with options and variants.
5. Updates each variant with SKU and pricing.
6. Sets inventory levels at your location.
7. Uploads product images.

Technical Details
- Uses the Shopify GraphQL Admin API (2025-04)
- Handles up to 100 variants per product
- Processes variants individually for accurate data mapping
- Includes error handling for missing data
- Supports one inventory location per run

Common Modifications
- Change vendor name and product type
- Add more variant options (color, material, etc.)
- Customize product status (draft vs active)
- Modify inventory location selection
- Add product descriptions

Perfect For
- Shopify store owners managing large catalogs
- E-commerce managers doing bulk imports
- Agencies setting up client stores
- Developers building automated product workflows

Difficulty: Intermediate
Estimated Setup Time: 15-30 minutes
Nodes Used: 16
External Services: Shopify, Google Sheets
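The grouping and SKU parsing described above boil down to collecting rows that share a Product Name and splitting each SKU on its last hyphen into base and variant. A minimal Code node sketch, assuming the column names from the input format (the template's own node may differ):

```javascript
// Illustrative only: group sheet rows by Product Name and split each SKU into base + variant
// (e.g., "12345-SM" -> base "12345", variant "SM").
const rows = $input.all().map(i => i.json);
const products = {};

for (const row of rows) {
  const sku = String(row['SKU'] ?? '');
  const dash = sku.lastIndexOf('-');
  const variantCode = dash > 0 ? sku.slice(dash + 1) : null;

  const key = row['Product Name'];
  products[key] ??= { title: key, variants: [] };
  products[key].variants.push({
    sku,
    option: row['Size'] ?? variantCode,
    price: row['Price'],
    quantity: row['On hand Inventory'],
    imageUrl: row['Product Image'],
  });
}

// One item per product; a product with a single variant follows the "single product" path
return Object.values(products).map(p => ({ json: p }));
```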
by Robert Breen
This n8n workflow template creates an efficient data analysis system that uses Google Gemini AI to interpret user questions about spreadsheet data and processes them through a specialized sub-workflow for optimized token usage and faster responses.

What This Workflow Does
- **Smart Query Parsing**: uses Gemini AI to understand natural language questions about your data
- **Efficient Processing**: routes calculations through a dedicated sub-workflow to minimize token consumption
- **Structured Output**: automatically identifies the column, aggregation type, and grouping levels from user queries
- **Multiple Aggregation Types**: supports sum, average, count, count distinct, min, and max operations
- **Flexible Grouping**: can aggregate data by single or multiple dimensions
- **Token Optimization**: processes large datasets without overwhelming AI context limits

Tools Used
- **Google Gemini Chat Model** - natural language query understanding and response formatting
- **Google Sheets Tool** - data access and column metadata extraction
- **Execute Workflow** - sub-workflow processing for data calculations
- **Structured Output Parser** - converts AI responses to actionable parameters
- **Memory Buffer Window** - basic conversation context management
- **Switch Node** - routes to the appropriate aggregation method
- **Summarize Nodes** - perform various data aggregations

📋 MAIN WORKFLOW - Query Parser

What This Workflow Does
The main workflow receives natural language questions from users and converts them into structured parameters that the sub-workflow can process. It uses Google Gemini AI to understand the intent and extract the necessary information.

Prerequisites for Main Workflow
- Google Cloud Platform account with Gemini API access
- Google account with access to Google Sheets
- n8n instance (cloud or self-hosted)

Main Workflow Setup Instructions

1. Import the Main Workflow
- Copy the main workflow JSON provided.
- In your n8n instance, go to Workflows → Import from JSON.
- Paste the JSON and click Import.
- Save with the name "Gemini Data Query Parser".

2. Set Up Google Gemini Connection
- Go to Google AI Studio and sign in with your Google account.
- Go to the Get API Key section.
- Create a new API key or use an existing one, then copy it.
Configure in n8n:
- Click on the Google Gemini Chat Model node.
- Click Create New Credential and select Google PaLM API.
- Paste your API key and save the credential.

3. Set Up Google Sheets Connection for Main Workflow
- Go to Google Cloud Console.
- Create a new project or select an existing one.
- Enable the Google Sheets API.
- Create OAuth 2.0 Client ID credentials.
- In n8n, click on the Get Column Info node, create a Google Sheets OAuth2 API credential, and complete the OAuth flow.

4. Configure Your Data Source
Option A: Use Sample Data
- The workflow is pre-configured for: Sample Marketing Data.
- Make a copy to your Google Drive.
Option B: Use Your Own Sheet
- Update the Get Column Info node with your Sheet ID.
- Ensure you have a "Columns" sheet for metadata.
- Update sheet references as needed.

5. Set Up Workflow Trigger
- Configure how you want to trigger this workflow (webhook, manual, etc.).
- The workflow will output structured JSON for the sub-workflow.

⚙️ SUB-WORKFLOW - Data Processor

What This Workflow Does
The sub-workflow receives structured parameters from the main workflow and performs the actual data calculations. It handles fetching data, routing to the appropriate aggregation method, and formatting results.

Sub-Workflow Setup Instructions
1. Import the Sub-Workflow
- Create a new workflow in n8n.
- Copy the sub-workflow JSON (embedded in the Execute Workflow node) and import it as a separate workflow.
- Save with the name "Data Processing Sub-Workflow".

2. Configure Google Sheets Connection for Sub-Workflow
- Apply the same Google Sheets OAuth2 credential you created for the main workflow.
- Update the Get Data node with your Sheet ID.
- Ensure it points to your data sheet (e.g., the "Data" sheet).

3. Configure Google Gemini for Output Formatting
- Apply the same Gemini API credential to the Google Gemini Chat Model1 node.
- This handles final result formatting.

4. Link Workflows Together
- In the main workflow, find the Execute Workflow - Summarize Data node.
- Update the workflow reference to point to your sub-workflow.
- Ensure the sub-workflow is set to accept execution from other workflows.

Sub-Workflow Components
- **When Executed by Another Workflow**: trigger that receives parameters
- **Get Data**: fetches all data from Google Sheets
- **Type of Aggregation**: Switch node that routes based on aggregation type
- **Multiple Summarize Nodes**: handle different aggregation types (sum, avg, count, etc.)
- **Bring All Data Together**: combines results from different aggregation paths
- **Write into Table Output**: formats final results using Gemini AI

Example Usage
Once both workflows are set up, you can ask questions like:
- Overall metrics: "Show total Spend ($)", "Show total Clicks", "Show average Conversions"
- Single dimension: "Show total Spend ($) by Channel", "Show total Clicks by Campaign"
- Two dimensions: "Show total Spend ($) by Channel and Campaign", "Show average Clicks by Channel and Campaign"

Data Flow Between Workflows
- Main workflow: user question → Gemini AI → structured JSON output
- Sub-workflow: receives JSON → fetches data → performs calculations → returns formatted table

Contact Information
For support, customization, or questions about this template:
- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing these workflows, want to remove limitations, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
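To illustrate the hand-off described in "Data Flow Between Workflows", the structured JSON the main workflow passes to the sub-workflow for a question like "Show total Spend ($) by Channel" might look roughly like the sketch below. The field names are illustrative; the actual schema is whatever the Structured Output Parser in the template defines.

```json
{
  "column": "Spend ($)",
  "aggregation": "sum",
  "groupBy": ["Channel"]
}
```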
by Robert Breen
This n8n workflow template creates an intelligent data analysis system that converts natural language questions into Google Sheets SQL queries using OpenAI's GPT-4o model. The system generates proper Google Sheets query URLs and executes them via HTTP requests for efficient data retrieval.

What This Workflow Does
- **Natural Language to SQL**: converts user questions into Google Sheets SQL syntax
- **Direct HTTP Queries**: bypasses API limits by using Google Sheets' built-in query functionality
- **Column Letter Mapping**: automatically maps column names to their corresponding letters (A, B, C, etc.)
- **Structured Query Generation**: outputs properly formatted Google Sheets query URLs
- **Real-time Data Access**: retrieves live data directly from Google Sheets
- **Memory Management**: maintains conversation context for follow-up questions

Tools Used
- **OpenAI Chat Model (GPT-4o)** - SQL query generation and natural language understanding
- **OpenAI Chat Model (GPT-4.1 Mini)** - result formatting and table output
- **Google Sheets Tool** - column metadata extraction and schema understanding
- **HTTP Request Node** - direct data retrieval via the Google Sheets query API
- **Structured Output Parser** - formats AI responses into executable queries
- **Memory Buffer Window** - conversation history management
- **Chat Trigger** - webhook-based conversation interface

Step-by-Step Setup Instructions

1. Prerequisites
Before starting, ensure you have:
- An n8n instance (cloud or self-hosted)
- An OpenAI account with API access and billing set up
- A Google account with access to Google Sheets
- A target Google Sheet that is publicly accessible or shareable via link

2. Import the Workflow
- Copy the workflow JSON provided.
- In your n8n instance, go to Workflows → Import from JSON.
- Paste the JSON and click Import.
- Save with a descriptive name like "Google Sheets SQL Query Generator".

3. Set Up OpenAI Connections
Get API Key:
- Go to OpenAI Platform and sign in or create an account.
- Navigate to the API Keys section and click Create new secret key.
- Copy the generated API key.
- Important: add billing information and credits to your OpenAI account.
Configure Both OpenAI Nodes:
- OpenAI Chat Model1 (GPT-4o): click on the node, click Create New Credential, select OpenAI API, paste your API key, and save the credential.
- OpenAI Chat Model2 (GPT-4.1 Mini): apply the same OpenAI API credential; this handles result formatting.

4. Set Up Google Sheets Connection
Create OAuth2 Credentials:
- Go to Google Cloud Console and create a new project or select an existing one.
- Enable the Google Sheets API.
- Go to Credentials → Create Credentials → OAuth 2.0 Client IDs.
- Set the application type to Web Application.
- Add authorized redirect URIs (get this from the n8n credentials setup).
- Copy the Client ID and Client Secret.
Configure in n8n:
- Click on the Get Column Info2 node and click Create New Credential.
- Select Google Sheets OAuth2 API and enter your Client ID and Client Secret.
- Complete the OAuth flow by clicking Connect my account and authorize the required permissions.

5. Prepare Your Google Sheet
Option A: Use the Sample Data Sheet
- Access the pre-configured sheet: Sample Marketing Data.
- Make a copy to your Google Drive.
- Critical: set sharing to "Anyone with the link can view" for HTTP access.
- Copy the Sheet ID from the URL.
- Update the Get Column Info2 node with your Sheet ID and column metadata sheet.
6. Configure Sheet References
Get Column Info2 Node:
- Set Document ID to your Google Sheet ID.
- Set Sheet Name to your columns metadata sheet (e.g., "Columns").
- This provides the AI with column letter mappings.
HTTP Request Node:
- No configuration needed; it uses dynamic URLs from the AI agent.
- Ensure your sheet has proper sharing permissions.

7. Update System Prompt (If Using a Custom Sheet)
If using your own Google Sheet, update the system prompt in the AI Agent3 node:
- Replace the URL in the system message with your Google Sheet URL.
- Update the GID (sheet ID) to match your data sheet.
- Keep the same query structure format; a sketch of the URL format appears at the end of this listing.

Contact Information
Robert - Ynteractive
For support, customization, or questions about this template:
- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing this workflow, want to add security features, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
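For reference, query URLs of this kind commonly use Google Sheets' visualization (gviz) endpoint, which accepts Google Query Language in the tq parameter. A sketch of what a generated URL might look like (SHEET_ID, gid, and the column letters are placeholders for your own sheet, and the tq value should be URL-encoded in practice):

```
https://docs.google.com/spreadsheets/d/SHEET_ID/gviz/tq?tqx=out:csv&gid=0&tq=SELECT B, SUM(D) WHERE B IS NOT NULL GROUP BY B
```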
by Yatharth Chauhan
Feedback Sentiment Workflow (Typeform → GCP → Notion/Slack/Trello)

This template ingests feedback from Typeform, runs Google Cloud Natural Language sentiment analysis, routes based on sentiment, and then creates a Notion database page and posts a Slack notification for positive items, or creates a Trello card for negative items. The flow is designed for quick setup and safe sharing, using placeholders for IDs and credentials.

How it Works
1. Typeform Trigger: captures each new submission and exposes answers like Name and the long-text Feedback field.
2. Google Cloud Natural Language: analyzes the feedback text and returns a sentiment score in documentSentiment.score.
3. Check Sentiment Score (IF): the true branch handles score > 0 (positive); the false branch handles score ≤ 0 (non-positive).
4. Add Feedback to Notion (true branch): creates a new page in a Notion database with mapped properties.
5. Notify Slack (after Notion): posts the feedback, author, and score to a Slack channel for visibility.
6. Create Trello Card (false branch): logs non-positive items to a Trello list for follow-up.

Required Accounts
- **Google Cloud Natural Language API** enabled (OAuth2 or service credentials)
- **Notion integration** with database access to create pages
- **Slack app/bot token** with permission to post to the target channel
- **Typeform account** with a form including a Long Text feedback question and a Name field

Notion Database Columns
- **Name (title):** person name or responder label
- **Feedback (rich_text):** full feedback text
- **Sentiment Score (number):** numeric score from GCP ∈ [-1, 1]
- **Source (select/text):** "Typeform" for provenance
- **Submitted At (date):** timestamp from the trigger

Customization Options
- **Sentiment Threshold:** adjust the IF condition (e.g., ≥ 0.25) for stricter positivity.
- **Slack Routing:** change the channel, add blocks/attachments for richer summaries.
- **Trello Path:** point to a triage list and include labels for priority.
- **Field Mapping:** update the expression for the feedback question to match the Typeform label.
- **Database Schema:** add tags, product area, or customer tier for reporting.

Setup Steps
1. Connect credentials: Typeform, GCP Natural Language, Notion, Slack, Trello.
2. Replace placeholders in the workflow JSON: Form ID, Database ID, Slack Channel, Trello List ID.
3. Map fields: set the Feedback and Name expressions from the Typeform Trigger output into Notion and Slack.
4. Adjust the IF threshold for your definition of "positive".
5. Test with a sample response and confirm Notion page creation, the Slack notification, and Trello card logging.
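The sentiment routing comes down to a single comparison on the Natural Language response. An IF node condition sketch, using the 0 threshold from "How it Works" (swap in 0.25 for the stricter variant):

```javascript
// IF node condition (boolean): route to the positive branch when the score exceeds the threshold
{{ $json.documentSentiment.score > 0 }}
```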
by Sayone Technologies
⭐ Google Review Sentiment Analysis & Slack Notification Workflow

This workflow automates the process of collecting Google Business Profile reviews 🏪, analyzing customer sentiment with Google Gemini 🤖✨, and sending structured reports to Slack 💬.

🔑 Key Advantages
📥 Fetches Google Business Profile reviews for a given business and time period
🧠 Runs sentiment analysis using Gemini AI
📊 Consolidates comments, ratings, and trends into a JSON-based summary
🧩 Restructures results into Slack Block Kit format for easy readability
🚀 Sends automated sentiment reports directly to a Slack channel

⚙️ Set Up Essentials You'll Need
🔑 Google Business Profile API access with project approval
✅ Enabled Google Business Profile API service
🔐 Gemini API credentials
💬 Slack workspace & channel for receiving reports

🚀 How to Get Started
🔧 Configure your Google Business Profile API and enable access
👤 Set the owner name and 📍 location to fetch reviews
⏳ Define the review time period using the Set Time Period node
🔗 Connect your Slack account and select a channel for notifications
🕒 Deploy and let the workflow run on a schedule for automated insights
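To illustrate the Block Kit restructuring step, a minimal Slack message payload for a sentiment report might look like the sketch below. The header text and field values are placeholders, not the template's exact output.

```json
{
  "blocks": [
    {
      "type": "header",
      "text": { "type": "plain_text", "text": "Weekly Review Sentiment Report" }
    },
    {
      "type": "section",
      "text": {
        "type": "mrkdwn",
        "text": "*Average rating:* 4.3 ⭐\n*Overall sentiment:* Positive\n*Reviews analyzed:* 27"
      }
    }
  ]
}
```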