by Juan Carlos Cavero Gracia
Description

This automation template is designed for content creators, social media managers, and influencers who want to streamline their video publishing workflow. It automatically detects new videos uploaded to a specific Google Drive folder, generates AI-powered descriptions based on the video's audio content, and simultaneously publishes them across Instagram, TikTok, and YouTube while tracking everything in Airtable.

> Note: This workflow uses the upload-post.com API (free trial, no credit card required) for multi-platform video distribution and requires API tokens for each service. The AI-generated descriptions are created using OpenAI's transcription and chat models to analyze the video's audio content.

Who Is This For?

- **Content Creators & Influencers:** Automatically publish your videos across all major social platforms without manual work.
- **Social Media Managers:** Maintain consistent posting schedules across multiple platforms with AI-generated, platform-optimized descriptions.
- **Marketing Teams:** Scale video content distribution with automated workflows that include tracking and status monitoring.
- **Video Producers:** Focus on creating content while the system handles the tedious task of multi-platform publishing and description generation.

What Problem Does This Workflow Solve?

Publishing the same video content across Instagram, TikTok, and YouTube is time-consuming and repetitive. You need to manually upload each video, write unique descriptions, and track publication status. This workflow addresses these challenges by:

- **Automated Video Distribution:** Detects new videos in Google Drive and automatically uploads them to all three platforms simultaneously.
- **AI-Powered Content Generation:** Uses OpenAI to transcribe video audio and generate engaging, platform-appropriate descriptions automatically.
- **Centralized Tracking:** Maintains detailed records in Airtable, including upload status, URLs, and metadata for each platform.
- **Error Monitoring:** Provides real-time error notifications via Telegram so you're always aware of any issues.

How It Works

1. Video Upload Detection: The workflow monitors a specific Google Drive folder for new video uploads using automated triggers.
2. Content Analysis: Downloads the video, extracts audio, and uses OpenAI to transcribe and generate compelling descriptions.
3. Airtable Integration: Creates and updates records to track video metadata, descriptions, and publication status.
4. Multi-Platform Publishing: Simultaneously uploads the video to Instagram, TikTok, and YouTube using the upload-post.com API.
5. Status Tracking: Updates Airtable records with publication status and platform-specific URLs for each successful upload.
Setup

1. Google Drive Configuration: Set up the Google Drive trigger to monitor your specific folder and configure OAuth2 credentials for Google Drive access.
2. OpenAI Integration: Add your OpenAI API key to enable audio transcription and description generation.
3. Airtable Setup: Create an Airtable base with fields for Video Name, Description, Platform Status, URLs, and Upload Date. Add your Airtable API token and configure the base/table IDs in the "Set Variables" node.
4. Upload-Post.com Account: Create an account at upload-post.com to get your API token, configure the token in the HTTP request nodes for each platform, and set your user ID in the variables section (see the request sketch below).
5. Platform Accounts: Ensure your Instagram, TikTok, and YouTube accounts are connected to upload-post.com.
6. Error Notifications (optional): Configure Telegram bot credentials for error notifications.

Requirements

- **Accounts:** Google Drive, OpenAI, Airtable, upload-post.com, Telegram (optional)
- **API Keys & Credentials:** Google Drive OAuth2, OpenAI API key, Airtable API token, upload-post.com API token
- **Platform Setup:** Instagram, TikTok, and YouTube accounts connected to upload-post.com

Transform your video publishing workflow from hours of manual work into a fully automated system that handles everything from content analysis to multi-platform distribution and tracking.
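Once the token and user ID are in place, each HTTP request node essentially makes one multipart upload call. A minimal sketch of that call follows; the endpoint path, auth scheme, and field names are assumptions for illustration, so verify them against your upload-post.com dashboard before use:

```typescript
// Minimal sketch: publish one video to all three platforms via upload-post.com.
// The endpoint, header scheme, and field names are illustrative assumptions.
async function publishVideo(videoBlob: Blob, title: string): Promise<void> {
  const form = new FormData();
  form.append("user", "YOUR_UPLOAD_POST_USER_ID"); // the user ID set in the "Set Variables" node
  form.append("title", title);                     // the AI-generated description from OpenAI
  for (const platform of ["instagram", "tiktok", "youtube"]) {
    form.append("platform[]", platform);
  }
  form.append("video", videoBlob, "video.mp4");

  const res = await fetch("https://api.upload-post.com/api/upload", { // hypothetical endpoint
    method: "POST",
    headers: { Authorization: "Apikey YOUR_API_TOKEN" },
    body: form,
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  console.log(await res.json()); // platform-specific URLs to write back to Airtable
}
```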
by David Ashby
🛠️ Philips Hue Tool MCP Server

Complete MCP server exposing all Philips Hue Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup

> Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Philips Hue Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Philips Hue Tool node with full error handling

📋 Available Operations (4 total)

Every possible Philips Hue Tool operation is included:

🔧 Light (4 operations)
• Delete a light
• Get a light
• Get many lights
• Update a light

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Philips Hue Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Philips Hue Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
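Under the hood, each tool node's parameters are n8n expressions built on $fromAI(), which tells the connected AI agent what value to supply and of what type. A minimal sketch for the "Update a light" operation (the parameter keys and descriptions here are illustrative placeholders, not the node's exact field names):

```
// n8n evaluates these expressions when the agent calls the tool.
// $fromAI(key, description, type) asks the agent to supply the value.
// Keys and descriptions below are illustrative placeholders.
{{ $fromAI('lightId', 'ID of the Hue light to update', 'string') }}
{{ $fromAI('on', 'Whether the light should be turned on', 'boolean') }}
{{ $fromAI('brightness', 'Brightness percentage, 0-100', 'number') }}
```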
by Cecilia
Enable smart, real-time answers in your WhatsApp groups using a custom webhook, Pinecone vector database, and no Facebook Business setup.

> 🟡 Note: This template uses a custom WhatsApp webhook. It does not use the official WhatsApp Business API.

👥 Who is this for?

This workflow is designed for individuals and teams who want to enable smart WhatsApp group automation — without going through Meta's official WhatsApp Business API. Ideal for small businesses, internal teams, communities, and personal power users.

❓ What problem is this solving?

Setting up WhatsApp bots with intelligent responses often requires approval from Meta and a verified business account. This workflow removes those barriers by using a self-hosted webhook to handle incoming messages and respond using a document-trained AI via Pinecone.

⚙️ What this workflow does

- Connects a regular WhatsApp number to a custom webhook
- Adds the bot to any group chat (it stays silent unless mentioned)
- Indexes documents from Google Drive into Pinecone
- Responds with intelligent, context-aware answers from your custom knowledge base
- Auto-updates its knowledge every minute as the document changes

🛠️ Setup

Step 1: Connect Google Drive
- Set up your Google Drive credentials in n8n

Step 2: Configure Pinecone
- Create an index in Pinecone (dimension: 1536); see the sketch below
- Select this index in both Pinecone nodes
- Click Test Workflow to ingest your document into Pinecone

Step 3: Get Access to the WhatsApp Webhook
- Fill out this form to request access
- You'll receive a WhatsApp confirmation for linking

Step 4: Test WhatsApp Integration
- ✅ One-on-one test: Send a message from another number
- 👥 Group test: Add the bot to a group; it will only respond when tagged

🧩 How to customize this workflow

- Modify the system prompt inside the AI agent node to control tone and behavior
- Update the connected Google Doc to match your specific domain (e.g., FAQs, SOPs, product manuals)
- Adjust the Pinecone sync frequency if you want updates more or less often

📚 Use cases

- **Customer Support**: Instant, intelligent replies in WhatsApp without live agents
- **Team Knowledge Bot**: Tag the bot for quick access to SOPs and internal docs
- **Community Groups**: Automate common questions while keeping noise low
- **Personal AI Assistant**: A WhatsApp chatbot trained on your notes and files

📝 Sticky Note Suggestion

💬 What this template does:
> Enables an AI bot in your WhatsApp group that answers questions based on a Google Doc you provide. It uses a custom webhook, Google Drive, and Pinecone.

🔧 Requirements:
> Google Drive account
> Pinecone account with an index (dimension 1536)
> Access to the custom WhatsApp webhook (see setup steps)
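If you prefer to create the 1536-dimension index programmatically rather than in the Pinecone console, here is a minimal sketch assuming the official Node SDK; the index name, cloud, and region are placeholders to match to your own Pinecone project:

```typescript
// Minimal sketch: create the 1536-dimension Pinecone index the workflow expects.
// Index name and region are placeholders; the dimension must match the
// embedding model used by the Pinecone nodes in n8n.
import { Pinecone } from "@pinecone-database/pinecone";

const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY! });

await pc.createIndex({
  name: "whatsapp-knowledge-base", // select this same index in both Pinecone nodes
  dimension: 1536,                 // required by this template
  metric: "cosine",
  spec: { serverless: { cloud: "aws", region: "us-east-1" } },
});
```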
by Ranjan Dailata
Notice

Community nodes can only be installed on self-hosted instances of n8n.

Who this is for

This workflow automates the real-time extraction of job descriptions and salary information from job listing pages using Bright Data MCP, and analyzes the content using OpenAI GPT-4o mini. This workflow is ideal for:

- **Recruiters & HR Tech Startups**: Automate job data collection from public listings
- **Market Intelligence Teams**: Analyze compensation trends across companies or geographies
- **Job Boards & Aggregators**: Power search results with structured, enriched listings
- **AI Workflow Builders**: Extend to other career platforms or automate resume-job match analysis
- **Analysts & Researchers**: Track hiring signals and salary benchmarks in real time

What problem is this workflow solving?

Traditional scraping of job portals is challenging due to cluttered content, anti-scraping measures, and inconsistent formatting. Manually analyzing salary ranges and job descriptions is tedious and error-prone. This workflow solves the problem by:

- Simulating user behavior with the Bright Data MCP Client to bypass anti-scraping systems
- Extracting structured, clean job data in Markdown format
- Using OpenAI GPT-4o mini to analyze and extract precise salary details and refined job descriptions
- Merging and formatting the result for easy consumption
- Delivering the final output via webhook, Google Sheets, or the file system

What this workflow does

Input Nodes
- job_search_url: The job listing or search result URL
- job_role: The title or role being searched for (used in logging/formatting)

MCP Client Operations
- MCP Salary Data Extractor: Simulates browser behavior and scrapes salary-related content (if available)
- MCP Job Description Extractor: Extracts the full job description as structured Markdown content

OpenAI GPT-4o mini Nodes
- Salary Information Extractor: Uses GPT-4o mini to detect, clean, and standardize salary range data (if any)
- Job Description Refiner: Extracts role responsibilities, qualifications, and benefits from unstructured text
- Company Information Extractor: Uses Bright Data MCP and GPT-4o mini to extract company information

Merge Node
- Combines the refined job description and extracted salary information into a unified JSON response object

Aggregate Node
- Aggregates the job description and salary information into a single JSON response object

Final Output Handling

The output is handled in three different formats depending on your downstream needs:
- **Save to Disk**: Output is stored with a filename that includes the timestamp and job role
- **Google Sheet Update**: Adds a new row with job role, salary, summary, and link
- **Webhook Notification**: Pushes the merged response to an external system

Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post - model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need an OpenAI API key.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

Setup

- Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
- In n8n, configure the OpenAI account credentials.
- In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server, setting the Bright Data API token in the Environments field as API_TOKEN=<your-token> (see the credential sketch below).

How to customize this workflow to your needs

Modify Input Source
- Change job_search_url to point to any job board or aggregator
- Customize job_role to reflect the type of jobs being analyzed

Tweak LLM Prompts (Optional)
- Refine the GPT-4o mini prompts to extract additional fields like benefits, tech stacks, or remote eligibility

Change Output Format
- Customize the merged object to output JSON, CSV, or Markdown based on downstream needs
- Add additional destinations (e.g., Slack, Airtable, Notion) via n8n nodes
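For reference, a sketch of how the MCP Client (STDIO) credential fields typically look for the Bright Data server; the field labels follow n8n-nodes-mcp at the time of writing, and the optional zone variable assumes the mcp_unlocker zone created above:

```
Command:      npx
Arguments:    @brightdata/mcp
Environments: API_TOKEN=<your-token>
              WEB_UNLOCKER_ZONE=mcp_unlocker   (optional; behavior may vary by server version)
```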
by David Ashby
🛠️ ProfitWell Tool MCP Server

Complete MCP server exposing all ProfitWell Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup

> Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every ProfitWell Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n ProfitWell Tool node with full error handling

📋 Available Operations (2 total)

Every possible ProfitWell Tool operation is included:

🔧 Company (1 operation)
• Get settings for your company

🔧 Metric (1 operation)
• Get a metric

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native ProfitWell Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints (see the sketch below)

✨ Benefits

• Complete Coverage: Every ProfitWell Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
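For the "Direct HTTP calls" option, MCP requests are JSON-RPC 2.0 messages. A hedged sketch follows; the tool name and arguments are illustrative (use the "tools/list" method to see what your server actually exposes), and depending on your n8n version the MCP trigger may use the SSE transport, in which case an MCP client library is the safer route:

```typescript
// Minimal sketch: call an MCP tool exposed by this workflow from any HTTP client.
// Tool name and arguments are illustrative placeholders.
const res = await fetch("https://your-n8n-host/mcp/<webhook-path>", { // URL from the MCP trigger node
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Accept: "application/json, text/event-stream",
  },
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: { name: "get_metric", arguments: { type: "recurring_revenue" } }, // illustrative
  }),
});
console.log(await res.text());
```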
by David Ashby
🛠️ Mocean Tool MCP Server

Complete MCP server exposing all Mocean Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup

> Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Mocean Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Mocean Tool node with full error handling

📋 Available Operations (2 total)

Every possible Mocean Tool operation is included:

🔧 Sms (1 operation)
• Send an SMS

🔧 Voice (1 operation)
• Send a voice message

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Mocean Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Mocean Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
🛠️ Plivo Tool MCP Server

Complete MCP server exposing all Plivo Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup

> Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works

• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Plivo Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Plivo Tool node with full error handling

📋 Available Operations (3 total)

Every possible Plivo Tool operation is included:

🔧 Call (1 operation)
• Make a call

🔧 Mms (1 operation)
• Send an MMS

🔧 Sms (1 operation)
• Send an SMS

🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Plivo Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits

• Complete Coverage: Every Plivo Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Jimleuk
This n8n template showcases a cool feature of n8n Forms where the form itself can be defined dynamically using the form fields schema. It may be debatable how useful this template actually is, since both Airtable and Baserow already provide form interfaces, but it's still a great exercise and demonstration if the use-case ever comes around.

How it works

- A form trigger is used to dynamically select a database/table from which to build the n8n form.
- The table's schema is imported into the workflow and, using the Code node, converted into the n8n form fields schema (see the sketch below). This lets us dynamically build the fields in our n8n form when we choose to define the form using the JSON option.
- Once the n8n form submits, we convert the values back into our table's API schema so that we can create a new row. Note that any files/attachments fields are removed, as they need to be handled separately.
- Files are processed separately as they may first need to be stored. Once complete, the reference is saved into the newly created row.

Check out the example Airtable here - https://airtable.com/appfP15Xd0aVZR9xV/shrGFgXLyQ4Jg58SU

How to use

The n8n form is autogenerated, which means you only need to provide access to the table. Using this approach, this template can be reused for any number of Airtable and/or Baserow tables.

Requirements

- You'll need either an Airtable account or a Baserow account to use this template.
- An n8n instance accessible to your users.

Customising this workflow

- Not using either Airtable or Baserow? Theoretically any datastore which provides a fields schema can be used with this template.
- If you're feeling creative, split the table into multiple forms for a better user experience.
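A minimal sketch of the Code-node conversion step: mapping an Airtable table schema (from its metadata API) to the JSON the n8n Form node accepts under "Define using JSON". The form-field keys (fieldLabel, fieldType, fieldOptions) reflect the n8n Form node's schema at the time of writing, so verify them against your n8n version:

```typescript
// Map Airtable field types to n8n form field definitions.
type AirtableField = {
  name: string;
  type: string;
  options?: { choices?: { name: string }[] };
};

const typeMap: Record<string, string> = {
  singleLineText: "text",
  multilineText: "textarea",
  number: "number",
  date: "date",
  email: "email",
  singleSelect: "dropdown",
};

function toFormFields(fields: AirtableField[]) {
  return fields
    .filter((f) => f.type !== "multipleAttachments") // files are handled separately
    .map((f) => ({
      fieldLabel: f.name,
      fieldType: typeMap[f.type] ?? "text", // fall back to plain text for unknown types
      ...(f.type === "singleSelect" && {
        fieldOptions: { values: (f.options?.choices ?? []).map((c) => ({ option: c.name })) },
      }),
    }));
}

// In an n8n Code node this would end with something like:
// return [{ json: { formFields: toFormFields(schema.fields) } }];
```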
by Bela
Purpose of the workflow

Most scraping workflows get blocked by anti-bot technologies. To avoid this, you can use Scrappey to scrape any website you want.

How it works

- We use test data and make an API call to the Scrappey service.
- We get the scraped website data back as a result.

Setup steps

- Replace YOUR_API_KEY in the "Scrappey API Call" node with your Scrappey API key (Register For Free). See the request sketch below.
- Replace the test data with your production data. You can plug in any type of data connector at this point of your workflow.
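A minimal sketch of the HTTP request the "Scrappey API Call" node makes. The endpoint and body shape are based on Scrappey's public API as of writing; double-check against their docs before relying on it:

```typescript
// Minimal sketch: GET a page through Scrappey's anti-bot bypass.
const res = await fetch("https://publisher.scrappey.com/api/v1?key=YOUR_API_KEY", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    cmd: "request.get",          // a GET request routed through Scrappey
    url: "https://example.com",  // swap in the page you want to scrape
  }),
});
const data = await res.json();
console.log(data.solution?.response); // the rendered page content, if the call succeeded
```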
by Jimleuk
This n8n workflow builds another example of a knowledge-base assistant, but demonstrates how a more deliberate and targeted approach to ingesting the data can produce much better results for your chatbot. In this example, a government tax code policy document is used. While we could split the document into chunks by content length, we would often lose the context of chapters and sections that the user may need. Our approach, then, is to first split the document into chapters and sections before importing it into our vector store. Additionally, using metadata correctly is key to allowing filtering and scoped queries.

Example

Human: "Tell me about what the tax code says about cargo for intentional commerce?"
AI: "Section 11.25 of the Texas Property Tax Code pertains to "MARINE CARGO CONTAINERS USED EXCLUSIVELY IN INTERNATIONAL COMMERCE." In this section, a person who is a citizen of a foreign country or an en..."

How it works

- The tax code policy document is downloaded as a zip file from the government website and its pages are extracted as separate chapters.
- Each chapter is then parsed and split into its sections using data manipulation expressions.
- Each section is then inserted into our Qdrant vector store, tagged with its source, chapter, and section numbers as metadata.
- When our AI Agent needs to retrieve data from our vector store, we use a custom workflow tool to perform the query to Qdrant. Because we're relying on Qdrant's advanced filtering capabilities, we perform the search using the Qdrant API rather than the Qdrant node.
- When the AI Agent needs to pull full wording or extracts, we can use Qdrant's scroll API and metadata filtering to do so (see the sketch below). This makes Qdrant behave like a key-value store for our document.

Requirements

- A Qdrant instance is required for the vector store, specifically for its filtering functionality.
- A Mistral.ai account for embeddings and AI models.

Customising this workflow

- Depending on your use-case, consider returning actual PDF pages (or links) to the user for extra confirmation and to build trust.
- Not using Mistral? You can swap it out, but be sure to match the distance metric and dimension size of the Qdrant collection to your chosen embedding model.
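For illustration, the scroll call the custom workflow tool can make to pull every point for one chapter or section, key-value style. The collection name and payload keys are illustrative; match them to how your ingestion step tags each section:

```typescript
// Minimal sketch: Qdrant's scroll API with a metadata filter.
const res = await fetch("http://localhost:6333/collections/tax_code/points/scroll", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    filter: {
      must: [
        { key: "metadata.chapter", match: { value: "11" } },
        { key: "metadata.section", match: { value: "11.25" } },
      ],
    },
    with_payload: true, // return the stored text, not just point IDs
    limit: 50,
  }),
});
const { result } = await res.json();
console.log(result.points.map((p: { payload: unknown }) => p.payload));
```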
by Oneclick AI Squad
An intelligent food menu update notification system that automatically detects changes in your restaurant's special menu and sends personalized notifications to customers via multiple channels - WhatsApp, Email, and SMS. This workflow ensures your customers are always informed about new dishes, price changes, and menu availability in real time.

What's the Goal?

- Automatically monitor special menu updates from Google Sheets
- Detect menu changes and generate alert messages using AI
- Send multi-channel notifications (WhatsApp, Email, SMS) based on customer preferences
- Maintain comprehensive notification logs for tracking and analytics
- Provide seamless customer communication for menu updates
- Enable restaurant owners to keep customers engaged with the latest offerings

By the end, you'll have a fully automated menu notification system that keeps your customers informed and engaged with your latest culinary offerings.

Why Does It Matter?

Manual menu update communication is time-consuming and often missed by customers. Here's why this workflow is essential for restaurants:

- **Real-Time Updates**: Customers receive instant notifications about menu changes
- **Multi-Channel Reach**: WhatsApp, Email, and SMS ensure maximum customer reach
- **Personalized Experience**: Customers receive notifications via their preferred channels
- **Increased Sales**: Immediate awareness of new items drives orders
- **Customer Retention**: Regular updates keep customers engaged and coming back
- **Operational Efficiency**: Eliminates manual notification tasks for staff
- **Data-Driven Insights**: Comprehensive logging for marketing analytics

Think of it as your restaurant's digital menu announcer that never misses an update.

How It Works

Step 1: Menu Monitoring
- Node: Daily Menu Update Scheduler
- Function: Triggers the workflow on a scheduled basis
- Frequency: Configurable (hourly, daily, or real-time)

Step 2: Data Retrieval
- Node: Fetch Special Menu Data
- Function: Pulls current menu data from Google Sheets (Sheet 1)
- Data: Retrieves item details, prices, descriptions, and availability

Step 3: Change Detection
- Node: Detect Menu Changes
- Function: Compares current data with the previous state (see the sketch after these steps)
- Logic: Identifies new items, price changes, or availability updates

Step 4: AI Content Generation
- Node: Generate Menu Alert Message
- Function: Creates engaging notification content using AI
- Output: Formatted message with new items, descriptions, and prices

Step 5: Customer Data Processing
- Node: Fetch Customer Contact List
- Function: Retrieves customer preferences from Google Sheets (Sheet 2)
- Filter: Segments customers by notification preferences

Step 6: Multi-Channel Delivery

The workflow splits into three parallel notification channels:

WhatsApp Branch
- Filter WhatsApp Users: identifies customers with WhatsApp notifications enabled
- Send WhatsApp Notification: delivers menu updates via WhatsApp
- Log WhatsApp Status: records delivery status in Sheet 3

Email Branch
- Filter Email Users: identifies customers with email notifications enabled
- Send Menu Email: delivers formatted email notifications
- Log Email Status: records delivery status in Sheet 3

SMS Branch
- Filter SMS Users: identifies customers with SMS notifications enabled
- Send Twilio SMS Alert: delivers text message notifications via Twilio
- Log SMS Status: records delivery status in Sheet 3
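A minimal sketch of what the Detect Menu Changes step can do, written as it might appear in an n8n Code node. The column names mirror the Sheet 1 structure described below; persisting the previous snapshot (e.g., in workflow static data) is left out:

```typescript
// Diff the current sheet rows against a previously stored snapshot.
type MenuItem = { itemId: string; itemName: string; price: string; available: string };

function detectChanges(current: MenuItem[], previous: MenuItem[]) {
  const prevById = new Map(previous.map((i) => [i.itemId, i]));
  const newItems = current.filter((i) => !prevById.has(i.itemId));       // not in the snapshot
  const changedItems = current.filter((i) => {
    const old = prevById.get(i.itemId);
    return old !== undefined && (old.price !== i.price || old.available !== i.available);
  });
  return { newItems, changedItems, hasChanges: newItems.length + changedItems.length > 0 };
}
```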
Step 7: Comprehensive Logging
All notification activities are logged in Sheet 3 for tracking and analytics.

Google Sheets Structure

Sheet 1: Special Menu

| Column | Description | Example |
|--------|-------------|---------|
| Item ID | Unique identifier for menu item | "ITEM001" |
| Item Name | Name of the dish | "Truffle Risotto" |
| Price | Item price | "$28.99" |
| Description | Detailed item description | "Creamy arborio rice with black truffle, parmesan, and wild mushrooms" |
| Nutritions | Nutritional information | "Calories: 450, Protein: 15g" |
| Category | Menu category | "Main Course" |
| Available | Availability status | "Yes" / "No" |

Sheet 2: Customer Database

| Column | Description | Example |
|--------|-------------|---------|
| Customer Name | Customer's full name | "ABC" |
| Email | Customer's email address | "abc@gmail.com" |
| Phone Number | Customer's phone number | "91999999999" |
| WhatsApp Number | Customer's WhatsApp number | "91999999999" |
| Email Notifications | Email preference | "Yes" / "No" |
| SMS Notifications | SMS preference | "Yes" / "No" |
| WhatsApp Notifications | WhatsApp preference | "Yes" / "No" |

Sheet 3: Notification Logs

| Column | Description | Example |
|--------|-------------|---------|
| Timestamp | Notification send time | "2025-07-09T12:51:09.587Z" |
| Customer Name | Recipient name | "ABC" |
| Notification Type | Channel used | "Email" / "SMS" / "WhatsApp" |
| Status | Delivery status | "Sent" / "Failed" / "Pending" |
| Message | Content sent | "SPECIAL MENU UPDATE..." |

How to Use the Workflow

Prerequisites

- Google Sheets Setup: Create three sheets with the required structure
- n8n Account: Access to the n8n workflow platform
- WhatsApp Business API: WhatsApp Business account with API access
- Email Service: Gmail or SMTP service for email notifications
- Twilio Account: Twilio account for SMS functionality
- AI Model Access: OpenAI or a similar AI service for content generation

Importing the Workflow in n8n

Step 1: Obtain the Workflow JSON
- Export the workflow from your n8n instance or obtain the JSON file
- Ensure you have the complete workflow configuration

Step 2: Access the n8n Workflow Editor
- Log in to your n8n instance (Cloud or self-hosted)
- Navigate to the Workflows section
- Click "Add Workflow" to create a new workflow

Step 3: Import the Workflow
- Option A (Import from Clipboard): Click the three dots (⋯) in the top-right corner, select "Import from Clipboard", paste the JSON code into the text box, and click "Import" to load the workflow
- Option B (Import from File): Click the three dots (⋯) in the top-right corner, select "Import from File", choose the .json file from your computer, and click "Open" to import the workflow

Configuration Setup

Google Sheets Integration
- Authentication: Connect your Google account in n8n
- Sheet 1 Configuration: Set spreadsheet ID and range for menu data
- Sheet 2 Configuration: Set spreadsheet ID and range for customer data
- Sheet 3 Configuration: Set spreadsheet ID and range for notification logs

WhatsApp Integration
- WhatsApp Business API: Set up WhatsApp Business API credentials
- Webhook Configuration: Configure webhook URLs for message delivery
- Message Templates: Create approved message templates for menu updates

Email Integration
- Gmail/SMTP Setup: Configure email service credentials
- Email Templates: Design HTML email templates for menu notifications
- Sender Configuration: Set sender name and email address

Twilio SMS Integration
- Twilio Account: Set up Twilio Account SID and Auth Token
- Phone Number: Configure Twilio phone number for SMS sending
- Message Templates: Create SMS message templates

AI Content Generation
- API Configuration: Set up OpenAI or preferred AI service credentials
- Prompt Customization: Configure prompts for menu update content
- Content Parameters: Set message tone, length, and style

Workflow Execution

Automatic Execution
- Scheduled Triggers: Set up cron expressions for regular checks
- Webhook Triggers: Configure real-time triggers for immediate updates
- Manual Triggers: Enable manual execution for testing

Monitoring and Maintenance
- Execution Logs: Monitor workflow execution through the n8n interface
- Error Handling: Set up error notifications and retry mechanisms
- Performance Monitoring: Track execution times and success rates

Sample Notification Message

> SPECIAL MENU UPDATE 🍽️
>
> NEW ITEMS:
> • Truffle Risotto - $28.99
>   Creamy arborio rice with black truffle, parmesan, and wild mushrooms
> • Chocolate Lava Cake - $18.99
>   Warm chocolate cake with molten center, vanilla ice cream
>
> Total Menu Items: 2
> Updated: 7/9/2025, 12:10:50 PM
>
> Visit our restaurant or call to place your order! 📞

Best Practices

Data Management
- Regularly validate customer contact information
- Keep menu data updated and accurate
- Maintain clean customer preference settings

Notification Strategy
- Send notifications during optimal hours (lunch/dinner time)
- Limit frequency to avoid customer fatigue
- Personalize messages based on customer preferences

Content Quality
- Use engaging language and emojis appropriately
- Include clear pricing and descriptions
- Add a call-to-action for immediate orders

Performance Optimization
- Batch process notifications to avoid rate limits
- Implement retry logic for failed deliveries
- Monitor API quotas and usage limits

Troubleshooting

Common Issues
- **Authentication Errors**: Verify API credentials and permissions
- **Rate Limiting**: Implement delays between notifications
- **Message Delivery**: Check phone number formats and email addresses
- **Sheet Access**: Ensure proper sharing permissions

Error Handling
- Set up notification alerts for workflow failures
- Implement fallback mechanisms for service outages
- Maintain backup notification methods

Analytics and Reporting

Key Metrics
- **Delivery Rates**: Track successful notifications by channel
- **Customer Engagement**: Monitor response rates and feedback
- **Menu Performance**: Analyze which items generate the most interest
- **Channel Effectiveness**: Compare performance across WhatsApp, Email, and SMS

Reporting Features
- Automated daily/weekly reports
- Customer preference analytics
- Notification performance dashboards
- Revenue correlation with menu updates

Security and Compliance

Data Protection
- Secure storage of customer contact information
- Compliance with GDPR and local privacy laws
- Regular security audits of API access

Rate Limiting
- Respect platform rate limits (WhatsApp, Twilio, Email)
- Implement queuing systems for high-volume notifications
- Monitor and adjust sending frequencies

Conclusion

The Food Menu Update Notifier transforms restaurant communication from reactive to proactive, ensuring customers are always informed about your latest offerings. By leveraging multiple communication channels and AI-generated content, this workflow creates a seamless bridge between your kitchen innovations and customer awareness.

This system not only improves customer engagement but also drives immediate sales through timely notifications about new menu items, special offers, and seasonal dishes. The comprehensive logging and analytics capabilities provide valuable insights for menu optimization and marketing strategy refinement.
by Ranjan Dailata
Who this is for?

The Structured Data Extract & Data Mining workflow is crafted for researchers, content analysts, SEO strategists, and AI developers who need to transform semi-structured web data (like markdown content or scraped HTML) into actionable structured datasets. It is ideal for:

- **Content Analysts** - Organizing and mining large volumes of markdown or HTML content
- **SEO & Trend Researchers** - Exploring topics by location and category
- **AI Engineers & NLP Developers** - Looking to automate insight extraction from unstructured inputs
- **Growth Marketers** - Tracking topic-level trends for strategic campaigns
- **Automation Specialists** - Streamlining workflows from scrape to storage

What problem is this workflow solving?

Extracting insights from markdown or HTML documents typically requires manual review, formatting, and parsing. This becomes unscalable when dealing with large datasets or when a real-time response is needed. Additionally, trend and topic extraction usually involves external tools, custom scripts, and inconsistent formatting.

This workflow solves:
- Automatic text extraction from markdown or structured content
- Location- and category-based trend mining with semantic grouping
- AI-driven topic extraction and summarization
- Real-time notification via webhook with rich structured payloads
- Persistent storage of mined data to disk for audits or further processing

What this workflow does

1. Receives input: sets the URL for the data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from relevant sites (see the request sketch below).
3. A Markdown/Text Extractor node parses the content into clean plaintext.
4. The cleaned data is passed to Google Gemini to identify trends by location and category, extract key topics and themes, and format the response into structured JSON.
5. The structured insights are sent via Webhook Notification to external systems (e.g., Slack, web apps, Zapier).
6. The final output is saved to disk.

Setup

- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
- You need a Google Gemini API key (or access through Vertex AI or a proxy).
- Update the Set URL and Bright Data Zone node with the brand content URL and the Bright Data zone name.
- Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs

- **Update Source**: Update the workflow input to read from Google Sheets or Airtable for dynamically tracking multiple brands or topics.
- **Gemini Prompt Customization**: Extract trends within a custom category (e.g., e-commerce design patterns in the US), output topics with popularity metrics, or structure the output to match your database schema (e.g., [{ topic, trend_score, location }]).
- **Webhook Output**: Send notifications to Slack (with AI summaries in rich blocks), internal APIs (for use in dashboards), or Zapier/Make (for multi-step automation).
- **Persistence**: Save output to remote FTP or SFTP storage, Amazon S3, Google Cloud Storage, etc.
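Step 2's extraction maps to a single HTTP request against Bright Data's Web Unlocker. A minimal sketch follows, assuming the documented /request endpoint; the zone name and target URL are placeholders, and the Bearer token is the Web Unlocker token from the Setup section:

```typescript
// Minimal sketch: fetch a page through Bright Data Web Unlocker.
const res = await fetch("https://api.brightdata.com/request", {
  method: "POST",
  headers: {
    Authorization: "Bearer <your-web-unlocker-token>",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    zone: "web_unlocker1",       // your Web Unlocker zone name
    url: "https://example.com",  // the URL set in the "Set URL and Bright Data Zone" node
    format: "raw",               // return the raw page body
  }),
});
console.log(await res.text());   // content handed to the Markdown/Text Extractor node
```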