by David Olusola
A complete, ready-to-deploy Telegram chatbot template for food delivery businesses. This intelligent assistant handles orders, payments, customer service, and order tracking with human-in-the-loop payment verification.

**Key Features**
- **AI-Powered Conversations** - Natural language order processing using Google Gemini
- **Telegram Integration** - Seamless customer interaction via Telegram
- **Payment Verification** - Screenshot-based payment confirmation with admin approval
- **Order Tracking** - Automatic Google Sheets logging of all orders
- **Memory Management** - Contextual conversation memory for a better customer experience
- **Multi-Currency Support** - Easily customizable for any currency (USD, EUR, GBP, etc.)
- **Location Flexible** - Adaptable to any city or country
- **Human Oversight** - Manual payment approval workflow for security

**What This Template Includes**

Core Workflow
- **Customer Interaction** - AI assistant takes orders via Telegram
- **Order Confirmation** - Summarizes the order with total and payment details
- **Information Collection** - Gathers customer name, phone, and delivery address
- **Payment Processing** - Handles payment screenshots and verification
- **Admin Approval** - Human verification of payments before order confirmation
- **Order Tracking** - Automatic logging to Google Sheets with delivery estimates

Technical Components
- **AI Agent Node** - Google Gemini-powered conversation handler
- **Memory System** - Maintains conversation context per customer
- **Google Sheets Integration** - Automatic order logging and tracking
- **Telegram Nodes** - Customer and admin communication
- **Payment Verification** - Screenshot detection and approval workflow
- **Conditional Logic** - Smart routing based on message types

**Quick Setup Guide**

Prerequisites
- n8n instance (cloud or self-hosted)
- Telegram Bot Token
- Google Sheets API access
- Google Gemini API key

Step 1: Replace Placeholders
Search and replace the following placeholders throughout the template:

Business Information
- [YOUR_BUSINESS_NAME] → Your restaurant/food business name
- [ASSISTANT_NAME] → Your bot's name (e.g., "Alex", "Bella", "Chef Bot")
- [YOUR_CITY] → Your city
- [YOUR_COUNTRY] → Your country
- [YOUR_ADDRESS] → Your business address
- [YOUR_PHONE] → Your business phone number
- [YOUR_EMAIL] → Your business email
- [YOUR_HOURS] → Your operating hours (e.g., "9AM - 11PM daily")

Currency & Localization
- [YOUR_CURRENCY] → Your currency name (e.g., "USD", "EUR", "GBP")
- [CURRENCY_SYMBOL] → Your currency symbol (e.g., "$", "€", "£")
- [YOUR_TIMEZONE] → Your timezone (e.g., "EST", "PST", "GMT")
- [PREFIX] → Order ID prefix (e.g., "FB" for "Food Business")

Menu Items (Customize Completely)
- [CATEGORY_1] → Food category (e.g., "Burgers", "Pizza", "Sandwiches")
- [ITEM_1] through [ITEM_8] → Your menu items
- [PRICE_1] through [DELIVERY_FEE] → Your prices
- Add or remove categories and items as needed

Payment & Support
- [YOUR_PAYMENT_DETAILS] → Your payment information
- [YOUR_PAYMENT_PROVIDER] → Your payment method (e.g., "Venmo", "PayPal", "Bank Transfer")
- [YOUR_SUPPORT_HANDLE] → Your Telegram support username

Step 2: Configure Credentials
- **Telegram Bot** - Add your bot token to Telegram credentials
- **Google Sheets** - Connect your Google account and create/select your orders spreadsheet
- **Google Gemini** - Add your Gemini API key
- **Sheet ID** - Replace [YOUR_GOOGLE_SHEET_ID] with your actual Google Sheet ID

Step 3: Customize Menu
Update the menu section in the AI Agent system message with your actual:
- Food categories
- Item names and prices
- Delivery fees
- Any special offerings or combos

Step 4: Test & Deploy
- Import the template into your n8n instance
- Test the conversation flow with a test Telegram account
- Verify Google Sheets logging works correctly
- Test the payment approval workflow
- Activate the workflow

**Currency Examples**

USD Version

MENU & PRICES (USD)
Burgers
- Classic Burger - $12.99
- Cheese Burger - $14.99
- Deluxe Burger - $18.99
- Delivery Fee - $3.99

EUR Version

MENU & PRICES (EUR)
Burgers
- Classic Burger - €11.50
- Cheese Burger - €13.50
- Deluxe Burger - €17.50
- Delivery Fee - €3.50

**Google Sheets Structure**
The template automatically logs orders with these columns (a sketch of the row-building step appears at the end of this listing): Order ID, Customer Name, Chat ID, Phone Number, Delivery Address, Order Info, Total Price, Payment Status, Order Status, Timestamp.

**Customization Options**

Easy Customizations
- **Menu Items** - Add/remove/modify any food items
- **Pricing** - Update to your local pricing structure
- **Currency** - Change to any currency worldwide
- **Business Hours** - Modify operating hours
- **Delivery Areas** - Add location restrictions
- **Payment Methods** - Update payment information
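The Google Sheets logging step is easiest to reason about as a single row-building function. Below is a minimal TypeScript sketch of that logic, assuming the column names listed under Google Sheets Structure above; the input field names and the "FB" prefix are illustrative placeholders, not the template's actual node configuration.

```typescript
// Sketch of the row-building logic behind the "append order to Google Sheets" step.
// Column names mirror the structure listed above; input field names are assumptions.

interface OrderInput {
  customerName: string;
  chatId: string;
  phone: string;
  address: string;
  orderInfo: string;
  totalPrice: number;
}

function buildOrderRow(order: OrderInput, prefix = "FB"): Record<string, string | number> {
  return {
    "Order ID": `${prefix}-${Date.now()}`,
    "Customer Name": order.customerName,
    "Chat ID": order.chatId,
    "Phone Number": order.phone,
    "Delivery Address": order.address,
    "Order Info": order.orderInfo,
    "Total Price": order.totalPrice,
    "Payment Status": "Pending verification", // updated after admin approval
    "Order Status": "Received",
    "Timestamp": new Date().toISOString(),
  };
}

// Example usage
console.log(buildOrderRow({
  customerName: "Jane Doe",
  chatId: "123456789",
  phone: "+1 555 0100",
  address: "42 Example Street",
  orderInfo: "1x Classic Burger, 1x Fries",
  totalPrice: 16.98,
}));
```

In the template itself this mapping lives in the Google Sheets node's column mapping (or a preceding Code node), so adapt the field names to your own form and sheet.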
by Risper
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How It Works**
This n8n workflow automatically discovers high-quality business leads from Reddit by analysing posts across targeted subreddits:
1. Loads your business profile from a connected Google Sheet.
2. Uses AI to identify relevant subreddits where your potential customers engage.
3. Generates intent-based Reddit search queries based on your services, keywords, and client pain points.
4. Searches Reddit in real time using the generated queries.
5. Classifies posts based on whether they show lead potential.
6. Analyses high-potential posts for service fit, urgency, and estimated value.
7. Filters and scores leads to prioritize high-conversion opportunities (see the scoring sketch after this section).
8. Saves the most promising leads to a dedicated Google Sheet.
9. Sends Slack alerts to notify your sales team for immediate follow-up.

**Requirements**
Before using this workflow, ensure the following services are connected and configured:
- **Google Sheets (OAuth2)**: Reads your business profile and writes qualified leads
- **Reddit (OAuth2)**: Performs Reddit post searches based on generated queries
- **Google Gemini API**: Analyses posts, generates queries, and extracts insights
- **Slack API**: Notifies your team with qualified lead summaries

**Google Sheets Setup**
You will need two Google Sheets:

Business Profile Sheet (Input)
This sheet contains a single row describing your service business. The workflow reads it to generate relevant subreddit selections and search queries. Required fields (as headers in row 1): profession, industry, primary_services, service_keywords, target_client_profile, pain_points, intent_signals, urgency_indicators, price_range

Reddit Leads Sheet (Output)
This sheet stores high-quality Reddit posts identified as potential leads. The workflow appends or updates rows based on post_id to avoid duplication. Expected columns: post_id, post_url, post_title, post_post, post_subreddit, post_date
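To make the "filters and scores leads" step concrete, here is a small TypeScript sketch of one way the scoring could work, assuming the AI analysis step returns numeric service-fit, urgency, and estimated-value fields. The field names, weights, and threshold are assumptions you would tune to your own pipeline.

```typescript
// Sketch of the filter-and-score step. Field names and weights are illustrative.

interface AnalyzedPost {
  post_id: string;
  post_url: string;
  post_title: string;
  service_fit: number;      // 0 to 10: how well the post matches your services
  urgency: number;          // 0 to 10: how urgent the poster's need appears
  estimated_value: number;  // rough deal size in your currency
}

function scoreLead(p: AnalyzedPost): number {
  // Weighted blend; value is capped so a single huge deal does not dominate.
  return 0.5 * p.service_fit + 0.3 * p.urgency + 0.2 * Math.min(p.estimated_value / 1000, 10);
}

function topLeads(posts: AnalyzedPost[], minScore = 6) {
  return posts
    .map(p => ({ ...p, score: scoreLead(p) }))
    .filter(p => p.score >= minScore)
    .sort((a, b) => b.score - a.score);
}
```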
by Yang
**What this workflow does**
This workflow turns a user-submitted form with country or animal names into a cinematic video with animated scenes and immersive ambient audio. Using GPT-4 for prompt generation, Dumpling AI for visual creation, Replicate for motion animation, ElevenLabs for sound generation, and Creatomate for video stitching, it fully automates video production, from raw idea to rendered file.

**What problem is this solving?**
Creating engaging multimedia content can take hours. This workflow automates the entire process of ideation, design, and rendering of high-quality cinematic clips, eliminating the need for manual video editing or audio production.

**Who is this for?**
- Content creators and educators
- Digital artists and storytellers
- Marketers or YouTubers creating short-form visual content
- No-code/AI automation enthusiasts

**Setup Instructions**

Step 1: Google Sheet
- Create a Google Sheet with two columns: Title, Generated videos
- Update the Sheet ID and tab name in the final node

Step 2: Google Drive
- Create two folders: one for ambient audio tracks, one for final generated videos
- Update the folder IDs in both Google Drive nodes

Step 3: Credentials Setup
Make sure all your API tokens are saved as credentials in n8n. This workflow uses the following integrations: OpenAI (GPT-4), Dumpling AI (via HTTP header), Replicate.com, ElevenLabs, Google Drive, Google Sheets, Creatomate.

Step 4: Form Fields
Ensure your trigger form includes these fields: Title; Country 1, Country 2, Country 3, Country 4; Style (e.g., cinematic, epic, fantasy, noir, etc.)

**How it works**
1. **User Form Submission** - Kicks off the workflow with the required inputs.
2. **Format Inputs** - Combines all 4 countries/animals into a single array (see the sketch after this section).
3. **GPT-4: Generate Visual Prompts** - Uses GPT-4 to create rich cinematic descriptions per animal/country.
4. **Dumpling AI: Create Images** - Each description becomes a high-quality visual.
5. **GPT-4: Create Motion Prompts** - Each image prompt is rewritten into a motion-based video prompt.
6. **Replicate: Animate** - Prompts and images are sent to Replicate's model for animation.
7. **GPT-4: Generate Sound Prompt** - Based on the style, GPT-4 creates an ambient sound idea.
8. **ElevenLabs: Create Ambient Audio** - Audio is generated and uploaded to Google Drive.
9. **Creatomate: Stitch All Media** - All 4 motion videos and the audio track are stitched into one cinematic output.
10. **Upload to Google Drive + Log to Sheet** - The final video is saved in Drive and logged in Sheets with its title and link.

**How to Customize**
- Modify GPT prompts for different themes (e.g., horror, fantasy, sci-fi).
- Swap animals for characters, objects, or locations.
- Replace ambient sound with ElevenLabs voiceovers or music.
- Add metadata logging (generation time, duration, tags).
- Try alternative video tools like Pika Labs or Runway ML.

**Requirements**
- n8n self-hosted or cloud instance
- Active accounts for OpenAI, Dumpling AI, Replicate, ElevenLabs, and Creatomate
- Google credentials set up for Drive + Sheets

This is a perfect end-to-end automation that showcases the power of AI + automation for video storytelling.
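The "Format Inputs" step is a simple transformation. Here is a minimal TypeScript sketch of it, assuming the form field names listed in Step 4 above; empty fields are simply dropped.

```typescript
// Sketch of the "Format Inputs" step: collect the four form fields into one array.

interface FormSubmission {
  Title: string;
  "Country 1": string;
  "Country 2": string;
  "Country 3": string;
  "Country 4": string;
  Style: string;
}

function formatInputs(form: FormSubmission): { title: string; subjects: string[]; style: string } {
  const subjects = [form["Country 1"], form["Country 2"], form["Country 3"], form["Country 4"]]
    .filter(s => s && s.trim().length > 0);
  return { title: form.Title, subjects, style: form.Style };
}

// Example
console.log(formatInputs({
  Title: "Big Cats of the World",
  "Country 1": "Lion", "Country 2": "Tiger", "Country 3": "Jaguar", "Country 4": "Snow Leopard",
  Style: "cinematic",
}));
```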
by Dele Tosh
Automate your social media presence! This workflow duo automatically curates content from your Wallabag RSS feeds, generates platform-specific posts using AI, and publishes them, complete with AI-generated images.

**Setup & Configuration**

Required Credentials & Services
To run this workflow, you will need to set up the following credentials in n8n:
- **Wallabag RSS Feeds:** This workflow uses Wallabag as your content curation service. Wallabag is the most cost-effective option: easy to self-host or use as a paid service. You'll need to generate access tokens for RSS feed access.
- **Airtable API Key:** To create and update records in your "Content Store" database.
- **LLM Provider API Key:** To power the social media content generation. The demo uses Groq (llama-3.3-70b-versatile), but this can be replaced with any preferred LLM (OpenAI, Anthropic, etc.).
- **GetLate API Key:** To authenticate and post to your social media platforms.
- **Imgbb API Key:** To host the AI-generated images.
- **Image Generation Service API Key:** For creating images from prompts. The demo uses Hugging Face (stable-diffusion-xl-base-1.0), but this can be swapped for any other service (Fal.ai, Stability AI, etc.).

Key Setup Requirement: Define Your Tagging Convention
Before running this workflow, establish a consistent tagging system in Wallabag where each tag corresponds to a specific social media platform. For example:
- #to-share-linkedin for LinkedIn content
- #to-share-bluesky for Bluesky content
- #to-share-instagram for Instagram content

Adding More Feeds & Platforms
- **New Feeds:** Simply duplicate the example sub-workflows and update the RSS feed URL and target platform information for each new tag/platform combination.
- **New Platforms:** To add support for additional social platforms, duplicate one of the existing platform sub-workflows (LinkedIn or Bluesky) and update the platform-specific parameters, prompts, and GetLate API settings for your new platform.

Airtable Database Schema: "RSS Feed - Content Store"
The workflow uses an Airtable base with the following fields to track content from ingestion to publication:
- id (Primary Field): Formula, for unique record ID (RECORD_ID())
- audience_targeted: Long text
- author: Long text
- character_count: Number
- content_markdown: Long text
- cta_used: Long text
- feed_id: Single line text
- goal_applied: Long text
- image_filename: Single line text
- image_id: Single line text
- image_link: URL
- image_prompt: Long text
- is_posted: Number, default is 0
- platform: Single line text
- post_text: Long text
- suggested_hashtags: Long text
- title: Long text
- tone_applied: Long text
- article_url: URL

Customizable User Inputs
This workflow is built for flexibility. Key inputs you can customize include:
- **Wallabag RSS Feed URL & Platform Tag:** The specific feed and platform-specific tag (e.g., #to-share-linkedin) to monitor in Workflow #1.
- **Target Social Platform:** Defined per feed in the "Edit Field" node.
- **Content Generation Schedule:** The frequency for auto-posting in the Schedule Trigger.
- **Brand Voice & LLM Parameters:** The tone, style, and specific instructions for the AI in the "Set Custom SMCG Prompt" node.
- **Platform-Specific Prompts:** The template used to generate posts for each social network (Instagram, LinkedIn, etc.).
- **Posting Behavior & Image Generation:** Configured within the **SMCG (Social Media Content Generation)** node. This is where you set the posting mode (immediate vs. draft) and define a boolean for each platform to enable or disable AI-generated images for its posts.
**Workflow 1: RSS Aggregator & Content Store**
- **RSS Trigger** - Pulls tagged articles from Wallabag feeds using platform-specific tags
- **Platform Assignment** - Sets the target social platform based on the tag
- **Content Conversion** - HTML to Markdown formatting
- **Airtable Storage** - Saves articles to the content database

Adding New RSS Feeds: To monitor additional Wallabag feeds for different content sources, simply duplicate the existing RSS feed sub-workflow and update the RSS URL with your new Wallabag access token and platform-specific tag. Each feed can target a different social platform or content category.

**Workflow 2: AI Content Generator & Publisher**
- **Schedule Trigger** - Runs on your preferred frequency
- **Content Selection** - Pulls unpublished articles from Airtable
- **AI Configuration** - Sets brand voice, posting behavior, and image generation preferences
- **Platform Routing** - Directs to the appropriate social platform sub-workflow (see the routing sketch after this section)
- **AI Content Generation** - Creates posts and image prompts using the LLM
- **Image Generation** - Creates and hosts images when enabled
- **Social Publishing** - Posts to platforms via the GetLate API
- **Database Update** - Marks content as published in Airtable

Adding New Platform Support: To extend this workflow to additional social platforms, simply duplicate one of the existing platform sub-workflows and update the platform-specific parameters, LLM prompts, and GetLate API configuration for your target platform.
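The tag-to-platform routing can be pictured as a small lookup table. A TypeScript sketch, assuming the tag convention from the setup section (#to-share-linkedin, #to-share-bluesky, #to-share-instagram); extend the map as you add feeds or platforms. In the workflow this is a Switch/Edit Field step rather than code.

```typescript
// Sketch of the tag-to-platform routing applied to each incoming feed item.

const TAG_TO_PLATFORM: Record<string, string> = {
  "to-share-linkedin": "linkedin",
  "to-share-bluesky": "bluesky",
  "to-share-instagram": "instagram",
};

function resolvePlatform(tags: string[]): string | undefined {
  const match = tags.find(t => TAG_TO_PLATFORM[t.replace(/^#/, "")]);
  return match ? TAG_TO_PLATFORM[match.replace(/^#/, "")] : undefined;
}

// Example: an article tagged for LinkedIn
console.log(resolvePlatform(["#to-share-linkedin", "ai"])); // "linkedin"
```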
by InfyOm Technologies
**What problem does this workflow solve?**
Many websites lack a smart, searchable interface, and visitors often leave due to unanswered questions. This workflow transforms any website into a Retrieval-Augmented Generation (RAG) chatbot: it automatically extracts content, creates embeddings, and enables real-time, context-aware chat on your own site.

**What does this workflow do?**
- Accepts a website URL through a form trigger.
- Fetches and cleans website content.
- Parses content into smaller sections.
- Generates vector embeddings using OpenAI (or your embedding model).
- Stores embeddings and metadata in Supabase's vector database.
- When a user asks a question:
  - Searches Supabase for relevant chunks via similarity search.
  - Retrieves matching content as context.
  - Sends the context + question to OpenAI to generate an accurate answer.
  - Returns the AI-generated response to the user in the chat interface.

**Setup Instructions**

Website Form Trigger
- Use a Form / HTTP Trigger to submit website URLs for indexing.

Content Extraction & Chunking
- Use HTTP nodes to fetch the HTML.
- Clean and parse it (e.g., remove scripts, ads).
- Use a Function node to split it into manageable text chunks.

Embedding Generation
- Call OpenAI (or Cohere) to generate embeddings for each chunk.
- Insert vectors and metadata into Supabase via its API or the n8n Supabase node.

User Query Handling
- Use a Chat Trigger (webhook/UI) to receive user questions.
- Convert the question into an embedding.
- Query Supabase with a similarity search (e.g., the match_documents RPC).
- Retrieve the top-matching chunks and feed them into OpenAI with the user question.
- Return the reply to the user.

AI & Database Setup
- OpenAI API key for embedding and chat.
- A Supabase project with:
  - the vector extension enabled
  - tables for document chunks and embeddings
  - a similarity search function such as match_documents

**How to Embed the Chat Widget on Your Website**
You can add the chatbot interface to your website with a simple JavaScript snippet:
1. Open the "When chat message received" node.
2. Copy the Chat URL.
3. Make sure the "Make Chat Publicly Available" toggle is enabled.
4. Make sure the mode is "Embedded Chat".
5. Follow the instructions given for the package linked here.

**How it Works**
1. Submit URL → Form Trigger
2. Fetch Website Content → HTTP Request
3. Clean & Chunk Content → Function Node (see the chunking sketch after this section)
4. Make Embeddings (OpenAI/Cohere)
5. Store in Supabase → embeddings + metadata
6. User Chat → Chat Trigger
7. Search for Similar Content → Supabase similarity match
8. Generate Answer → OpenAI completion with context
9. Send Reply → chat interface returns the answer

**Why Supabase?**
Supabase offers a scalable Postgres-based vector database with extensions like pgvector, making it easy to:
- Store vector data alongside metadata
- Run ANN (Approximate Nearest Neighbor) similarity searches
- Integrate seamlessly with n8n and your chatbot UI

**Who can use this?**
- Documentation websites
- Support portals
- Product/landing pages
- Internal knowledge bases

Perfect for anyone who wants a smart, website-specific chatbot without building an entire AI stack from scratch.

**Ready to Deploy?**
Plug in your OpenAI API key, Supabase project credentials, and chat UI or webhook endpoint, and launch your AI-powered, website-specific RAG chatbot in minutes!
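The "Clean & Chunk Content" Function node is the piece most worth sketching. Below is a minimal TypeScript version of a fixed-size chunker with overlap; the 1,000-character size and 200-character overlap are assumptions, not values taken from the template.

```typescript
// Sketch of the chunking step: split cleaned page text into overlapping chunks
// before embedding. Chunk size and overlap are illustrative defaults.

function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const clean = text.replace(/\s+/g, " ").trim();
  const chunks: string[] = [];
  for (let start = 0; start < clean.length; start += chunkSize - overlap) {
    chunks.push(clean.slice(start, start + chunkSize));
    if (start + chunkSize >= clean.length) break; // last chunk reached the end
  }
  return chunks;
}

// Each chunk is then embedded and stored in Supabase together with metadata
// (source URL, chunk index), which is what the similarity search returns later.
console.log(chunkText("Example page content. ".repeat(200)).length);
```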
by Tharwat Mohamed
**AI Resume Screener (n8n Workflow Template)**
An AI-powered resume screening system that automatically evaluates applicants from a simple web form and gives you clear, job-specific scoring, with no manual filtering needed.

**What the workflow does**
- Accepts CV uploads via a web form (PDF)
- Extracts key info using AI (education, skills, job history, city, birthdate, phone)
- Dynamically matches the candidate to job role criteria stored in Google Sheets
- Generates an HR-style evaluation and a numeric score (1-10)
- Saves the result in a Google Sheet and uploads the original CV to Google Drive (see the sketch at the end of this section)

**Why you'll love it**

| Feature | Benefit |
|---------|---------|
| AI scoring | Instantly ranks candidate fit without reading every CV |
| Google Sheet-driven | Easily update job profiles with no code changes |
| Fast setup | Connect your accounts and you're live in ~15 mins |
| Scalable | Works for any department, team, or organization |
| Developer-friendly | Extend with Slack alerts, translations, or automations |

**Requirements**
- OpenAI or Google Gemini API key
- Google Sheet with 2 columns: Role, Profile Wanted
- Google Drive account
- n8n account (self-hosted or cloud)

**Setup in 5 Steps**
1. Import the workflow into n8n
2. Connect Google Sheets, Drive, and OpenAI or Gemini
3. Add your job roles and descriptions in Google Sheets
4. Publish the form and test with a sample CV
5. Watch candidate profiles and scores populate automatically

**Want help setting it up?**
Includes free setup guidance by the creator, available by email or WhatsApp after purchase. I'm happy to assist you in customizing or deploying this workflow for your team.
Email: tharwat.elsayed2000@gmail.com
WhatsApp: +20106 180 3236
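For orientation, here is a hypothetical TypeScript shape for the record the workflow could write to the results sheet after the AI evaluation. The field and column names are illustrative assumptions, not the template's actual schema; adapt them to the columns in your own sheet.

```typescript
// Sketch of the evaluation record saved per candidate. All names are assumptions.

interface CandidateEvaluation {
  name: string;
  city: string;
  birthdate: string;
  phone: string;
  education: string;
  skills: string[];
  roleApplied: string;   // matched against the "Role" column in Google Sheets
  fitSummary: string;    // HR-style evaluation text from the model
  score: number;         // 1 to 10 numeric fit score
  cvDriveLink: string;   // link to the uploaded PDF in Google Drive
}

function toSheetRow(c: CandidateEvaluation): (string | number)[] {
  return [c.name, c.roleApplied, c.score, c.fitSummary, c.city, c.phone, c.cvDriveLink];
}
```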
by Oneclick AI Squad
An intelligent food menu update notification system that automatically detects changes in your restaurant's special menu and sends personalized notifications to customers via multiple channels: WhatsApp, Email, and SMS. This workflow ensures your customers are always informed about new dishes, price changes, and menu availability in real time.

**What's the Goal?**
- Automatically monitor special menu updates from Google Sheets
- Detect menu changes and generate alert messages using AI
- Send multi-channel notifications (WhatsApp, Email, SMS) based on customer preferences
- Maintain comprehensive notification logs for tracking and analytics
- Provide seamless customer communication for menu updates
- Enable restaurant owners to keep customers engaged with the latest offerings

By the end, you'll have a fully automated menu notification system that keeps your customers informed and engaged with your latest culinary offerings.

**Why Does It Matter?**
Manual menu update communication is time-consuming and often missed by customers. Here's why this workflow is essential for restaurants:
- **Real-Time Updates**: Customers receive instant notifications about menu changes
- **Multi-Channel Reach**: WhatsApp, Email, and SMS ensure maximum customer reach
- **Personalized Experience**: Customers receive notifications via their preferred channels
- **Increased Sales**: Immediate awareness of new items drives orders
- **Customer Retention**: Regular updates keep customers engaged and coming back
- **Operational Efficiency**: Eliminates manual notification tasks for staff
- **Data-Driven Insights**: Comprehensive logging for marketing analytics

Think of it as your restaurant's digital menu announcer that never misses an update.

**How It Works**
Here's the complete workflow process:

Step 1: Menu Monitoring
- **Node**: Daily Menu Update Scheduler
- **Function**: Triggers the workflow on a scheduled basis
- **Frequency**: Configurable (hourly, daily, or real-time)

Step 2: Data Retrieval
- **Node**: Fetch Special Menu Data
- **Function**: Pulls current menu data from Google Sheets (Sheet 1)
- **Data**: Retrieves item details, prices, descriptions, and availability

Step 3: Change Detection
- **Node**: Detect Menu Changes
- **Function**: Compares current data with the previous state
- **Logic**: Identifies new items, price changes, or availability updates

Step 4: AI Content Generation
- **Node**: Generate Menu Alert Message
- **Function**: Creates engaging notification content using AI
- **Output**: Formatted message with new items, descriptions, and prices

Step 5: Customer Data Processing
- **Node**: Fetch Customer Contact List
- **Function**: Retrieves customer preferences from Google Sheets (Sheet 2)
- **Filter**: Segments customers by notification preferences

Step 6: Multi-Channel Delivery
The workflow splits into three parallel notification channels (a sketch of the preference-based split follows this section):

WhatsApp Branch
- **Filter WhatsApp Users**: Identifies customers with WhatsApp notifications enabled
- **Send WhatsApp Notification**: Delivers menu updates via WhatsApp
- **Log WhatsApp Status**: Records delivery status in Sheet 3

Email Branch
- **Filter Email Users**: Identifies customers with email notifications enabled
- **Send Menu Email**: Delivers formatted email notifications
- **Log Email Status**: Records delivery status in Sheet 3

SMS Branch
- **Filter SMS Users**: Identifies customers with SMS notifications enabled
- **Send Twilio SMS Alert**: Delivers text message notifications via Twilio
- **Log SMS Status**: Records delivery status in Sheet 3
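A TypeScript sketch of the preference-based split behind the three filter branches, using the column names from the Sheet 2 structure described below. In the workflow this is done with IF/Filter nodes rather than code; the sketch only shows the logic those nodes apply.

```typescript
// Sketch of the "Filter ... Users" branches: segment customers by preference columns.

interface Customer {
  "Customer Name": string;
  Email: string;
  "Phone Number": string;
  "WhatsApp Number": string;
  "Email Notifications": "Yes" | "No";
  "SMS Notifications": "Yes" | "No";
  "WhatsApp Notifications": "Yes" | "No";
}

function splitByChannel(customers: Customer[]) {
  return {
    whatsapp: customers.filter(c => c["WhatsApp Notifications"] === "Yes"),
    email: customers.filter(c => c["Email Notifications"] === "Yes"),
    sms: customers.filter(c => c["SMS Notifications"] === "Yes"),
  };
}
```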
Step 7: Comprehensive Logging
All notification activities are logged in Sheet 3 for tracking and analytics.

**Google Sheets Structure**

Sheet 1: Special Menu

| Column | Description | Example |
|--------|-------------|---------|
| Item ID | Unique identifier for the menu item | "ITEM001" |
| Item Name | Name of the dish | "Truffle Risotto" |
| Price | Item price | "$28.99" |
| Description | Detailed item description | "Creamy arborio rice with black truffle, parmesan, and wild mushrooms" |
| Nutritions | Nutritional information | "Calories: 450, Protein: 15g" |
| Category | Menu category | "Main Course" |
| Available | Availability status | "Yes" / "No" |

Sheet 2: Customer Database

| Column | Description | Example |
|--------|-------------|---------|
| Customer Name | Customer's full name | "ABC" |
| Email | Customer's email address | "abc@gmail.com" |
| Phone Number | Customer's phone number | "91999999999" |
| WhatsApp Number | Customer's WhatsApp number | "91999999999" |
| Email Notifications | Email preference | "Yes" / "No" |
| SMS Notifications | SMS preference | "Yes" / "No" |
| WhatsApp Notifications | WhatsApp preference | "Yes" / "No" |

Sheet 3: Notification Logs

| Column | Description | Example |
|--------|-------------|---------|
| Timestamp | Notification send time | "2025-07-09T12:51:09.587Z" |
| Customer Name | Recipient name | "ABC" |
| Notification Type | Channel used | "Email" / "SMS" / "WhatsApp" |
| Status | Delivery status | "Sent" / "Failed" / "Pending" |
| Message | Content sent | "SPECIAL MENU UPDATE..." |

**How to Use the Workflow**

Prerequisites
- **Google Sheets Setup**: Create three sheets with the required structure
- **n8n Account**: Access to the n8n workflow platform
- **WhatsApp Business API**: WhatsApp Business account with API access
- **Email Service**: Gmail or SMTP service for email notifications
- **Twilio Account**: Twilio account for SMS functionality
- **AI Model Access**: OpenAI or a similar AI service for content generation

Importing the Workflow in n8n

Step 1: Obtain the Workflow JSON
- Export the workflow from your n8n instance or obtain the JSON file
- Ensure you have the complete workflow configuration

Step 2: Access the n8n Workflow Editor
- Log in to your n8n instance (Cloud or self-hosted)
- Navigate to the Workflows section
- Click "Add Workflow" to create a new workflow

Step 3: Import the Workflow
- Option A: Import from Clipboard
  - Click the three dots (...) in the top-right corner
  - Select "Import from Clipboard"
  - Paste the JSON code into the text box
  - Click "Import" to load the workflow
- Option B: Import from File
  - Click the three dots (...) in the top-right corner
  - Select "Import from File"
  - Choose the .json file from your computer
  - Click "Open" to import the workflow

Configuration Setup

Google Sheets Integration
- **Authentication**: Connect your Google account in n8n
- **Sheet 1 Configuration**: Set the spreadsheet ID and range for menu data
- **Sheet 2 Configuration**: Set the spreadsheet ID and range for customer data
- **Sheet 3 Configuration**: Set the spreadsheet ID and range for notification logs

WhatsApp Integration
- **WhatsApp Business API**: Set up WhatsApp Business API credentials
- **Webhook Configuration**: Configure webhook URLs for message delivery
- **Message Templates**: Create approved message templates for menu updates

Email Integration
- **Gmail/SMTP Setup**: Configure email service credentials
- **Email Templates**: Design HTML email templates for menu notifications
- **Sender Configuration**: Set the sender name and email address

Twilio SMS Integration
- **Twilio Account**: Set up your Twilio Account SID and Auth Token
- **Phone Number**: Configure the Twilio phone number for SMS sending
- **Message Templates**: Create SMS message templates

AI Content Generation
- **API Configuration**: Set up OpenAI or your preferred AI service credentials
- **Prompt Customization**: Configure prompts for menu update content
- **Content Parameters**: Set message tone, length, and style

Workflow Execution

Automatic Execution
- **Scheduled Triggers**: Set up cron expressions for regular checks
- **Webhook Triggers**: Configure real-time triggers for immediate updates
- **Manual Triggers**: Enable manual execution for testing

Monitoring and Maintenance
- **Execution Logs**: Monitor workflow execution through the n8n interface
- **Error Handling**: Set up error notifications and retry mechanisms
- **Performance Monitoring**: Track execution times and success rates

**Sample Notification Message**

SPECIAL MENU UPDATE

NEW ITEMS:
- Truffle Risotto - $28.99
  Creamy arborio rice with black truffle, parmesan, and wild mushrooms
- Chocolate Lava Cake - $18.99
  Warm chocolate cake with molten center, vanilla ice cream

Total Menu Items: 2
Updated: 7/9/2025, 12:10:50 PM

Visit our restaurant or call to place your order!

**Best Practices**

Data Management
- Regularly validate customer contact information
- Keep menu data updated and accurate
- Maintain clean customer preference settings

Notification Strategy
- Send notifications during optimal hours (lunch/dinner time)
- Limit frequency to avoid customer fatigue
- Personalize messages based on customer preferences

Content Quality
- Use engaging language and emojis appropriately
- Include clear pricing and descriptions
- Add a call-to-action for immediate orders

Performance Optimization
- Batch process notifications to avoid rate limits
- Implement retry logic for failed deliveries
- Monitor API quotas and usage limits

**Troubleshooting**

Common Issues
- **Authentication Errors**: Verify API credentials and permissions
- **Rate Limiting**: Implement delays between notifications
- **Message Delivery**: Check phone number formats and email addresses
- **Sheet Access**: Ensure proper sharing permissions

Error Handling
- Set up notification alerts for workflow failures
- Implement fallback mechanisms for service outages
- Maintain backup notification methods

**Analytics and Reporting**

Key Metrics
- **Delivery Rates**: Track successful notifications by channel
- **Customer Engagement**: Monitor response rates and feedback
- **Menu Performance**: Analyze which items generate the most interest
- **Channel Effectiveness**: Compare performance across WhatsApp, Email, and SMS

Reporting Features
- Automated daily/weekly reports
- Customer preference analytics
- Notification performance dashboards
- Revenue correlation with menu updates

**Security and Compliance**

Data Protection
- Secure storage of customer contact information
- Compliance with GDPR and local privacy laws
- Regular security audits of API access

Rate Limiting
- Respect platform rate limits (WhatsApp, Twilio, Email)
- Implement queuing systems for high-volume notifications
- Monitor and adjust sending frequencies

**Conclusion**
The Food Menu Update Notifier transforms restaurant communication from reactive to proactive, ensuring customers are always informed about your latest offerings. By leveraging multiple communication channels and AI-generated content, this workflow creates a seamless bridge between your kitchen innovations and customer awareness. This system not only improves customer engagement but also drives immediate sales through timely notifications about new menu items, special offers, and seasonal dishes. The comprehensive logging and analytics capabilities provide valuable insights for menu optimization and marketing strategy refinement.
by Intuz
This n8n template delivers a complete AI-powered solution for automated LinkedIn posts, including unique content, custom images, and optimized hashtags.

Use cases are many: generate and schedule tailored LinkedIn content for different purposes. By feeding the AI specific prompts, you can create posts on specific topics with matching visuals to maintain a consistent online presence.

**How it works**
Maintaining a consistent and engaging presence on LinkedIn can be time-consuming, requiring constant ideation, content creation, and manual posting. This workflow takes that burden off your shoulders, delivering a fully automated solution for generating and publishing high-quality LinkedIn content.
- **Scheduled Content Engine**: Each day (or on your chosen schedule), the workflow kicks into gear, ensuring a fresh stream of content.
- **Smart Topic & Content Generation**: Using Google Gemini, it crafts unique content topics and expands them into full, engaging posts, ensuring your message is always fresh and relevant.
- **Dynamic Image Creation**: To make your posts stand out, the workflow leverages an AI image generator (such as DALL-E) to produce a custom, eye-catching visual that complements your generated text.
- **SEO-Optimized Hashtag Generation**: Google Gemini then analyzes your newly created post and automatically generates a set of relevant, trending, and SEO-friendly hashtags, boosting your content's reach and discoverability.
- **Seamless LinkedIn Publishing**: Finally, the compelling text, unique image, and hashtags are merged and automatically published to your LinkedIn profile, establishing you as a thought leader with minimal effort (see the merge sketch after this section).

**How to Use: Quick Start Guide**
This guide will get your AI LinkedIn Content Automation workflow up and running in n8n.
1. **Import Workflow Template**: Download the template's JSON file and import it into your n8n instance via "File" > "Import from JSON".
2. **Configure Credentials**:
   - Google Gemini: Set up and apply your API key credentials to all "Google Gemini Chat Model" nodes.
   - AI Image Generation (e.g., OpenAI): Create and apply API key credentials for your chosen image generation service to the "Generate an Image" node.
   - LinkedIn: Set up and apply OAuth credentials to the "Create a post" node for your LinkedIn account.
3. **Customize Schedule & AI Prompts**:
   - Schedule Trigger: Double-click "Schedule Trigger 1" to set how often your workflow runs (e.g., daily, weekly).
   - AI Prompts: Review and edit the prompts within the "Content Topic Generator", "Content Creator", and "Hashtag Generator / SEO" nodes to guide the AI toward your desired content style and topics.
4. **Test & Activate**:
   - Test Run: Click "Execute Workflow" to perform a test run and verify all steps are working as expected.
   - Activate: Once satisfied, toggle the workflow "Active" switch to enable automated posting on your defined schedule.

**Requirements**
To use this workflow template, you will need:
- n8n Instance: A running n8n instance (cloud or self-hosted) to import and execute the workflow.
- Google Gemini Account: For content topic generation, content creation, and hashtag generation (requires a Google Gemini API key from Google AI Studio).
- AI Image Generation Service Account: For creating images (e.g., an OpenAI DALL-E API key or a similar service that the "Generate an Image" node uses).
- LinkedIn Account: For publishing the generated posts (requires LinkedIn OAuth credentials for the n8n connection).
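The final merge before the "Create a post" node is essentially string composition. A TypeScript sketch, assuming the post body and hashtags arrive as separate fields (the input shape is an assumption); LinkedIn caps posts at roughly 3,000 characters, so the sketch trims defensively.

```typescript
// Sketch of composing the final post text from the generated body and hashtags.

function composeLinkedInPost(body: string, hashtags: string[], maxLength = 3000): string {
  const tagLine = hashtags
    .map(h => (h.startsWith("#") ? h : `#${h}`))
    .join(" ");
  const post = `${body.trim()}\n\n${tagLine}`;
  return post.length > maxLength ? post.slice(0, maxLength - 1) + "…" : post;
}

// Example
console.log(composeLinkedInPost("Why AI agents are changing ops work...", ["AI", "Automation", "n8n"]));
```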
**Connect with us**
- Website: https://www.intuz.com/cloud/stack/n8n
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
by Bright Data
**Yelp Business Finder: Scraping Local Businesses by Keyword, Category & Location Using Bright Data and Google Sheets**

Description: Automate local business data collection from Yelp using AI-powered input validation, Bright Data scraping, and automatic Google Sheets integration. Perfect for market research, lead generation, and competitive analysis.

**How It Works**
1. **Form Submission**: Users submit a simple form with country, location, and business category parameters.
2. **AI Validation**: Google Gemini AI validates and cleans the input data, ensuring proper formatting and Yelp category alignment.
3. **Data Scraping**: Bright Data's Yelp dataset API scrapes business information based on the cleaned parameters.
4. **Status Monitoring**: The workflow monitors scraping progress and waits for data completion (see the polling sketch after this section).
5. **Data Export**: Final business data is automatically appended to your Google Sheet for easy analysis.

**Setup Steps**
Estimated setup time: 10-15 minutes

Prerequisites
- Active n8n instance (cloud or self-hosted)
- Google account with Sheets access
- Bright Data account with the Yelp scraping dataset
- Google Gemini API access

Configuration Steps
1. **Import Workflow**:
   - Copy the provided JSON workflow
   - In n8n: go to Workflows → + Add workflow → Import from JSON
   - Paste the JSON and click Import
2. **Configure Google Sheets**:
   - Create a new Google Sheet or use an existing one
   - Set up OAuth2 credentials in n8n
   - Update the Google Sheets node with your document ID
   - Configure column mappings for business data
3. **Set up Bright Data**:
   - Add your Bright Data API credentials to n8n
   - Replace BRIGHT_DATA_API_KEY with your actual API key
   - Verify your Yelp dataset ID in the HTTP Request nodes
   - Test the connection
4. **Configure Google Gemini**:
   - Add your Google Gemini API credentials
   - Test the AI Agent connection
   - Verify the model configuration
5. **Test & Activate**:
   - Activate the workflow using the toggle switch
   - Test with sample data: country="US", location="New York", category="restaurants"
   - Verify data appears correctly in your Google Sheet

**Data Output**
- Business Name - Official business name from Yelp
- Overall Rating - Average customer rating (1-5 stars)
- Reviews Count - Total number of customer reviews
- Categories - Business categories and tags
- Website URL - Official business website
- Phone Number - Contact phone number
- Address - Full business address
- Yelp URL - Direct link to the Yelp listing

**Use Cases**
- **Market Research**: Analyze local business landscapes and competition
- **Lead Generation**: Build prospect lists for B2B outreach
- **Location Analysis**: Research business density by area and category
- **Competitive Intelligence**: Monitor competitor ratings and customer feedback

**Important Notes**
- Ensure you comply with Yelp's terms of service and rate limits
- Bright Data usage may incur costs based on your plan
- AI validation helps improve data quality and reduce errors
- Monitor your Google Sheet for data accuracy

**Troubleshooting**
Common issues:
- **API Rate Limits**: Implement delays between requests if needed
- **Invalid Categories**: The AI agent helps standardize category names
- **Empty Results**: Verify location spelling and category alignment
- **Authentication Errors**: Check all API credentials and permissions

Ready to start scraping Yelp business data efficiently!
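The status-monitoring step is a poll-and-wait loop between the scrape request and the Sheets export. Below is a hedged TypeScript sketch of that loop; checkSnapshotStatus stands in for the Bright Data "check snapshot status" HTTP Request node, and its endpoint and response shape are assumptions. Consult Bright Data's dataset API documentation for the real call.

```typescript
// Sketch of the wait-until-ready loop. The status values and the callback
// are placeholders for the actual Bright Data HTTP Request node.

type SnapshotStatus = "running" | "ready" | "failed";

async function waitForSnapshot(
  checkSnapshotStatus: () => Promise<SnapshotStatus>,
  intervalMs = 30_000,
  maxAttempts = 20,
): Promise<void> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const status = await checkSnapshotStatus();
    if (status === "ready") return;                       // data can be exported
    if (status === "failed") throw new Error("Scrape failed");
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error("Timed out waiting for the dataset snapshot");
}
```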
by berke
**Who's it for**
This workflow is perfect for sales teams, customer service departments, and businesses that frequently handle spare parts inquiries via email. It's especially valuable for companies managing multiple products with complex pricing structures who want to automate their quotation process while maintaining professional, multilingual communication.

**What it does**
This workflow:
- **Monitors your Gmail inbox** for incoming spare parts requests
- **Automatically generates professional HTML price quotes** in the sender's language
- **Sends personalized replies**
- Uses AI to detect the email language (supports Turkish, English, German, and more)
- **Extracts project or part codes**
- **Fetches pricing data from Google Sheets**
- **Calculates totals accurately**
- **Formats everything** into a clean, professional quote that matches your brand

**How it works**
1. A Schedule Trigger runs at a set interval to check for new emails
2. The Gmail node fetches the latest unread email
3. Keyword detection filters for spare parts-related terms in multiple languages
4. The AI Agent processes the request by:
   - Detecting the email's language
   - Extracting project/part codes
   - Querying three Google Sheets: CRM, Bill of Materials, Pricing
   - Calculating line totals and the grand total (see the sketch after this section)
   - Generating a professional HTML quote in the sender's language
5. A Gmail reply sends the quote and marks the original email as read

**Requirements**
- n8n self-hosted or cloud instance
- Gmail account with OAuth2 authentication
- Google Sheets with the proper structure (3 sheets for CRM, BoM, and Pricing data)
- Google Gemini API key for AI processing
- Basic understanding of Google Cloud Console for OAuth setup

**How to set up**
1. Import the workflow into your n8n instance
2. Create three Google Sheets with the following column structure:
   - CRM Sheet: Email, ProjectCode, CustomerName
   - Bill of Materials: ProjectCode, PartCode, PartDescription, Quantity
   - Pricing Sheet: PartCode, UnitPriceEUR, PartDescription
3. Configure credentials:
   - Set up Gmail OAuth2 in Google Cloud Console
   - Configure Google Sheets OAuth2 (can use the same project)
   - Get your Google Gemini API key from Google AI Studio
4. Update the workflow:
   - Replace the placeholder Sheet IDs in the CRM, BoM, and Pricing nodes
   - Adjust the company name in the AI Agent's system message
   - Modify keyword detection if needed
5. Test with a sample email before activating

**How to customize the workflow**
- **Add more languages**: Update the keyword detection node with additional terms
- **Modify the quote template**: Edit the HTML in the AI Agent's message to match your branding
- **Change data sources**: Replace Google Sheets with PostgreSQL or MySQL nodes
- **Add approval steps**: Insert a manual approval node for quotes above a certain value
- **Include attachments**: Add PDF or product spec file nodes
- **Enhance notifications**: Add Slack or Teams notifications after a quote is sent
- **Implement follow-ups**: Create a separate workflow for reminder emails

This template provides a solid foundation for automating your quotation process while staying flexible to fit your specific business needs. Feel free to contact me for further implementation guidelines: LinkedIn: Berke
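The "calculating line totals and grand total" step amounts to joining the Bill of Materials with the Pricing sheet. A minimal TypeScript sketch using the column names from the setup section; in the template this arithmetic is delegated to the AI Agent, so treat this as a reference for checking its output rather than the actual implementation.

```typescript
// Sketch of the quote arithmetic: join BoM rows with the Pricing sheet and total the lines.

interface BomRow { ProjectCode: string; PartCode: string; PartDescription: string; Quantity: number; }
interface PriceRow { PartCode: string; UnitPriceEUR: number; PartDescription: string; }

function buildQuote(bom: BomRow[], prices: PriceRow[]) {
  const priceByPart = new Map(prices.map(p => [p.PartCode, p.UnitPriceEUR]));
  const lines = bom.map(row => {
    const unit = priceByPart.get(row.PartCode) ?? 0; // 0 if the part has no listed price
    return { ...row, UnitPriceEUR: unit, LineTotalEUR: unit * row.Quantity };
  });
  const grandTotal = lines.reduce((sum, l) => sum + l.LineTotalEUR, 0);
  return { lines, grandTotal };
}
```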
by Cyril Nicko Gaspar
**AI Agent Template with Bright Data MCP Tool Integration**
This template obtains all the available tools from Bright Data MCP, processes user requests through a chatbot, and then runs whichever tool matches the user's query.

**Problem It Solves**
MCP addresses the complexity of traditional automation, where users need specific knowledge of APIs or interfaces to trigger backend processes. By allowing interaction through natural language, automatically classifying and routing queries, and managing context and memory effectively, MCP simplifies complex data operations, customer support, and workflow orchestration scenarios where inputs and responses change dynamically.

**Pre-requisites**
Before deploying this template, ensure you have:
- An active n8n instance (self-hosted or cloud)
- A valid OpenAI API key (or another AI model)
- Access to the Bright Data MCP API with credentials
- Basic familiarity with n8n workflows and nodes

**Setup Instructions**
1. **Install the MCP community node in n8n**:
   - In your n8n self-hosted instance, go to Settings → Community Nodes.
   - Search for and install n8n-nodes-mcp.
2. **Configure credentials**:
   - Add your OpenAI API key (or another AI model) to the relevant nodes. If you want a different AI model, replace all associated OpenAI nodes in the workflow.
   - Set up Bright Data MCP client credentials in the installed community node (STDIO).
   - Obtain your API key from Bright Data and put it in the Environment field of the credentials window, written as API_Key=<your api key from Bright Data>.

**Workflow Functionality (Summary)**
- A **user message** triggers the workflow.
- An **AI classifier** (OpenAI) interprets the intent and maps it to a tool from Bright Data MCP.
- If no match is found, the user is notified.
- If more information is needed, the AI requests it.
- **Memory** preserves context for follow-up actions.
- The tool is executed, and results are returned contextually to the user.

> Optional memory buffer and chat memory manager nodes keep conversations context-aware across multiple messages.

**Use Cases**
- **Data Scraping Automation**: Trigger scraping tasks via chat.
- **Lead Generation Bots**: Use MCP tools to fetch, enrich, or validate data.
- **Customer Support Agents**: Automatically classify and respond to queries with tool-backed answers.
- **Internal Workflow Agents**: Let team members trigger backend jobs (e.g., reports, lookups) by chatting naturally.

**Customization**
- **Tool Matching Logic**: Modify the AI classifier prompt and schema to suit different APIs or services.
- **Memory Size and Retention**: Adjust memory buffer size and filtering to fit your app's complexity.
- **Tool Execution**: Extend the "Execute the tool" sub-workflow to handle additional actions, fallback strategies, or logging.
- **Frontend Integration**: Connect this with various platforms (e.g., WhatsApp, Slack, web chatbots) using the webhook.

**Summary**
This template delivers a powerful no-code/low-code agent that turns chat into automation, combining AI intelligence with real-world tool execution. With minimal setup, you can build contextual, dynamic assistants that drive backend operations using natural language.
by Ranjan Dailata
**Who this is for?**
The LinkedIn Company Story Generator is an automated workflow that extracts company profile data from LinkedIn using Bright Data's web scraping infrastructure, then transforms that data into a professionally written narrative or story using a language model (e.g., OpenAI, Gemini). The final output is sent via webhook notification, making it easy to publish, review, or further automate.

This workflow is tailored for:
- **Marketing Professionals**: Seeking to generate compelling company narratives for campaigns.
- **Sales Teams**: Aiming to understand potential clients through summarized company insights.
- **Content Creators**: Looking to craft stories or articles based on company data.
- **Recruiters**: Interested in obtaining concise overviews of companies for talent acquisition strategies.

**What problem is this workflow solving?**
Manually gathering and summarizing company information from LinkedIn can be time-consuming and inconsistent. This workflow automates the process, ensuring:
- **Efficiency**: Quick extraction and summarization of company data.
- **Consistency**: Standardized summaries for uniformity across use cases.
- **Scalability**: Ability to process multiple companies without additional manual effort.

**What this workflow does**
The workflow performs the following steps:
- **Input Acquisition**: Receives a company's name or LinkedIn URL as input.
- **Data Extraction**: Utilizes Bright Data to scrape the company's LinkedIn profile.
- **Information Parsing**: Processes the extracted HTML content to retrieve relevant company details.
- **Summarization**: Employs Google Gemini to generate a concise company story.
- **Output Delivery**: Sends the summarized content to a specified webhook or email address.

**Setup**
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by the Web Unlocker token (see the request sketch after this section).
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the LinkedIn URL by navigating to the Set LinkedIn URL node.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

**How to customize this workflow to your needs**
- **Input Variations**: Modify the Set LinkedIn URL node to accept a different company LinkedIn URL.
- **Data Points**: Adjust the HTML Data Extractor node to retrieve additional details like employee count, industry, or headquarters location.
- **Summarization Style**: Customize the AI prompt to generate summaries in different tones or formats (e.g., formal, casual, bullet points).
- **Output Destinations**: Configure the output node to send summaries to various platforms, such as Slack, CRM systems, or databases.
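Step 3's Header Auth credential boils down to sending an Authorization: Bearer header with every Web Unlocker request. A hedged TypeScript sketch of that pattern follows; the endpoint URL and request body are placeholders, and only the header format reflects the setup described above. Use the request URL and parameters from your own Bright Data Web Unlocker zone.

```typescript
// Sketch of the Header Auth pattern behind the Web Unlocker HTTP Request node.
// Endpoint and body are placeholders; only the Authorization header is the point.

async function fetchWithWebUnlocker(targetUrl: string, token: string): Promise<string> {
  const response = await fetch("https://api.brightdata.com/REPLACE_WITH_UNLOCKER_ENDPOINT", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,       // the value configured in n8n's Header Auth credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ url: targetUrl }), // placeholder body; follow your zone's docs
  });
  if (!response.ok) throw new Error(`Unlocker request failed: ${response.status}`);
  return response.text();
}
```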