by Ranjan Dailata
**Notice:** Community nodes can only be installed on self-hosted instances of n8n.

**Who this is for**
This n8n-powered automation uses Bright Data's MCP Client to extract real-time data from a price-drop site that lists Amazon products, including price changes and related product details. The extracted data is enriched with structured data transformation, content summarization, and sentiment analysis using the Google Gemini LLM. The Amazon Price Drop Intelligence Engine is designed for:
- **Ecommerce Analysts** who need timely updates on competitor pricing trends
- **Brand Managers** seeking to understand consumer sentiment around pricing
- **Data Scientists** building pricing models or enrichment pipelines
- **Affiliate Marketers** looking to optimize campaigns based on dynamic pricing
- **AI Developers** automating product intelligence pipelines

**What problem is this workflow solving?**
This workflow solves several key pain points:
- **Reliable Scraping:** Uses Bright Data MCP, a managed crawling platform that handles proxies, captchas, and site structure changes automatically.
- **Insight Generation:** Transforms unstructured HTML into structured data and then into human-readable summaries using the Google Gemini LLM.
- **Sentiment Context:** Goes beyond raw pricing data to reveal how customers feel about the price change, helping businesses and researchers measure consumer reaction.
- **Automated Reporting:** Aggregates and stores data for easy access and downstream automation (e.g., dashboards, notifications, pricing models).

**What this workflow does**
1. **Scrape the price-drop site with Bright Data MCP:** The workflow begins by scraping the targeted price-drop site for Amazon listings using Bright Data's Model Context Protocol (MCP). You can configure which price-drop listings it targets.
2. **Structured Data Extraction:** Once the HTML content is retrieved, Google Gemini is employed to parse and structure the product information (title, price, discount, brand, ratings).
3. **Summarization & Sentiment Analysis:** The extracted data is passed through an LLM chain to generate a concise summary of the product and its recent price movement, and to perform sentiment analysis on user reviews and public perception.
4. **Store the Results:** Results are saved to disk for archiving or bulk processing and updated in a Google Sheet, making them instantly shareable with your team or ready to feed a BI dashboard.

**Pre-conditions**
- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and must complete the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp.
- You need to install n8n-nodes-mcp.

**Setup**
1. Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access it through Vertex AI or a proxy).
6. In n8n, configure the MCP Client (STDIO) credentials to connect to the Bright Data MCP Server as shown below.
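For reference, the MCP Client (STDIO) credential needs the command, arguments, and environment variables used to launch the Bright Data MCP server. Here is a minimal sketch, assuming the server is started with npx (check the @brightdata/mcp README for the current launch command, and omit WEB_UNLOCKER_ZONE if you kept the default mcp_unlocker zone name):

```json
{
  "command": "npx",
  "args": "-y @brightdata/mcp",
  "environments": "API_TOKEN=<your-bright-data-token>,WEB_UNLOCKER_ZONE=mcp_unlocker"
}
```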
Make sure to copy the Bright Data API_TOKEN into the Environments textbox above as API_TOKEN=<your-token>.

**How to customize this workflow to your needs**
- **Target different platforms:** Switch Amazon for Walmart, eBay, or any ecommerce source using Bright Data's flexible scraping infrastructure.
- **Enrich with more LLM tasks:** Add brand tone analysis, category classification, or competitive benchmarking using Gemini prompts.
- **Visualize output:** Pipe the Google Sheet into Looker Studio, Tableau, or Power BI.
- **Notification integrations:** Add Slack, Discord, or email notifications for price drop alerts.
by Nick Saraev
**AI Ad Scraper & Image Generator with Facebook Ad Library**

Categories: PPC Automation, Creative Generation, Competitive Intelligence

This workflow creates an end-to-end ad library scraper and AI image spinner system that automatically discovers competitor ads, analyzes their design elements, and generates multiple unique variations ready for your own campaigns. Built to eliminate 60-70% of manual creative work for PPC agencies, this system transforms competitor research into actionable ad variants in minutes.

**Benefits**
- **Automated Competitor Research** - Scrapes the Facebook Ad Library for active competitor campaigns automatically
- **AI-Powered Creative Analysis** - Uses OpenAI vision to comprehensively analyze ad design elements and copy
- **Intelligent Image Generation** - Creates 3+ unique variations per source ad while maintaining effective layouts
- **Complete Asset Organization** - Automatically organizes source ads and generated variations in structured Google Drive folders
- **Campaign-Ready Output** - Generates a Google Sheets database with direct links to all assets for immediate campaign deployment
- **Massive Time Savings** - Replaces hours of manual creative work with automated competitive intelligence and generation

**How It Works**
1. **Facebook Ad Library Scraping:** Connects to Facebook's Ad Library through an Apify scraper integration. Searches active ads based on keywords, industries, or competitor targeting. Filters for image-based ads and removes video-only content before processing.
2. **Intelligent Asset Organization:** Creates a unique Google Drive folder structure for each scraped ad campaign. Separates source competitor ads from AI-generated variations. Maintains an organized asset library for easy campaign management and iteration.
3. **AI-Powered Creative Analysis:** Uses OpenAI's vision model to comprehensively describe each competitor ad. Identifies design elements, color schemes, layout patterns, and messaging approaches. Generates detailed creative briefs for intelligent variation generation.
4. **Smart Image Variation System:** Creates 3 unique style variations per source ad using advanced AI prompting. Maintains effective layout structures while changing colors, fonts, and styling. Customizes messaging and branding to match your business requirements.
5. **Campaign Database Integration:** Logs all source ads and generated variations in organized Google Sheets. Provides direct links to all assets for immediate campaign deployment. Tracks performance data and creative iterations for ongoing optimization.

**Required Setup Configuration**

Google Drive Structure: the workflow automatically creates this folder organization:

PPC Thievery (Parent Folder)
├── [Ad Archive ID] (Per Campaign)
│   ├── 1. Source Assets (Original competitor ads)
│   └── 2. Spun Assets (AI-generated variations)
Google Sheets Database Columns:
- timestamp - Unique record identifier
- ad_archive_id - Facebook's internal ad identifier
- page_id - Advertiser's Facebook page ID
- original_image_url - Direct link to the source competitor ad
- page_name - Advertiser's business name
- ad_body - Original ad copy text
- date_scraped - When the ad was discovered
- spun_prompts - AI-generated variation instructions
- asset_folder - Link to the campaign's Google Drive folder
- source_folder - Link to the original ads folder
- spun_folder - Link to the generated variations folder
- direct_spun_image_link - Direct link to the generated ad image

Set Variables Configuration: update these values in the "Set Variables" node (see the example sketch after this section):
- googleDriveFolderId - Your parent Google Drive folder ID
- changeRequest - Your brand-specific variation instructions
- spreadsheetId - Your Google Sheets database ID

Apify API Setup:
- Create an Apify account and obtain an API key
- Replace <your-apify-api-key-here> with actual credentials
- Customize the search terms in the JSON body for your target competitors
- Adjust the scraping count (default: 20 ads per run)

**Business Use Cases**
- **PPC Agencies** - Automate competitive research and creative generation for client campaigns
- **E-commerce Brands** - Monitor competitor advertising strategies and generate response campaigns
- **Marketing Teams** - Scale creative production with AI-powered competitive intelligence
- **Freelance Marketers** - Offer advanced competitive analysis and creative services to clients
- **SaaS Companies** - Track competitor messaging and generate differentiated ad variations
- **Agency Teams** - Replace manual creative research with automated competitive intelligence systems

**Revenue Potential**
This system revolutionizes PPC agency economics:
- **60-70% reduction** in manual creative work and competitive research time
- **3-5x faster** campaign launch times with ready-to-use creative assets
- **$2,000-$5,000 service value** for comprehensive competitive intelligence and creative generation
- **Scalable competitive advantage** through automated monitoring of competitor campaigns
- **Premium positioning** offering AI-powered creative intelligence that competitors can't match manually

Difficulty Level: Advanced
Estimated Build Time: 2-3 hours
Monthly Operating Cost: ~$100 (Apify + OpenAI + Google APIs)

**Watch My Complete Live Build**
Want to see me build this entire system from scratch? I walk through every component live - including the ad library integration, AI analysis setup, image generation pipeline, and all the debugging that goes into creating a production-ready competitive intelligence system.

🎥 See My Live Build Process: "Ad Library Scraper & AI Image Spinner System (N8N Build)"

This comprehensive tutorial shows the real development process - including advanced AI prompting for image generation, competitive analysis strategies, and the organizational systems that make this scalable for agency use.
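For reference, the values held by the "Set Variables" node might look like the sketch below; all three values are placeholders to replace with your own folder ID, change-request text, and spreadsheet ID:

```json
{
  "googleDriveFolderId": "1AbCdEfGhIjKlMnOpQrStUvWxYz",
  "changeRequest": "Rebrand each ad for Acme Co: navy/white palette, friendly tone, keep the original layout.",
  "spreadsheetId": "1XyZ_example_spreadsheet_id"
}
```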
**Set Up Steps**
1. **Initial Database Setup:** Run the initialization flow once to create your Google Drive folder and Sheets database. Copy the generated folder ID and spreadsheet ID into the "Set Variables" node. Configure your brand-specific change request template for consistent output.
2. **Apify Integration:** Set up an Apify account with Facebook Ad Library scraper access. Configure API credentials and test with small ad batches. Customize search parameters for your target competitors and industries.
3. **AI Service Configuration:** Connect the OpenAI API for vision analysis and image generation. Set up appropriate rate limiting to control processing costs. Test the complete AI pipeline with sample competitor ads.
4. **Google Services Setup:** Configure Google Drive API credentials for automated folder creation. Set up the Google Sheets integration for campaign database management. Test the complete asset organization and tracking workflow.
5. **Campaign Customization:** Define your brand guidelines and messaging requirements in the change request. Set up variation templates for different campaign types and industries. Configure batch processing limits based on your API usage requirements.
6. **Production Optimization:** Remove the limit node for full-scale competitive monitoring. Set up automated scheduling for regular competitive intelligence gathering. Monitor and optimize AI prompts based on generated creative quality.

**Advanced Optimizations**
Scale the system with:
- **Multi-Platform Scraping:** Extend to LinkedIn, Twitter, and Google Ads for comprehensive competitive intelligence
- **Performance Tracking:** Integrate with ad platforms to track performance of generated variations
- **Style Guide Automation:** Create industry-specific variation templates for consistent brand application
- **A/B Testing Integration:** Automatically test generated variations against source ads for performance optimization
- **CRM Integration:** Connect competitive intelligence data with sales and marketing systems

**Important Considerations**
- **API Rate Limits:** Built-in delays prevent service overload and ensure reliable operation
- **Creative Quality:** The system generates multiple variations to account for AI generation variability
- **Legal Compliance:** Use generated variations as inspiration while respecting intellectual property rights
- **Cost Management:** Monitor OpenAI image generation costs and adjust batch sizes accordingly
- **Competitive Ethics:** Focus on learning from successful patterns rather than direct copying

**Why This System Works**
The competitive advantage lies in speed and scale:
- **Minutes vs. Hours:** Generate campaign-ready creative variations in minutes instead of hours of manual work
- **Systematic Analysis:** AI vision provides consistent, comprehensive analysis that humans might miss
- **Organized Intelligence:** Structured asset management enables rapid campaign deployment and iteration
- **Scalable Monitoring:** Automated competitive research that scales beyond manual capacity
- **Quality Variations:** Multiple AI-generated options ensure high-quality creative output

**Check Out My Channel**
For more advanced automation systems and proven agency-building strategies that generate real revenue, explore my YouTube channel where I share the exact methodologies used to scale automation agencies to $72K+ monthly revenue.
by Jimleuk
This template attempts to replicate OpenAI's DeepResearch feature which, at the time of writing, is only available to their Pro subscribers.

> An agent that uses reasoning to synthesize large amounts of online information and complete multi-step research tasks for you. (Source)

Though the inner workings of DeepResearch have not been made public, it is presumed the feature relies on the ability to deep-search the web, scrape web content, and invoke reasoning models to generate reports. All of which n8n is really good at! Using this workflow, n8n users can enjoy a variation of the Deep Research experience for themselves and their teams at a fraction of the cost. Better yet, they can learn from and customise this Deep Research template for their businesses and/or organisations.

Check out the generated reports here: https://jimleuk.notion.site/19486dd60c0c80da9cb7eb1468ea9afd?v=19486dd60c0c805c8e0c000ce8c87acf

**How it works**
- A form is used to first capture the user's research query and how deep they'd like the researcher to go.
- Once submitted, a blank Notion page is created which will later hold the final report, and the researcher gets to work.
- The user's query goes through a recursive series of web searches and web scraping to collect data on the research topic and generate partial learnings.
- Once complete, all learnings are combined and given to a reasoning LLM to generate the final report.
- The report is then written to the placeholder Notion page created earlier.

**How to use**
- Duplicate this Notion database template and make sure all Notion-related nodes point to it.
- Sign up for an APIFY.com API key for web search and scraping services.
- Ensure you have access to OpenAI's o3-mini model. Alternatively, switch this out for the o1 series.
- You must publish this workflow and ensure the form URL is publicly accessible.

**On depth & breadth configuration**
For more detailed reports, increase depth and breadth, but be warned: the workflow will take exponentially longer and cost more to complete. The recommended defaults are usually good enough.
- Depth=1 & Breadth=2 will take about 5-10 mins.
- Depth=1 & Breadth=3 will take about 15-20 mins.
- Depth=3 & Breadth=5 will take about 2+ hours!

**Customising this workflow**
I deliberately chose not to use AI-powered scrapers like Firecrawl as I felt these were quite costly and quotas would be quickly exhausted. However, feel free to switch to the web search and scraping services which suit your environment. Maybe you decide not to source the web and data collection comes from internal documents instead; this template gives you the freedom to change this. Experiment with different Reasoning/Thinking models such as Deepseek and Google's Gemini 2.0. Finally, the LLM prompts could definitely be improved. Refine them to fit your use case.

**Credits**
This template is largely based on the work by David Zhang (dzhng) and his open-source implementation of Deep Research: https://github.com/dzhng/deep-research
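To make the depth and breadth settings concrete, here is a conceptual JavaScript sketch of the recursive loop described above; the three helper functions are stand-ins for the Apify search/scraping and LLM nodes in the workflow, not real APIs:

```javascript
// Conceptual sketch of the recursive depth/breadth research loop.
// The three helpers below are trivial stubs standing in for n8n nodes.
const generateSearchQueries = async (query, n) =>
  Array.from({ length: n }, (_, i) => `${query} (angle ${i + 1})`);
const searchAndScrape = async (q) => [`scraped page text for: ${q}`];
const summarizeLearnings = async (q, pages) =>
  ({ learnings: [`learning derived from "${q}"`], followUpQuery: `${q} - deeper` });

async function research(query, depth, breadth, learnings = []) {
  const queries = await generateSearchQueries(query, breadth); // LLM proposes `breadth` queries
  for (const q of queries) {
    const pages = await searchAndScrape(q);                    // web search + page scraping
    const partial = await summarizeLearnings(q, pages);        // LLM extracts partial learnings
    learnings.push(...partial.learnings);
    if (depth > 1) await research(partial.followUpQuery, depth - 1, breadth, learnings);
  }
  return learnings; // combined learnings feed the reasoning model that writes the report
}

research("history of solar energy adoption", 1, 2).then(console.log);
```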
by Luciano Gutierrez
**Healthcare Clinic Assistant with WhatsApp and Telegram Integration**

Version: 1.1.0
n8n Version: 1.88.0+
License: MIT

📋 **Description**
A comprehensive and modular automation workflow designed for healthcare clinics. It manages patient communication, appointment scheduling, confirmations, rescheduling, internal tasks, and media processing by integrating WhatsApp, Telegram, Google Calendar, and Google Tasks, combined with AI-powered agents for maximum efficiency. This system guarantees proactive communication with patients, streamlined internal clinic management, and consistent data synchronization across platforms.

🌟 **Key Features**
- 🤖 **AI-Powered Specialized Agents:** Distinct agents handle WhatsApp patient support, appointment confirmations, and internal rescheduling tasks.
- 📱 **Omnichannel Communication:** Handles patient interactions via WhatsApp and staff commands via Telegram.
- 📅 **Google Calendar Appointment Management:** Full synchronization for creating, updating, canceling, and confirming appointments.
- 📋 **Task Management with Google Tasks:** Manages shopping lists and administrative tasks efficiently through staff Telegram requests.
- 🔔 **Automated Appointment Reminders:** A daily-triggered system proactively sends WhatsApp confirmations to patients for next-day appointments.
- 🖼️ **Intelligent Media Processing:** Transcribes audio, extracts text from images, and processes documents using OpenAI and OpenRouter AI models.
- 🛡️ **Escalation to Human Support:** Automatically detects sensitive or urgent cases and escalates them to a human agent when needed.

🏥 **Use Cases**
- **Patient Communication:** Respond to inquiries, schedule, reschedule, and confirm appointments seamlessly via WhatsApp.
- **Internal Clinic Operations:** Allow staff to modify appointments or add shopping-list reminders directly from Telegram.
- **Appointment Confirmation System:** Automatically contacts patients one day prior to appointments for confirmation or rescheduling.
- **Task and Reminder Management:** Keeps clinic operations organized through automatic task management with Google Tasks.

🛠️ **Technical Implementation**

**WhatsApp Patient Interaction Flow**
- **Webhook Reception:** Incoming WhatsApp messages are captured via the Evolution API webhook.
- **Message Classification:** Intelligent routing of messages based on content type (text, image, audio, document).
- **Media Content Processing:** Audio is downloaded, converted, and transcribed via OpenAI Whisper; images are analyzed and text/descriptions extracted with the OpenAI Vision model.
- **Patient Request Handling:** A specialized WhatsApp assistant responds appropriately using AI prompts.
- **Outbound Message Formatting:** Ensures messages comply with WhatsApp format standards.
- **Message Delivery:** Sends responses back via the Evolution API.

**Telegram Staff Management Flow**
- **Telegram Webhook Reception:** Captures messages from authorized staff accounts.
- **Internal Assistant Processing:** Appointment rescheduling identifies and updates appointments through MCP Google Calendar; task creation adds new entries to the clinic's shopping list using Google Tasks.
- **Notifications and Confirmations:** Sends confirmations back to staff through Telegram.

**Appointment Reminder System**
- **Daily Trigger Activation:** Fires every weekday at 08:00 AM.
- **Calendar Scraping:** Lists the next day's appointments from Google Calendar.
- **Patient Contact:** Sends WhatsApp confirmation messages for each appointment.
- **Response Management:** Redirects confirmation or rescheduling replies to the appropriate agents.

⚙️ **Setup Instructions**
1. **Import the Workflow:** n8n → Workflows → Import from File → upload this JSON file.
2. **Configure Credentials:**
   - Evolution API (WhatsApp Communication)
   - Telegram Bot API (Staff Communication)
   - Google Calendar OAuth2 (Appointment Management)
   - Google Tasks OAuth2 (Task Management)
   - OpenAI and OpenRouter APIs (AI Agents)
   - PostgreSQL Database (Chat Memory)
3. **Set Sensitive Variables.** Replace the placeholder values:
   - {sua instância aqui} → Evolution API instance name
   - {número_whatsapp} → WhatsApp numbers
   - {url_do_servidor} → Server URLs
   - {a sua apikey aqui} → API keys
   - {seu_calendario} → Google Calendar ID
4. **Customize AI Prompts.** Adjust the system prompts to fit your clinic's tone, service style, and patient communication guidelines. Set clinic operating hours, escalation rules, and cancellation procedures in the AI prompts.
5. **Activate and Test.** Simulate patient messages via WhatsApp, test Telegram commands from staff members, and validate the daily appointment reminders using the scheduled trigger.

🏷️ **Tags**
Healthcare, Clinic Management, WhatsApp Integration, Telegram Bot, Appointment Scheduling, Google Calendar, Google Tasks, AI Agents, n8n Automation

📚 **Technical Notes**
- PostgreSQL is used for persistent chat memory across sessions.
- Multiple AI models are used: OpenAI GPT-4.1-nano, OpenAI GPT-4.1-mini, Google Gemini 2.0 and 2.5.
- Full media content processing is supported (audio, image, text).
- Compliant escalation workflows ensure patient safety and proper handoff to human staff when necessary.
- All sensitive patient data is securely stored inside calendar event descriptions for easy retrieval by agents.

📜 **License**
This workflow is provided under the MIT License. Feel free to adapt and customize it for your clinic's specific needs.
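As a rough illustration of the reminder step described above, a Code node placed after the Google Calendar "list next day's appointments" node could shape the WhatsApp confirmations along these lines; the event field names and the phone-number-in-description convention are assumptions to adapt to how your clinic stores patient data:

```javascript
// Sketch: turn tomorrow's calendar events into WhatsApp confirmation messages.
// Assumes Google Calendar items expose summary, description, and start.dateTime,
// and that a patient phone number is stored in the event description.
return $input.all().map(({ json: event }) => {
  const phone = (event.description || '').match(/\+?\d{10,15}/)?.[0]; // assumed format
  const time = new Date(event.start?.dateTime || event.start?.date)
    .toLocaleTimeString('en-US', { hour: '2-digit', minute: '2-digit' });
  return {
    json: {
      phone,
      message: `Hello! This is a reminder of your appointment tomorrow at ${time} (${event.summary}). Reply YES to confirm or RESCHEDULE to pick a new time.`,
    },
  };
});
```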
by Jez
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Uncover new business leads with this AI-Powered Prospect Discovery Agent! This n8n workflow acts as a specialized intelligent assistant that, given a business type and location, uses multiple search strategies to identify a list of potential prospect companies and their websites.

Stop manually trawling through search results! This agent automates the initial phase of lead generation by:
- Understanding your target business profile (type, location, keywords).
- Strategically using web search tools (Brave Search, Google Gemini Search) to find relevant businesses.
- Performing quick validations to confirm relevance.
- Returning a clean, structured JSON list of prospect names and their website URLs.

**How it Works**
The workflow is built around an AI agent powered by Google Gemini. This agent is equipped with tools like:
- **Brave Web Search:** For broad initial sourcing of potential business candidates.
- **Google Gemini Search:** For advanced, context-aware discovery and finding businesses mentioned in various online sources.
- **Brave Local Search (Selective):** For quick verification of local presence or finding website URLs for identified names.
- **Jina AI Web Page Scraper (Very Selective):** For extremely rapid relevance checks on uncertain websites by scanning page content for keywords.

The agent's system prompt guides it to use these tools efficiently to build a list of prospects without getting bogged down in deep research on any single one at this discovery stage.

**Use Cases**
- **Lead Generation:** Automatically generate lists of potential clients based on industry and location.
- **Market Research:** Identify key players or types of businesses in a specific geographical area.
- **Sales Development:** Provide SDRs with initial lists of companies to research further.
- **Called as a Sub-Workflow:** Designed to be easily integrated as a "tool" into more complex orchestrating AI agents (e.g., a BNI Pitch Planner that first needs to identify who to target).

**Setup**
1. Import the workflow.
2. Configure credentials. You'll need n8n credentials for:
   - Google Gemini (for the Chat model and the Gemini Search/Vertex AI Search tool).
   - Brave Search (e.g., via Smithery MCP, or adapt if you have direct API access).
   - Jina AI (for the web scraper).
   Assign these to the respective nodes.
3. Review the system prompt. The prospect_discovery_agent node contains a detailed system prompt. You can fine-tune this to adjust its search strategies or the strictness of its matching.

**Inputs**
This workflow is triggered by an "Execute Workflow Trigger" node (prospect_discovery_workflow). It expects the following inputs:
- business_type (string): e.g., "artisan bakery"
- location_query (string): e.g., "Portland, Oregon"
- desired_num_prospects (number): e.g., 5
- additional_keywords (string, optional): e.g., "organic, gluten-free"

**To Use (as a Sub-Workflow/Tool)**
This workflow is typically called by another n8n workflow (e.g., using a "Tool Workflow" node from the Langchain nodes). The calling workflow provides the inputs listed above. The "Prospect Discovery" workflow then executes, and its final node (the prospect_discovery_agent) outputs a JSON array of found prospects, like:

```json
[
  { "business_name": "Rose Petal Bakery", "website_url": "https://rosepetalbakerypdx.com" },
  { "business_name": "The Daily Bread Artisans", "website_url": "https://dailybreadpdx.com" }
]
```

If no prospects are found, it returns an empty array [].
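For example, a calling workflow might pass an input payload like this (values are illustrative):

```json
{
  "business_type": "artisan bakery",
  "location_query": "Portland, Oregon",
  "desired_num_prospects": 5,
  "additional_keywords": "organic, gluten-free"
}
```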
This template provides a powerful and focused tool for automating the initial stages of prospect identification.
by Kanaka Kishore Kandregula
**Boost Sales with Automated Magento 2 Product and Coupon Notifications**

This n8n workflow automatically posts new Magento products and coupons to Telegram while preventing duplicates. Key benefits:
✅ Increase conversions with time-sensitive alerts (creates urgency)
✅ Reduce missed opportunities with 24/7 monitoring
✅ Improve customer engagement through rich media posts
✅ Save hours per week by automating manual posting

**Why This Works**
- Triggers impulse buys with real-time notifications
- Eliminates human error in duplicate posting
- Scales effortlessly as your catalog grows
- Provides analytics through database tracking

Perfect for e-commerce stores wanting to:
- Announce new arrivals instantly
- Promote limited-time offers effectively
- Maintain a consistent social presence
- Track performance through MySQL

This workflow automatically:
✅ Detects new products AND coupons in Magento
✅ Prevents duplicate postings with MySQL tracking
✅ Posts rich formatted alerts to Telegram
✅ Runs on a customizable schedule

✨ **Key Features**
- For products: product name, price, and image; direct store link; media gallery support
- For coupons: coupon code and status; usage limits (times used/available); active/inactive status indicator
- Core system: 🔒 MySQL duplicate prevention, ⏰ hourly schedule (customizable), 📱 Telegram notifications with Markdown

🛠️ **Configuration Guide**

Database Setup:

```sql
CREATE TABLE IF NOT EXISTS posted_items (
  item_id INT PRIMARY KEY,
  item_type ENUM('product', 'coupon') NOT NULL,
  item_value VARCHAR(255),
  posted BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);
```

Required Credentials:
- Magento API (HTTP Header Auth)
- MySQL Database
- Telegram Bot

Sticky Notes:
❗ IMPORTANT SETUP NOTES ❗
- For products: ensure 'url_key' exists in custom_attributes
- For coupons: the Magento REST API must expose coupon rules
- The MySQL user needs INSERT/SELECT privileges
- The Telegram bot must be added to your channel first

🔄 SCHEDULING:
- Default: checks every hour at :00
- Adjust in the Schedule Trigger node

⚙️ **Technical Details**
Workflow logic:
1. Checks for new products/coupons via the Magento API
2. Verifies against the MySQL database
3. Only posts if the record doesn't exist
4. Updates the database after a successful post

Error handling:
- Automatic skip if the product/coupon already exists
- Empty result handling
- Connection timeout protection

🌟 **Why This Template?**
- **Complete Solution:** Handles both products AND coupons
- **Battle-Tested:** Prevents all duplicates reliably
- **Ready-to-Use:** Just add your credentials
- **Fully Customizable:** Easy to modify for different needs

Perfect for e-commerce stores using Magento 2 who want automated, duplicate-free social notifications!
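As an illustration of the duplicate-prevention logic, the check between the MySQL lookup and the Telegram post could be sketched in a Code node like this; the node names "Get Posted Items" and "Fetch From Magento" are placeholders for whatever your nodes are actually called:

```javascript
// Sketch: keep only Magento items whose ID is not already recorded in posted_items.
const posted = new Set(
  $('Get Posted Items').all().map((row) => row.json.item_id)
);

return $('Fetch From Magento')
  .all()
  .filter((item) => !posted.has(item.json.id)) // skip anything already announced
  .map((item) => ({
    json: { ...item.json, item_type: item.json.coupon_code ? 'coupon' : 'product' },
  }));
```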
by Davide
This workflow is a highly advanced multimodal AI assistant designed to operate through WhatsApp. It can understand and respond to text, images, voice messages, and PDF documents by combining OpenAI models with smart logic that adapts to the content received.

🎯 **Core Features**

📥 **1. Automatic Message Type Detection**
Using the Input type node, the bot detects whether the user has sent text, a voice message, an image, a file (PDF), or other unsupported content.

💬 **2. Smart Text Message Handling**
Text messages are processed by an OpenAI GPT-4o-mini agent with a customized system prompt. Replies are concise, accurate, and formatted for mobile readability.

🖼️ **3. Image Analysis & Description**
Images are downloaded, converted to base64, and analyzed by an image-aware AI model. The output is a rich, structured description, designed for visually impaired users or visual content interpretation.

🎙️ **4. Voice Message Transcription & Reply**
Audio messages are downloaded and transcribed using OpenAI Whisper. The transcribed text is analyzed and answered by the AI. Optionally, the AI reply can be converted back to voice using OpenAI's text-to-speech and sent as an audio message.

📄 **5. PDF Document Extraction & Summary**
Only PDFs are allowed (filtered via MIME type). The document's content is extracted and combined with the user's message. The AI then provides a relevant summary or answer.

🧠 **6. Contextual Memory**
Each user has a personalized session ID with a memory window of 10 interactions. This ensures a more natural and contextual conversation flow.

**How It Works**
This workflow is designed to handle incoming WhatsApp messages and process different types of inputs (text, audio, images, and PDF documents) using AI-powered analysis. Here's how it functions:
- **Trigger:** The workflow starts with the **WhatsApp Trigger** node, which listens for incoming messages (text, audio, images, or documents).
- **Input Routing:** The **Input type** Switch node checks the message type and routes it to the appropriate processing branch:
  - Text: directly forwards the message to the AI agent for response generation.
  - Audio: downloads the audio file, transcribes it using OpenAI, and sends the transcription to the AI agent.
  - Image: downloads the image, analyzes it with OpenAI's GPT-4 model, and generates a detailed description.
  - PDF document: downloads the file, extracts the text, and processes it with the AI agent.
  - Unsupported formats: sends an error message if the input is not supported.
- **AI Processing:** The **AI Agent1** node, powered by OpenAI, processes the input (text, transcribed audio, image description, or PDF content) and generates a response.
- **Response Handling:** For audio inputs, the AI's response is converted back into speech (using OpenAI's TTS) and sent as a voice message. For other inputs, the response is sent as a text message via WhatsApp.
- **Memory:** The **Simple Memory** node maintains conversation context for follow-up interactions.

**Setup Steps**
To deploy this workflow in n8n, follow these steps:
1. **Configure WhatsApp API Credentials:** Set up WhatsApp Business API credentials (Meta Developer Account). Add the credentials in the WhatsApp Trigger, Get Image/Audio/File URL, and Send Message nodes.
2. **Set Up OpenAI Integration:** Provide an OpenAI API key in the Analyze Image, Transcribe Audio, Generate Audio Response, and AI Agent1 nodes.
3. **Adjust Input Handling (Optional):** Modify the Switch node ("Input type") to handle additional message types if needed (see the routing sketch after these steps). Update the "Only PDF File" IF node to support other document formats.
4. **Test & Deploy:** Activate the workflow and test with different message types (text, audio, image, PDF). Ensure responses are correctly generated and sent back via WhatsApp.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
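If you ever want to replace or extend the "Input type" Switch with a Code node, the routing decision can be sketched as below; the payload path assumes the WhatsApp Business (Cloud API) webhook shape, so verify it against your trigger's actual output:

```javascript
// Sketch: classify an incoming WhatsApp message by type.
// Assumed path: entry[0].changes[0].value.messages[0] (Cloud API webhook format).
const msg = $json.entry?.[0]?.changes?.[0]?.value?.messages?.[0] ?? {};
const type = msg.type; // 'text' | 'audio' | 'image' | 'document' | ...

let route = 'unsupported';
if (type === 'text') route = 'text';
else if (type === 'audio') route = 'audio';
else if (type === 'image') route = 'image';
else if (type === 'document' && msg.document?.mime_type === 'application/pdf') route = 'pdf';

return [{ json: { route, messageId: msg.id, from: msg.from } }];
```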
by Oneclick AI Squad
In this guide, we'll walk you through setting up an AI-driven workflow that automatically fetches daily sales, food waste, and customer feedback data from Google Sheets, generates actionable insights using AI, merges them into a comprehensive report, and sends it as an email draft. Ready to automate your restaurant's daily insights? Let's dive in!

**What's the Goal?**
- Automatically retrieve daily sales data, food waste records, and customer feedback from Google Sheets.
- Use AI to analyze data and generate insights, including top performers, waste reduction recommendations, and feedback summaries.
- Merge the insights into a structured daily report.
- Send the report as an AI-generated email draft for review or sending.
- Enable scheduled automation for daily insights delivery.

By the end, you'll have a self-running system that delivers daily restaurant insights effortlessly.

**Why Does It Matter?**
Manual data analysis and reporting are time-consuming and error-prone. Here's why this workflow is a game-changer:
- **Zero Human Error:** AI ensures accurate and consistent insights.
- **Time-Saving Automation:** Instantly process data and draft reports, boosting efficiency.
- **Scheduled Delivery:** Receive insights daily without manual effort.
- **Actionable Insights:** Empower your team with data-driven decisions.

Think of it as your tireless data analyst that keeps your restaurant informed.

**How It Works**
Here's the step-by-step magic behind the automation:
1. **Trigger the Workflow:** Initiate the workflow daily using the Daily Report Scheduler node (e.g., every day at a set time).
2. **Fetch Daily Sales Data:** Retrieve sales data from the Google Sheet using the Fetch Daily Sales Data node.
3. **Fetch Daily Food Waste Records:** Retrieve food waste data from the Google Sheet using the Fetch Daily Food Waste Records node.
4. **Fetch Customer Feedback:** Retrieve customer feedback from the Google Sheet using the Fetch Customer Feedback node.
5. **Normalize Sales Records:** Process and standardize sales data for AI analysis.
6. **Normalize Waste Data:** Process and standardize food waste data for AI analysis.
7. **Normalize Feedback Data:** Process and standardize customer feedback data for AI analysis.
8. **AI Sales Insights Generator:** Use AI (e.g., the Google Chat Model) to analyze sales data, identify top performers, and provide recommendations.
9. **AI Waste Reduction Insights Generator:** Use AI to analyze waste data and suggest reduction strategies.
10. **AI Feedback Summary:** Use AI to summarize customer feedback and identify common themes.
11. **Format Sales Output:** Structure the sales insights into a readable format.
12. **Format Waste Output:** Structure the waste reduction insights into a readable format.
13. **Format Feedback AI Output:** Structure the feedback summary into a readable format.
14. **Merge & Create Email:** Combine all formatted insights into a single daily report email draft.
15. **Prepare Email Content:** Finalize the email content for sending.
16. **Send Daily Report:** Send the AI-generated daily summary email via Gmail.

**How to Use the Workflow**
Importing a workflow in n8n is a straightforward process that allows you to use pre-built workflows to save time. Below is a step-by-step guide to importing the Restaurant Daily Insights Automation workflow in n8n.

Steps to import a workflow in n8n:
1. **Obtain the Workflow JSON**
   - Source the workflow: workflows are shared as JSON files or code snippets, e.g., from the n8n community, a colleague, or exported from another n8n instance.
   - Format: ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text.
2. **Access the n8n Workflow Editor**
   - Log in to n8n (via n8n Cloud or a self-hosted instance).
   - Navigate to the Workflows tab in the n8n dashboard.
   - Click Add Workflow to create a blank workflow.
3. **Import the Workflow**
   - Option 1: Import via JSON code (clipboard): click the three dots (⋯) in the top-right corner to open the menu, select Import from Clipboard, paste the JSON code into the text box, and click Import to load the workflow.
   - Option 2: Import via JSON file: click the three dots (⋯) in the top-right corner, select Import from File, choose the .json file from your computer, and click Open to import.

**Setup Notes**
- **Google Sheet Columns:**
  - Sales Data sheet: Date, Item Name, Quantity Sold, Revenue, Cost, Profit.
  - Food Waste Records sheet: Date, Item Name, Waste Quantity, Reason, Timestamp.
  - Customer Feedback sheet: Date, Customer Name, Feedback Text, Rating, Timestamp.
- **Google Sheets Credentials:** Configure OAuth2 settings in the fetch nodes with your Google Sheet ID and credentials.
- **AI Models:** Set up the AI nodes (e.g., the Google Chat Model) with the appropriate API credentials.
- **Gmail Integration:** Authorize the Send Daily Report node with Gmail API credentials to send emails.
- **Scheduling:** Adjust the Daily Report Scheduler node to your preferred time (e.g., daily at 9 AM).
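As an example of the normalization step, the Normalize Sales Records node might reshape the sheet rows roughly like this; the column names follow the sheet layout above, and the numeric parsing is an assumption about how values arrive from Google Sheets:

```javascript
// Sketch: normalize raw Google Sheets rows into clean numeric sales records
// before they are passed to the AI Sales Insights Generator.
return $input.all().map(({ json: row }) => ({
  json: {
    date: row['Date'],
    item: row['Item Name'],
    quantitySold: Number(row['Quantity Sold']) || 0,
    revenue: Number(row['Revenue']) || 0,
    cost: Number(row['Cost']) || 0,
    profit: Number(row['Profit']) || 0,
  },
}));
```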
by Luan Correia
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This comprehensive RAG workflow enables your AI agents to answer user questions with contextual knowledge pulled from your own documents, using metadata-rich embeddings stored in Supabase.

🔧 **Key Features**
- **RAG Agents** powered by GPT-4.5 or GPT-3.5 via OpenRouter or OpenAI.
- **Supabase Vector Store** to store and retrieve document embeddings.
- **Cohere Reranker** to improve response relevance and quality.
- **Metadata Agent** to enrich vectorized data before ingestion.
- **PDF Extraction Flow** to automatically parse and upload documents with metadata.

✅ **Setup Steps**
1. Connect your Supabase Vector Store.
2. Use OpenAI Embeddings (e.g., text-embedding-3-small).
3. Add API keys for OpenAI and/or OpenRouter.
4. Connect a reranker like Cohere.
5. Process documents with metadata before embedding.
6. Start chatting: your AI agent now returns context-rich answers from your own knowledge base!

Perfect for building AI assistants that can reason, search, and answer based on internal company data, academic papers, support docs, or personal notes.
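A document enriched by the metadata agent before embedding could look roughly like this; the field names are illustrative rather than a fixed schema:

```json
{
  "pageContent": "Refunds are processed within 5 business days of the return being received...",
  "metadata": {
    "source": "support-docs/refund-policy.pdf",
    "title": "Refund Policy",
    "category": "billing",
    "ingested_at": "2024-05-01"
  }
}
```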
by Incrementors
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

📦 **Multi-Platform Price Finder: Scraping Prices with Bright Data & Telegram**

An intelligent n8n automation that fetches real-time product prices from marketplaces like Amazon, Wayfair, Lowe's, and more using Bright Data's dataset, and sends promotional messages via Telegram using AI. Perfect for price tracking, deal alerts, and affiliate monetization.

📋 **Overview**
This automation tracks product prices across top e-commerce platforms using Bright Data and sends out alerts via Telegram based on the best available deals. The workflow is designed for affiliate marketers, resellers, and deal-hunting platforms that want real-time competitive pricing.

✨ **Key Features**
- 🔎 **Multi-Platform Scraping:** Supports Amazon, Wayfair, Lowe's, and more
- ⚡ **Bright Data Integration:** Access to structured product snapshots
- 📢 **AI-Powered Alerts:** Generates Telegram-ready promo messages using AI
- 🧠 **Lowest Price Logic:** Filters and compares products across sources
- 📈 **Data Merge & Processing:** Combines multiple sources into a single stream
- 🔄 **Keyword-Driven Search:** Searches using dynamic keywords from form input
- 📦 **Scalable Design:** Built to process multiple platforms simultaneously
- 🧼 **Clean Output:** Strips unnecessary formatting before publishing

🎯 **What This Workflow Does**

Input:
- **Search Keywords:** User-defined keyword(s) from a form trigger
- **Platform Sources:** Wayfair, Lowe's, Amazon, etc.
- **Bright Data API Key:** Needed for authenticated scraping

Processing steps:
1. **User Input** via the n8n form trigger (keyword-based)
2. **Bright Data API Trigger** for each marketplace
3. **Status Polling:** Wait until the scraping snapshot is ready
4. **Data Retrieval:** Fetches JSON results from the Bright Data snapshot
5. **Data Cleaning & Normalization:** Price, title, and URL are extracted
6. **Merging Products** from all platforms
7. **Find Lowest Price Product** using custom JS logic (see the sketch at the end of this description)
8. **AI Prompt Generation** via Claude/Anthropic
9. **Telegram Formatting** and alert message creation

Output:
- 🛍️ Product Title
- 💰 Final Price
- 🔗 Product URL
- ✉️ Promotional Message (for Telegram/notifications)

🚀 **Setup Instructions**
1. **Import Workflow:** Open n8n > Workflows > + Add Workflow and import the provided JSON file.
2. **Configure Bright Data:** Add credentials under Credentials → Bright Data API, set the appropriate dataset_id for each platform, and ensure the dataset includes title, price, and url fields.
3. **Enable Keyword Trigger:** Use the built-in Form Trigger node with a single keyword field (SearchHere).
4. **Telegram or AI Integration:** Modify the prompt node for your language or tone, and add a Telegram webhook or integration where needed.

📖 **Usage Guide**

Adding keywords:
- Trigger the form with a product keyword like iPhone 15
- Wait for the workflow to fetch the best deals and generate the Telegram message

Understanding the AI-powered output: the AI creates a short, engaging message like:

> "🔥 Deal Alert: Get the iPhone 15 for just ₹74,999! Limited stock—Check it out: [link]"
**Debugging Output**
- The output node shows cleaned JSON with title, price, url, and message
- If no valid results are found, a debug message is returned with sample structure info

🔧 **Customization Options**
- **Add More Marketplaces:** Clone any HTTP Request node (e.g., for Wayfair) and update the dataset_id and required output fields
- **Modify Price Logic:** Update the Code1 node to change the comparison (e.g., highest price instead of lowest)
- **Change Message Format:** Edit the AI Agent prompt to customize tone/language; add emoji, CTAs, or markdown formatting as needed

🧪 **Test & Activation**
- Add a few sample keywords via the form trigger
- Run manually or set up a webhook for external app input
- Check the final AI-generated message in the output node

🚨 **Troubleshooting**

| Issue | Solution |
|-------|----------|
| No Data Returned | Ensure the keyword matches real products |
| Status Not 'Ready' | Bright Data delay; add Wait nodes |
| Invalid API Key | Check Bright Data credentials |
| AI Errors | Adjust the prompt or validate input fields |

📊 **Use Cases**
- 💰 **Affiliate Campaigns:** Show the best deals across platforms
- 🛒 **Deal Pages:** Post live offers with product links
- 🧠 **Competitor Analysis:** Track cross-platform pricing
- 🔔 **Alert Bots:** Send real-time alerts to Telegram or Slack

✅ **Quick Setup Checklist**
- [x] Bright Data API credentials configured
- [x] n8n form trigger enabled
- [x] Claude or AI model connected
- [x] All HTTP requests working
- [x] AI message formatting verified

🌐 **Example Output**

```json
{
  "title": "Apple iPhone 15 Pro Max",
  "price": 1199,
  "url": "https://amazon.com/iphone-15",
  "message": "🔥 Grab the Apple iPhone 15 Pro Max for just $1199! Limited deal—Check it out: https://amazon.com/iphone-15"
}
```

📬 For any questions or support, please contact: 📧 <info@incrementors.com> or fill out this form: https://www.incrementors.com/contact-us/
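For reference, the lowest-price comparison in the Code1 node can be sketched like this, assuming the merged items each expose title, price, and url as in the example output above:

```javascript
// Sketch: pick the cheapest valid product across the merged marketplace results.
const products = $input.all()
  .map((item) => item.json)
  .filter((p) => p.title && p.url && !isNaN(parseFloat(p.price)));

const cheapest = products.reduce(
  (best, p) => (parseFloat(p.price) < parseFloat(best.price) ? p : best),
  products[0]
);

return [{ json: cheapest ?? { error: 'No valid products found for this keyword' } }];
```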
by mustafa kendigüzel
**How it works**
This automated workflow discovers trending Instagram posts and creates similar AI-generated content. Here's the high-level process:

1. **Content Discovery & Analysis**
   - Scrapes trending posts from specific hashtags
   - Analyzes visual elements using AI
   - Filters out videos and duplicates
2. **AI Content Generation**
   - Creates unique images based on trending content
   - Generates engaging captions with relevant hashtags
   - Maintains brand consistency while being original
3. **Automated Publishing**
   - Posts content directly to Instagram
   - Monitors publication status
   - Sends notifications via Telegram

**Set up steps**
Setting up this workflow takes approximately 15-20 minutes:

1. **API Configuration (7-10 minutes)**
   - Instagram Business Account setup
   - Telegram bot creation
   - API key generation (OpenAI, Replicate, Rapid API)
2. **Database Setup (3-5 minutes)**
   - Create the required database table
   - Configure PostgreSQL credentials
3. **Workflow Configuration (5-7 minutes)**
   - Set scheduling preferences
   - Configure notification settings
   - Test connections and permissions

Detailed technical specifications and configurations are available in sticky notes within the workflow.
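The "filters out videos and duplicates" step could be sketched in a Code node like this; the is_video flag, the post id field, and the "Get Processed Posts" node name are assumptions to adjust to your scraper output and PostgreSQL table:

```javascript
// Sketch: drop video posts and anything already recorded in the Postgres table.
const seen = new Set($('Get Processed Posts').all().map((r) => r.json.post_id));

return $input.all().filter(({ json: post }) => !post.is_video && !seen.has(post.id));
```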
by Constantine Kissel
**Generate research-backed articles with n8n**

**Who's it for**
Content marketers, SEO teams, and founders who need fast, research-grounded blog posts or long-form articles, multi-language included. Works well for teams that want citations, outlines, and section-by-section drafting with minimal manual effort.

**How it works / What it does**
A Form collects the domain, keywords, and target language. The workflow refines the keywords, finds recent articles and authoritative citations, then synthesizes a master outline and loops through each section: generate search queries → fetch web results → summarize findings → write the section with the advanced model. Finally, it aggregates all sections into a clean Markdown article. Optional delivery nodes (Email, Telegram) and an AI Agent are included but disabled by default.

**How to set up**
1. Import the workflow JSON into n8n.
2. Add your OpenAI credential.
3. Set simple_model / advanced_model in LLM Params.

**Requirements**
- n8n instance with outbound internet access.
- OpenAI API access (Responses API + web_search_preview).
- (Optional) Email/Telegram credentials if you want to deliver results asynchronously.

**How to customize the workflow**
- Edit the prompts in LLM Params and Section Prompts to match your tone, structure, and SEO style.
- Tweak the recency and source rules in Search Articles / Search Citations.
- Insert a human review step before "Write Section" and enable the delivery nodes.
- Change the working/output languages in the Language node.
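Conceptually, the per-section loop behaves like the sketch below; each helper is a stand-in for the corresponding LLM or web-search node rather than a real API call:

```javascript
// Conceptual sketch of the section loop: queries -> web results -> summary -> drafted section.
const generateQueries = async (section) => [`${section.title} statistics`, `${section.title} best practices`];
const fetchWebResults = async (q) => [`result snippet for ${q}`];
const summarize = async (snippets) => snippets.join(' ');
const writeSection = async (section, findings) => `## ${section.title}\n\n${findings}`;

async function draftArticle(outline) {
  const sections = [];
  for (const section of outline.sections) {
    const queries = await generateQueries(section);                          // search queries per section
    const results = (await Promise.all(queries.map(fetchWebResults))).flat(); // fetch web results
    const findings = await summarize(results);                               // summarize findings
    sections.push(await writeSection(section, findings));                    // "advanced model" writes the section
  }
  return sections.join('\n\n'); // aggregated into the final Markdown article
}

draftArticle({ sections: [{ title: 'Why keyword research matters' }] }).then(console.log);
```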