by Ayoub
Who is this for? This workflow is designed for businesses or developers looking to integrate voice-based chat applications with dynamic responses and conversational memory.

What problem does this solve? It automates AI-powered voice conversations, maintaining context between sessions and converting speech to text and text back to speech.

What this workflow does: The workflow receives audio input, transcribes it with OpenAI, and processes the conversation with the Google Gemini Chat Model (an OpenAI Chat Model works as well). Responses are converted back to speech using ElevenLabs.

Prerequisites: You'll need API keys for:
OpenAI (available from the OpenAI website)
ElevenLabs (available from their website)
Google Gemini (available from Google AI Studio)

Setup:
Configure your API keys.
Ensure the value of the "Path" parameter in the Webhook node (voice_message) matches the name of the form field that carries the voice message in your HTTP POST request, as in the sketch below.
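For testing, you can POST a recording to the webhook as multipart form data; the field name must match the voice_message path value described above. A minimal Node.js sketch, where the webhook URL and file name are placeholders:

```javascript
// Minimal sketch (Node 18+): send a voice message to the workflow's webhook.
// The URL and file name are placeholders; the form field name must match the
// "voice_message" value configured in the Webhook node's Path parameter.
import { readFile } from 'node:fs/promises';

const audio = await readFile('recording.mp3');
const form = new FormData();
form.append('voice_message', new Blob([audio], { type: 'audio/mpeg' }), 'recording.mp3');

const response = await fetch('https://your-n8n-instance.com/webhook/voice_message', {
  method: 'POST',
  body: form,
});
console.log(response.status); // the response carries the synthesized reply
```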
by mariskarthick
QuantumDefender AI is a next-generation intelligent cybersecurity assistant designed to pair the symbolic promise of quantum computing with cutting-edge AI capabilities. The agent gives SOC analysts, red teamers, and security researchers rapid threat investigation, operational automation, and intelligent command execution, all driven by GPT-4 and integrated tools and accessible through Telegram.

🔑 Key Features:
Expert-Level Cybersecurity Research & Analysis: Leverages powerful AI models to deliver clean, detailed, domain-specific insights across detection, remediation, and offensive security.
Command & Control: Executes Linux shell commands, autonomous scripts, and system operations securely in isolated environments.
Real-Time Web Intelligence: Uses the integrated Langsearch API for timely, contextually relevant internet research.
Calendar & Scheduling Automation: Manages Google Calendar events (create, update, delete, retrieve) dynamically from chat.
Multi-Tool Orchestration: Combines calculator functions, internet searches, command execution, and messaging for comprehensive operational support.
Telegram-Native Chatbot: Delivers an adaptive, memory-informed, interactive conversational experience. Maintains context-aware, session-based memory for smooth multi-turn dialogues with individual users, sends "typing..." indicators during processing, and operates exclusively within Telegram, leveraging the full range of Telegram bot capabilities.
Execution Intelligence & Safety: Autonomously decides which tools to invoke, how often, and in what sequence to fulfill user requests comprehensively and responsibly. All command executions run inside a secure temporary folder to avoid persistent or harmful side effects, and strict safety protocols prevent malicious or destructive commands, maintaining ethical standards and compliance.

Use Cases:
Cybersecurity researchers and operators seeking an intelligent assistant to accelerate investigations and automate routine tasks.
Red team professionals requiring on-the-fly command execution and information gathering integrated with tactical chat interactions.
SOC teams aiming to augment alert triage and incident handling with AI-powered analysis and action.
Anyone looking for a robust multi-tool AI chatbot with real-world operational capabilities.

Setup Requirements:
OpenAI API key for GPT-4.1-nano language processing.
Telegram Bot API credentials with a webhook configured to receive and respond to messages.
Google OAuth credentials for Calendar integration, if calendar features are used.
SSH credentials for executing commands on remote hosts, if remote execution is enabled.
Internet connectivity for the Langsearch web search API.

Customization & Extensibility: The workflow is built modularly on n8n's flexible node system. You can extend it with more tools, integrate other services (ticketing, threat intel, scanning tools), or modify the interaction logic to suit specialized operational needs and environments.

Created by Mariskarthick M, Senior Security Analyst | Detection Engineer | Threat Hunter | Open-Source Enthusiast
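As an illustration of the typing indicator mentioned above, the Telegram Bot API exposes a sendChatAction method; a minimal JavaScript sketch (e.g., in an n8n Code node, falling back to an HTTP Request node if fetch is unavailable in your sandbox), where the bot token is a placeholder:

```javascript
// Minimal sketch: show a "typing..." indicator via the Telegram Bot API's
// sendChatAction method. TELEGRAM_BOT_TOKEN is a placeholder credential.
const TELEGRAM_BOT_TOKEN = 'YOUR_BOT_TOKEN';
const chatId = $json.message?.chat?.id; // chat id from the incoming Telegram update

await fetch(`https://api.telegram.org/bot${TELEGRAM_BOT_TOKEN}/sendChatAction`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ chat_id: chatId, action: 'typing' }),
});

return $input.all(); // pass items through unchanged
```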
by Yang
👥 Who is this for?
This workflow is ideal for virtual assistants, researchers, developers, automation specialists, and data analysts who need to regularly extract and organize structured product information (like books) from a website. It's especially useful for anyone working with catalog-based websites who wants to automate the extraction and delivery of clean, sorted data.

🧩 What problem is this solving?
Manually copying product listings like book titles and prices from a website into a spreadsheet is slow and repetitive. This automation solves that problem by scraping content with Dumpling AI, extracting the right data with CSS selectors, and formatting it into a clean CSV file that is emailed to you, all triggered automatically when a new URL is added to Google Sheets.

⚙️ What this workflow does
This template automates the entire content scraping and delivery process:
1. Watches a Google Sheet for new URLs
2. Scrapes the HTML content of the given webpage using Dumpling AI
3. Uses CSS selectors in the HTML node to extract each book from the page
4. Splits the HTML array into individual items
5. Extracts the book title and price from each HTML block
6. Sorts the books in descending order of price
7. Converts the sorted data to a CSV file
8. Sends the CSV via email using Gmail

🛠️ Setup
Google Sheets
Create a sheet titled something like URLs
Add your product listing URLs (e.g., http://books.toscrape.com)
Connect the Google Sheets trigger node to your sheet
Ensure you have the proper credentials connected

Dumpling AI
Create an account at Dumpling AI
Generate your API key
Set the HTTP method to POST and pass the URL dynamically from the Google Sheet
Use Header Auth to include your API key in the request header
Make sure "cleaned": "True" is included in the body for optimized HTML output

HTML Node
The first HTML node extracts the main book container blocks using: .row > li
The second HTML node parses out the individual fields:
title: h3 > a (via the title attribute)
price: .price_color

Sort Node
Sorts books by price in descending order.
Note: the price is extracted as a string; make sure it's parsed into a number if you plan to filter or sort numerically (see the sketch after this section).

Convert to CSV
The JSON data is passed into a Convert node and transformed into a CSV file

Gmail
Sends the CSV as an attachment to a designated email address

🔄 How to customize this workflow
**Extract more data**: Add more CSS selectors in the second HTML node to pull fields like author, availability, or product links
**Switch destinations**: Replace Gmail with Slack, Google Drive, Dropbox, or another platform
**Adjust sorting**: Sort alphabetically or by another extracted value
**Use a different source**: As long as the site structure is consistent, this can scrape any listing-like page
**Trigger differently**: Use a webhook, form submission, or schedule trigger instead of Google Sheets

⚠️ Dependencies and Notes
This workflow uses Dumpling AI to perform the web scraping. It requires an API key and uses credits per request.
The HTML node depends on valid CSS selectors. If the site layout changes, the selectors may need to be updated.
Ensure you're not scraping content from websites that prohibit automated scraping.
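As noted in the Sort Node section, the scraped price comes out as a string such as "£51.77". A minimal Code-node sketch that adds a numeric field before sorting, assuming the field is named price as above:

```javascript
// Minimal sketch: parse the scraped price string (e.g., "£51.77") into a number
// so the Sort node can order items numerically instead of lexicographically.
return $input.all().map((item) => {
  const raw = String(item.json.price ?? '');
  // Strip currency symbols and anything that isn't a digit, dot, or minus sign.
  const numeric = parseFloat(raw.replace(/[^0-9.-]/g, ''));
  return {
    json: {
      ...item.json,
      price_numeric: Number.isNaN(numeric) ? null : numeric,
    },
  };
});
```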
by Ranjan Dailata
Who is this for?
This workflow automates Wikipedia data extraction using the Bright Data Web Unlocker, parses and cleans the data, and then sends the results to a specified webhook URL for downstream processing, reporting, or integration. It is useful for:
Researchers who regularly need structured information from Wikipedia pages.
Data engineers building knowledge bases or enriching datasets with factual data.
Digital marketers or content writers automating fact-checking or content sourcing.
Automation enthusiasts who want to trigger external systems with rich context from Wikipedia.

What problem is this workflow solving?
This workflow addresses the challenges of manually retrieving, structuring, and using data from Wikipedia at scale.

Workflow Breakdown
Trigger - Type: scheduled or manual. Starts the workflow either on a fixed schedule (e.g., daily) or on demand via a manual trigger or incoming webhook.
Bright Data Wikipedia Scraping - Tool used: Bright Data Web Unlocker. Scrapes the HTML content of one or more Wikipedia article URLs.
Parse & Extract Structured Data - The Basic LLM Chain node produces human-readable content.
Summarization - Summarizes the Wikipedia content using the Summarization Chain node.
Send to Webhook - Sends a webhook notification to the specified URL via the "Summary Webhook Notifier" node.

Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the "Set Wikipedia URL with Bright Data Zone" node with the Wikipedia URL and Bright Data zone.
6. Update the "Summary Webhook Notifier" node with the webhook endpoint of your choice.

How to customize this workflow to your needs
Update the Wikipedia URL: Replace it with a Wikipedia URL of your own interest in the "Set Wikipedia URL with Bright Data Zone" node.
Modify the data extraction logic: Extract the entire article or just specific sections by extending the "LLM Data Extractor" node prompt.
Extend the AI summarization: Extract key bullet points or entities, or create short-form summaries, by extending the "Concise Summary Generator" node.
Extend the Summary Webhook Notifier: Send results to Slack, Discord, Telegram, or MS Teams, or connect to your internal database/API, via the webhook notification mechanism.
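For reference, a direct Web Unlocker call from JavaScript might look like the sketch below. The endpoint and body fields follow Bright Data's documented request format, but verify them against your account's docs; the zone name, article URL, and token are placeholders:

```javascript
// Minimal sketch of a Bright Data Web Unlocker request.
// Verify the endpoint and fields against Bright Data's current documentation;
// TOKEN and the zone name are placeholders.
const TOKEN = 'XXXXXXXXXXXXXX'; // your Web Unlocker token
const response = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'web_unlocker1',                       // your Web Unlocker zone name
    url: 'https://en.wikipedia.org/wiki/N8n',    // the Wikipedia article to scrape
    format: 'raw',                               // return the raw HTML
  }),
});
const html = await response.text(); // feed this into the LLM parsing chain
```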
by David Olusola
n8n Set Node Tutorial - Complete Guide

🎯 How It Works
This tutorial workflow teaches you everything about n8n's Set node through hands-on examples. The Set node is one of the most powerful tools in n8n: it allows you to create, modify, and transform data as it flows through your workflow.

What makes this tutorial special:
**Progressive Learning**: Starts simple, builds to complex concepts
**Interactive Examples**: Real working nodes you can modify and test
**Visual Guidance**: Sticky notes explain every concept
**Branching Logic**: Shows how Set nodes work in different workflow paths
**Real Data**: Uses practical examples you'll encounter in automation

The workflow demonstrates 6 core concepts:
1. Basic data types (strings, numbers, booleans)
2. Expression syntax with {{ }} and $json references
3. Complex data structures (objects and arrays)
4. "Keep Only Set" option for clean outputs
5. Conditional data setting with branching logic
6. Data transformation and aggregation techniques

📋 Setup Steps

Step 1: Import the Workflow
1. Copy the workflow JSON
2. Open your n8n instance in your browser
3. Navigate to the Workflows section
4. Click "Import from JSON" or the import button (usually a "+" or import icon)
5. Paste the JSON into the import dialog
6. Click "Import" to load the workflow
7. Save the workflow (Ctrl+S or click the Save button)

Step 2: Choose Your Starting Point
Option A: Default Tutorial Mode (recommended for beginners)
The workflow is ready to run as-is, using a simple "Welcome" message as starting data. Click "Execute Workflow" to begin.

Option B: Rich Test Data Mode (recommended for experimentation)
1. Locate the nodes: find "Start (Manual Trigger)" and "0. Test Data Input"
2. Disconnect the default: delete the connection between "Start (Manual Trigger)" and "1. Set Basic Values"
3. Connect the test data: drag from the "0. Test Data Input" output to the "1. Set Basic Values" input
4. Execute: click "Execute Workflow" to run with rich test data

Step 3: Execute and Learn
Run the workflow: click the "Execute Workflow" button
Check outputs: click on each node to see its output data
Read the notes: each sticky note explains what's happening
Follow the flow: data flows from left to right, top to bottom

Step 4: Experiment and Modify
🔧 Change Basic Values: Click on "1. Set Basic Values", modify user_age (try 20 vs 35), change user_name to see how it propagates, then execute and watch the changes flow through.
📊 Test Conditional Logic: Set user_age to 20 to trigger the "Student Discount" path, or to 30 to trigger the "Premium Access" path, and watch how the workflow branches differently.
🎨 Modify Expressions: In "2. Set with Expressions", try changing ={{ $json.score * 2 }} to ={{ $json.score * 3 }}, or ={{ $json.user_name }} Smith to ={{ $json.user_name }} Johnson.
🏗️ Complex Data Structures: In "3. Set Complex Data", modify the JSON structure, add new properties to the user_profile object, and try nested expressions.

🎓 Learning Path
Beginner Level (Nodes 1-2) - Focus: understanding basic Set operations. Learn: data types, static values, simple expressions. Time: 10-15 minutes.
Intermediate Level (Nodes 3-4) - Focus: complex data and output control. Learn: objects, arrays, the "Keep Only Set" option. Time: 15-20 minutes.
Advanced Level (Nodes 5-6) - Focus: conditional logic and data aggregation. Learn: branching workflows, merging data, complex expressions. Time: 20-25 minutes.

🔍 What Each Node Teaches

| Node | Concept | Key Learning |
|------|---------|--------------|
| 1. Set Basic Values | Data Types | String, number, boolean basics |
| 2. Set with Expressions | Dynamic Data | {{ }} syntax, $json references, $now functions |
| 3. Set Complex Data | Advanced Structures | Objects, arrays, nested properties |
| 4. Set Clean Output | Data Management | "Keep Only Set" for clean final outputs |
| 5a/5b. Conditional Sets | Branching Logic | Different data based on conditions |
| 6. Tutorial Summary | Data Aggregation | Combining and summarizing workflow data |

💡 Pro Tips
🚀 Quick Wins: Always check node outputs after execution. Use sticky notes as your learning guide. Experiment with small changes first. Copy nodes to try variations.
🛠️ Advanced Techniques: Use "Keep Only Set" for API responses. Combine static and dynamic data in complex objects. Leverage conditional paths for different user types. Reference nested object properties with dot notation.
🐛 Troubleshooting: If expressions don't work, check the {{ }} syntax. Ensure field names match exactly (they are case-sensitive). Use the expression editor for complex logic. Check that data types match your expectations.

🎯 Next Steps After Tutorial
1. Create your own Set nodes in a new workflow
2. Practice with real data from APIs or databases
3. Build data transformation workflows for your specific use cases
4. Combine Set nodes with other n8n nodes like HTTP Request and Webhook
5. Explore advanced expressions using JavaScript functions

Congratulations! You now have the foundation to use Set nodes effectively in any n8n workflow. The Set node is truly the "Swiss Army knife" of n8n automation! 🛠️
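If it helps to see the tutorial's expressions as plain JavaScript, this Code-node sketch mirrors the logic of nodes 1, 2, and 5; the default values and the age threshold are illustrative assumptions, not the tutorial's exact numbers:

```javascript
// Minimal Code-node sketch mirroring the tutorial's Set-node logic.
// Field names (user_name, user_age, score) come from the tutorial's examples;
// the defaults and the age threshold here are illustrative only.
return $input.all().map((item) => {
  const user_name = item.json.user_name ?? 'Alice'; // node 1: basic static value
  const user_age = item.json.user_age ?? 20;
  const score = item.json.score ?? 10;

  return {
    json: {
      user_name,
      user_age,
      full_name: `${user_name} Smith`, // node 2: ={{ $json.user_name }} Smith
      double_score: score * 2,         // node 2: ={{ $json.score * 2 }}
      // node 5: conditional branching on user_age
      access_level: user_age < 25 ? 'Student Discount' : 'Premium Access',
    },
  };
});
```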
by Paul
Gmail AI Email Manager - Setup Guide

🎯 Workflow Overview
This workflow creates an intelligent Gmail email manager that can:
Monitor incoming emails via webhook
Analyze email content using AI
Categorize emails automatically
Generate smart responses
Take actions based on email content
Send notifications for important emails

📋 Pre-Setup Checklist
Before building the workflow, gather the necessary information and validate the approach.

Phase 1: Discovery & Planning
[ ] Search for Gmail nodes
[ ] Find AI analysis nodes
[ ] Identify webhook trigger options
[ ] Check notification nodes

Phase 2: Configuration Requirements
[ ] Gmail API credentials
[ ] AI service (OpenAI/Claude) API key
[ ] Webhook URL setup
[ ] Email classification rules

🔧 Setup Instructions

Step 1: Gmail API Setup
1. Go to the Google Cloud Console
2. Create a new project or select an existing one
3. Enable the Gmail API
4. Create OAuth 2.0 credentials
5. Add the authorized redirect URI: https://your-n8n-instance.com/rest/oauth2-credential/callback

Step 2: AI Service Setup
Choose one of the following:
**OpenAI**: Get an API key from platform.openai.com
**Claude**: Get an API key from console.anthropic.com
**Local AI**: Set up Ollama or similar

Step 3: n8n Credentials
Gmail OAuth2: Add client ID, secret, and scopes
AI Service: Add the API key
Webhook: Configure the webhook URL

🔧 Quick Setup Checklist

1. Google Cloud Console
[ ] Enable the Gmail API
[ ] Create OAuth2 credentials
[ ] Add redirect URI: https://your-n8n.com/rest/oauth2-credential/callback
[ ] Set up Gmail push notifications with Pub/Sub (see the sketch below)

2. API Keys
[ ] Get an OpenAI API key from platform.openai.com
[ ] Create a Google Sheet for logging (optional)

3. n8n Credentials
[ ] Gmail OAuth2: Client ID, Secret, Scopes: gmail.readonly, gmail.modify, gmail.compose
[ ] OpenAI API: Your API key

4. Gmail Labels (create these)
[ ] URGENT (red)
[ ] IMPORTANT (orange)
[ ] PROMOTIONAL (purple)
[ ] PERSONAL (green)
[ ] WORK (blue)
[ ] SPAM (gray)

5. Update Workflow Values
[ ] High Priority Alert: change the notification email
[ ] Spreadsheet Log: update the sheet ID (if using)
[ ] Webhook: copy the URL after saving the workflow

6. Test
[ ] Save & activate the workflow
[ ] Send a test email to your Gmail account
[ ] Check the execution log
[ ] Verify that auto-categorization works

That's it! Your AI email manager is ready! 🚀
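Since Gmail push notifications arrive via Pub/Sub, the webhook receives a push envelope whose data field is base64-encoded. A minimal Code-node sketch for unwrapping it, assuming the standard Pub/Sub push format:

```javascript
// Minimal sketch: decode a Gmail Pub/Sub push notification in an n8n Code node.
// Pub/Sub wraps the payload as { message: { data: <base64>, messageId }, subscription }.
const body = $json.body ?? $json; // webhook body, depending on node settings
const encoded = body.message?.data;
if (!encoded) {
  return []; // not a Gmail notification (e.g., a verification ping); ignore it
}

// The decoded payload contains the mailbox address and a history ID, e.g.
// { "emailAddress": "user@example.com", "historyId": "1234567" }
const decoded = JSON.parse(Buffer.from(encoded, 'base64').toString('utf8'));
return [{ json: decoded }];
```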
by VKAPS IT
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

🎯 How it works
This workflow captures new lead information from a web form, enriches it with Apollo.io data, qualifies the lead using AI, and, if the lead is strong, automatically sends a personalized outreach email via Gmail and logs the result in Google Sheets.

🛠️ Key Features
📩 Lead form capture with validation
🔍 Enrichment via the Apollo API (see the sketch after this section)
🤖 Lead scoring using AI (LangChain + Groq)
📧 Dynamic email generation & sending via Gmail
📊 Logging leads with job title & organization into Google Sheets
✅ Conditional email sending (score ≥ 6 only)

🧪 Set up steps
Estimated time: 15-20 minutes
1. Add your Apollo API key to the HTTP header credential (never hardcode it!)
2. Connect your Gmail account for sending emails
3. Connect your Google Sheets account and set the correct spreadsheet & sheet name
4. Enable LangChain/Groq credentials for lead scoring and AI-generated emails
5. Update the form endpoint to your live webhook if needed

📌 Sticky Notes
Add the following mandatory sticky notes inside your workflow:
FormTrigger Node: "Collects lead info via form. Ensure your form is connected to this endpoint."
HTTP Request Node: "Enriches the lead using the Apollo.io API. Add your API key via header-based authentication."
AI Agent (Lead Score): "Scores the lead from 1-10 based on job title and industry match. Only leads with a score ≥ 6 proceed."
AI Agent (Email Composer): "Generates a concise, polite email using the lead's job title & company. Modify the tone if needed."
Google Sheets Append: "Logs the enriched lead with job title, organization, and LinkedIn URL. Customize the sheet structure if needed."
Gmail Node: "Sends a personalized outreach email if the lead passes the score threshold. Uses AI-generated content."

💸 Free or Paid?
Free - no paid API services are required (Apollo has a free tier).
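For orientation, the enrichment step's HTTP call might look like the sketch below. The endpoint and response fields follow Apollo's documented people-match API, but verify them against the current docs; the API key and the email field are placeholders:

```javascript
// Minimal sketch of the Apollo.io enrichment call made by the HTTP Request node.
// Verify the endpoint and field names against Apollo's current documentation;
// APOLLO_API_KEY is a placeholder and should come from n8n header-auth credentials.
const response = await fetch('https://api.apollo.io/v1/people/match', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-Api-Key': 'APOLLO_API_KEY', // never hardcode this in a real workflow
  },
  body: JSON.stringify({
    email: $json.email, // lead email captured by the form trigger
  }),
});
const { person } = await response.json();
// Fields like person.title, person.organization?.name, and person.linkedin_url
// feed the AI scoring prompt and the Google Sheets log.
```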
by Calistus Christian
What this workflow does
Automatically triages risky AWS misconfigurations and alerts your team.

Pipeline: Security Hub or AWS Config -> EventBridge rules -> SNS (HTTP) -> n8n Webhook -> Normalize -> AI Prioritizer -> Airtable (log) -> Gmail (email)

Normalizes incoming findings (S3 / Security Groups / IAM / RDS) into consistent JSON.
Uses an LLM to assign a priority (P0-P3) with a rationale and remediation steps.
Upserts the finding into Airtable (avoids duplicates).
Emails a compact incident summary to your inbox. This can be swapped for Microsoft Teams, Slack, etc.

Category: Security / Cloud / Alerting
Time to set up: ~10-15 minutes
Difficulty: Beginner-Intermediate
Cost: Mostly free (n8n CE + AWS SNS/EventBridge; OpenAI + Airtable/Gmail as used)

What you'll need
An n8n instance reachable over HTTP.
An AWS account (one region) with permissions to create SNS topics and EventBridge rules.
**Security Hub** enabled (or AWS Config rules that emit compliance events).
n8n credentials: OpenAI, Airtable, Gmail.

Nodes used
**Webhook** (POST /aws-misconfig)
**Code:** SNS Handler (token check, confirm/unwrap)
**IF:** route mode === "confirm" vs notification
**HTTP Request:** SNS SubscriptionConfirmation (GET)
**Code:** Normalize Finding
**Message a model:** AI Prioritizer (JSON out)
**Airtable:** Create/Upsert
**Gmail:** Send message
**Edit Fields:** final JSON response

Setup steps
1. Import and activate the workflow in n8n.
2. Webhook Respond: When Last Node Finishes -> First Entry JSON. Append a shared secret to the URL, e.g. ?token=MY_SUPER_TOKEN, and keep the check in the SNS Handler code node (see the sketch below).
3. Create an SNS topic (e.g., misconfig-events) in the same region as your EventBridge rules.
4. Create EventBridge rules targeting the SNS topic:
Rule A (Security Hub): source = aws.securityhub, detail-type = Security Hub Findings - Imported
Rule B (AWS Config): source = aws.config, detail-type = Config Rules Compliance Change
5. Create an SNS subscription with Protocol = HTTP and Endpoint = your production webhook URL: http://YOUR_HOST:5678/webhook/aws-misconfig?token=MY_SUPER_TOKEN (the workflow auto-confirms the subscription on the first POST.)
6. Configure Airtable (upsert on Finding ID) and the Gmail recipients.
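A minimal sketch of what the SNS Handler code node does (token check, then confirm/unwrap), using the placeholder token from the setup steps; the exact field names in your workflow may differ:

```javascript
// Minimal sketch of the "SNS Handler" Code node: verify the shared token,
// then either surface the SubscriptionConfirmation URL or unwrap the finding.
const query = $json.query ?? {};
if (query.token !== 'MY_SUPER_TOKEN') { // must match the ?token= value in the webhook URL
  throw new Error('Invalid token');
}

// SNS delivers its payload as a JSON string in the request body.
const sns = typeof $json.body === 'string' ? JSON.parse($json.body) : $json.body;

if (sns.Type === 'SubscriptionConfirmation') {
  // The IF node routes this to the HTTP Request node that GETs SubscribeURL.
  return [{ json: { mode: 'confirm', subscribeUrl: sns.SubscribeURL } }];
}

// Notification: the Message field holds the EventBridge event as a JSON string.
return [{ json: { mode: 'notification', event: JSON.parse(sns.Message) } }];
```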
by David Olusola
Shopify Order to Slack Notification

E-commerce Automation | Team Communication

This n8n template instantly notifies your team in Slack whenever a new order is placed on your Shopify store. It is perfect for small to medium businesses that want immediate awareness of sales activity and faster order processing.

How it works
1. Shopify sends a webhook to n8n when a new order is created
2. The order data is extracted and formatted into a professional message
3. A rich Slack notification is posted to the designated channel with customer details, order number, total amount, and a direct admin link
4. The team gets instant visibility into new sales activity

Set up instructions
Set up Shopify credentials in n8n: API Key, Password, Shop Subdomain, and Shared Secret

Requirements
Shopify store with admin access
Slack workspace with channel permissions
n8n Shopify and Slack credentials configured

Customising this workflow
Add email notifications alongside the Slack alerts
Include customer shipping information in the notifications
Filter alerts by order value thresholds or product types
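As a sketch of the formatting step, a Code node could shape Shopify's order webhook payload into a Slack message like this; the field names follow Shopify's standard order payload, and SHOP_SUBDOMAIN is a placeholder:

```javascript
// Minimal sketch: shape a Shopify order webhook into a Slack message.
// Field names (order_number, total_price, currency, customer) follow Shopify's
// order webhook payload; SHOP_SUBDOMAIN is a placeholder for your store.
const order = $json.body ?? $json;
const customer = order.customer ?? {};

const text =
  `:tada: New order #${order.order_number}\n` +
  `Customer: ${customer.first_name ?? ''} ${customer.last_name ?? ''}\n` +
  `Total: ${order.total_price} ${order.currency}\n` +
  `Admin: https://SHOP_SUBDOMAIN.myshopify.com/admin/orders/${order.id}`;

return [{ json: { text } }]; // feed this into the Slack node's message field
```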
by ikbendion
Reddit Poster to Discord

This workflow checks Reddit every 15 minutes for new posts and sends selected posts to a Discord channel via webhook.

Flow Overview:
1. Schedule Trigger - Runs every 15 minutes.
2. Fetch Latest Posts - Retrieves up to 3 new posts from any subreddit.
3. Filter Posts - Skips moderator or announcement posts based on author ID.
4. Fetch Full Post Data - Gets full details for the remaining posts.
5. Extract Image URL - Parses each post to extract a direct image link.
6. Send to Discord - Sends the post title, image, and link to a Discord webhook.

Setup Notes:
Create a Reddit app and connect the credentials in n8n.
Add your subreddit name to both Reddit nodes.
Connect a Discord webhook for posting.
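For reference, the Discord delivery step boils down to a single webhook POST with an embed; a minimal sketch, where the webhook URL is a placeholder and title, permalink, and imageUrl are assumed to come from the upstream Reddit nodes:

```javascript
// Minimal sketch: post a Reddit item to a Discord webhook with an image embed.
// DISCORD_WEBHOOK_URL is a placeholder; title, permalink, and imageUrl are
// assumed fields produced by the upstream Reddit and extraction nodes.
const DISCORD_WEBHOOK_URL = 'https://discord.com/api/webhooks/ID/TOKEN';

await fetch(DISCORD_WEBHOOK_URL, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    embeds: [
      {
        title: $json.title,                          // post title
        url: `https://reddit.com${$json.permalink}`, // link back to the post
        image: { url: $json.imageUrl },              // direct image link
      },
    ],
  }),
});

return $input.all();
```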
by Adrian
📋 Description
This template creates an intelligent AI assistant for WhatsApp that can:
**Respond naturally** to messages using Google Gemini AI
**Remember previous conversations** for each user
**Access a knowledge base** for answering frequently asked questions
**Automatically save** all conversations for long-term memory

🛠️ Requirements
1. WAMM.pro account (free tier available)
**What is WAMM.pro?** A platform that enables WhatsApp automation using proprietary API technology.
**Free tier:** 50 messages/month
**PRO tier:** Unlimited messages + advanced features
**Link:** wamm.pro
2. Pinecone account (for AI memory) - for storing conversations and the knowledge base; a free tier is available
3. Google AI account (for Gemini) - for the conversational AI model
4. OpenAI account (for embeddings) - for generating memory vectors

🚀 Step-by-step Setup

Step 1: WAMM.pro Configuration
1. Create an account at wamm.pro
2. Account Manager → Add WhatsApp profile
3. Scan the QR code with your WhatsApp
4. Note down the Instance ID and Access Token

Step 2: Webhook Configuration
In WAMM.pro: Integrations → Webhooks → Messages Webhooks
Add a webhook with the n8n URL. Required configuration:
From others: ✅ Relevant + ✅ Without media + ✅ Exclude no text
To others: ✅ Relevant + ✅ Without media + ✅ Exclude no text
To myself: ✅ None (to avoid responding to your own messages)

Step 3: Pinecone Configuration
Create 2 indexes:
historywa - for conversation memory
knowledge - for the knowledge base
Index settings:
Dimensions: 3072
Metric: cosine
Embedding model: text-embedding-3-large

Step 4: n8n Configuration
Configure credentials:
WAMM: Instance ID + Access Token
Pinecone: API Key
Google Gemini: API Key
OpenAI: API Key for embeddings

🔧 How it Works

Workflow Flow:
📱 WhatsApp Message
↓ (webhook)
🎯 AI Agent (Gemini)
↓ (uses tools)
📚 Memory Tool + Knowledge Tool
↓ (response generated)
📤 WAMM Send Message
↓ (saves)
💾 Pinecone Memory Storage

Available AI Tools:
Memory Tool - searches previous conversations with the user
Knowledge Tool - searches the general knowledge base

Special Features:
**Natural conversations** - the AI doesn't mention "searching history"
**Persistent context** - remembers names, preferences, and previous conversations
**User language detection** - automatically responds in the user's language
**Organized memory** - each user has their own memory space

📊 Benefits
✅ Zero maintenance - runs automatically
✅ Scalable - supports multiple users simultaneously
✅ Intelligent memory - uses similarity search for relevant context
✅ Extensible - easy to add new features
✅ Cost-effective - free tiers are available for all services

🎯 Use Cases
**Automated customer support** with memory
**Personal assistant** for WhatsApp
**Business chatbot** with specific knowledge
**Conversation automation** with persistent context

🔒 Security
**Data** is stored in Pinecone as vector embeddings
**No plain-text** message storage
**Each user** has a separate memory space
**API keys** are secured in n8n credentials

📈 Possible Extensions
**CRM** integrations
**Scheduling** and reminders
Advanced **multi-language** support
**Analytics** and conversation reports
Custom knowledge bases per user

💡 Tip: For optimal results, populate the knowledge base with frequently asked questions specific to your business!
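To illustrate the per-user memory space, a small Code node between the webhook and the AI Agent could derive a session key and Pinecone namespace from the sender's number; the payload field names here (body.from, body.message) are hypothetical and must be adapted to whatever your WAMM.pro webhook actually sends:

```javascript
// Minimal sketch: derive a per-user session key from the incoming webhook
// payload so each WhatsApp contact gets its own memory space in Pinecone.
// body.from and body.message are HYPOTHETICAL field names; adjust them to
// the actual WAMM.pro webhook payload.
const body = $json.body ?? $json;
const phone = body.from;   // sender's WhatsApp number (assumed field)
const text = body.message; // message text (assumed field)

return [
  {
    json: {
      sessionKey: `whatsapp-${phone}`, // used as the AI Agent's session ID
      namespace: `user-${phone}`,      // per-user namespace in the historywa index
      text,
    },
  },
];
```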
by Oneclick AI Squad
This automated n8n workflow monitors ingredient price changes from external APIs or manual sources, analyzes historical trends, and provides smart buying recommendations. The system tracks price fluctuations in a PostgreSQL database, generates actionable insights, and sends alerts via email and Slack to help restaurants optimize their purchasing decisions.

What is Price Trend Analysis?
Price trend analysis uses historical price data to identify patterns and predict optimal buying opportunities. The system analyzes price movements over time and recommends when to buy ingredients based on current trends and historical patterns.

Good to Know
Price data accuracy depends on the reliability of external API sources.
Historical data improves recommendation accuracy over time (a minimum of 30 days is recommended).
The PostgreSQL database provides robust data storage and supports complex trend analysis.
Real-time alerts help capture optimal buying opportunities.
The dashboard provides visual insights into price trends and recommendations.

How It Works
1. Daily Price Check - Triggers the workflow daily to monitor price changes
2. Fetch API Prices - Retrieves the latest prices from an external ingredient pricing API
3. Setup Database - Ensures database tables are ready before inserting new data
4. Store Price Data - Saves current prices to the PostgreSQL database for tracking
5. Calculate Trends - Analyzes historical prices to detect patterns and price movements
6. Generate Recommendations - Suggests actions based on price trends (buy/wait/stock up)
7. Store Recommendations - Saves recommendations for future reporting
8. Get Dashboard Data - Gathers the data needed for dashboard generation
9. Generate Dashboard HTML - Builds an HTML dashboard to visualize insights
10. Send Email Report - Emails the dashboard report to stakeholders
11. Send Slack Alert - Sends key alerts or recommendations to Slack channels

Database Structure
The workflow uses PostgreSQL with two main tables.

price_history - historical price tracking:
id (Primary Key)
ingredient (VARCHAR 100) - Name of the ingredient
price (DECIMAL 10,2) - Current price value
unit (VARCHAR 50) - Unit of measurement (kg, lbs, etc.)
supplier (VARCHAR 100) - Source supplier name
timestamp (TIMESTAMP) - When the price was recorded
created_at (TIMESTAMP) - Record creation time

buying_recommendations - generated buying suggestions:
id (Primary Key)
ingredient (VARCHAR 100) - Ingredient name
current_price (DECIMAL 10,2) - Latest price
price_change_percent (DECIMAL 5,2) - Percentage change from the previous price
trend (VARCHAR 20) - Price trend direction (INCREASING/DECREASING/STABLE)
recommendation (VARCHAR 50) - Buying action (BUY_NOW/WAIT/STOCK_UP)
urgency (VARCHAR 20) - Urgency level (HIGH/MEDIUM/LOW)
reason (TEXT) - Explanation for the recommendation
generated_at (TIMESTAMP) - When the recommendation was created

Price Trend Analysis
The system analyzes historical price data over the last 30 days to calculate percentage changes, classify trends (INCREASING/DECREASING/STABLE), and generate actionable buying recommendations based on price patterns and movement history.
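As an illustration of the Calculate Trends step, a Code node could compute the 30-day percentage change and classify the trend like this; the input field names and the ±5% thresholds are illustrative assumptions, not the template's exact values:

```javascript
// Minimal sketch of the "Calculate Trends" logic: compute the percentage
// change over the analysis window and classify the trend. The input fields
// and the ±5% thresholds are illustrative assumptions.
return $input.all().map((item) => {
  const { ingredient, current_price, price_30_days_ago } = item.json;
  const changePercent =
    ((current_price - price_30_days_ago) / price_30_days_ago) * 100;

  let trend = 'STABLE';
  if (changePercent > 5) trend = 'INCREASING';
  else if (changePercent < -5) trend = 'DECREASING';

  // Illustrative rule: buy now before rising prices climb further, stock up
  // while prices are falling, otherwise wait.
  const recommendation =
    trend === 'INCREASING' ? 'BUY_NOW' : trend === 'DECREASING' ? 'STOCK_UP' : 'WAIT';

  return {
    json: {
      ingredient,
      current_price,
      price_change_percent: Number(changePercent.toFixed(2)),
      trend,
      recommendation,
    },
  };
});
```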
How to Use
1. Import the workflow into n8n
2. Configure PostgreSQL database connection credentials
3. Set up external ingredient pricing API access
4. Configure email credentials for dashboard reports
5. Set up Slack webhook or bot credentials for alerts
6. Run the Setup Database node to create the required tables and indexes
7. Test with sample ingredient data to verify price tracking and recommendations
8. Adjust trend analysis parameters to match your purchasing patterns
9. Monitor recommendations and refine thresholds based on actual buying decisions

Requirements
PostgreSQL database access
External ingredient pricing API credentials
Email service credentials (Gmail, SMTP, etc.)
Slack webhook URL or bot credentials
Historical price data for initial trend analysis

Customizing This Workflow
Modify the Calculate Trends node to adjust the analysis period (currently 30 days) or add seasonal adjustments. Customize the recommendation logic to match your restaurant's buying patterns, budget constraints, or supplier agreements. Add additional data sources, such as weather forecasts or market reports, for more sophisticated predictions.