by achiya
Find and share AliExpress affiliate products through Telegram

Build a Telegram bot that helps users find AliExpress products using natural language requests. The bot uses OpenAI to optimize search queries, Decodo to scrape product listings, and AI analysis to select the best options based on ratings, reviews, and price, then automatically generates affiliate tracking links for each recommendation.

What it does
When a user sends "Find me wireless keyboard":
1. The bot checks that the user is a member of your Telegram channel (optional)
2. Validates that the command starts with an accepted phrase
3. OpenAI generates an optimized English search query
4. Decodo scrapes products from AliExpress
5. AI analyzes the top 10 products and selects the best 2 based on reviews, ratings, and price
6. The AliExpress Affiliate API creates tracking links
7. The bot sends formatted recommendations with images, prices, ratings, and links

Who this is for
- Affiliate marketers monetizing Telegram channels
- E-commerce entrepreneurs automating recommendations
- Channel owners adding value while earning commissions
- Anyone building AliExpress affiliate systems

Setup requirements
Credentials needed:
- Telegram Bot API: create a bot via @BotFather, add the token to n8n, and make the bot an admin in your channel
- AliExpress Affiliate API: sign up for the affiliate program, get your App Key, App Secret, and Tracking ID, and add them to n8n
- OpenAI API: get an API key and add it to n8n (used for search and analysis)

Configuration required
Before activation:
- Channel username: replace @YOUR_CHANNEL in 2 nodes (Check Channel Membership, Verify Channel Member)
- Tracking ID: set YOUR_AFFILIATE_TRACKING_ID in Generate Affiliate Links and Create Affiliate Link
- Channel URL: update the button in Request Channel Join
- Bot admin: make the bot an admin in your channel

How to use
Users send messages starting with:
- Find me [product]
- Search for [product]
- Look for [product]
- Get me [product]
- Send me [product]
- Show me [product]

Examples: "Find me wireless mouse", "Search for phone case", "Look for bluetooth speaker"

Bot responses:
- Non-member: asks the user to join the channel
- Invalid format: shows usage examples
- Valid request: sends a "searching..." status, processes with AI, and returns 2 recommendations. Each includes image, title, price, rating, order count, and link, plus a "More Results" button.

Customization options
- Product count: edit the "Select Top 2 Products" node
- Selection criteria: modify the AI prompts in "AI Product Search"
- Commands: add/remove phrases in "Validate Command Format"
- Channel gate: delete the verification nodes to remove it
- Language: translate the Telegram message nodes
- AI model: switch to GPT-3.5-turbo for lower costs

Technical details
Workflow components:
- Entry: Telegram webhook
- Verification: channel membership
- Validation: command format
- Processing: AI query → Decodo scrape → AI analysis
- Output: affiliate links → message format → send

APIs used:
- Telegram Bot API: user interaction
- OpenAI API: search optimization, product analysis
- Decodo: AliExpress scraping
- AliExpress Affiliate API: link generation

Error handling:
- Invalid commands → usage guide
- Non-members → join request
- No results → error message
- Spam → auto-removal

Best practices
- Cost management: OpenAI costs roughly $0.01-0.05 per search; cache popular searches and use GPT-3.5 for lower costs
- Security: store credentials in n8n, rotate API keys regularly, and monitor activity
- Performance: use webhook mode, set up error notifications, and implement rate limiting

Troubleshooting
- Bot not responding: verify the workflow is activated, check that credentials are valid, review the error logs
- Channel verification fails: confirm the bot is an admin, check the @username is correct, ensure the user has joined
- No products found: validate credentials, check the tracking ID, try different search terms
- Links broken: confirm the affiliate account is active, verify the tracking ID, check permissions

Version
Version: 1.0 | Updated: January 2026 | Compatible: n8n v1.0+ | Setup: 10-15 minutes
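The command-validation step above can be sketched as a small n8n Code-node snippet. The accepted prefixes come from the command list in this template, but the function name and return shape are illustrative, not the template's actual node code:

```javascript
// Hypothetical sketch of the "Validate Command Format" logic.
// Accepted prefixes match the command list in this template.
const ACCEPTED_PREFIXES = [
  "find me", "search for", "look for", "get me", "send me", "show me",
];

function parseCommand(text) {
  const normalized = text.trim().toLowerCase();
  for (const prefix of ACCEPTED_PREFIXES) {
    if (normalized.startsWith(prefix + " ")) {
      // Strip the prefix and return the remaining product query.
      return { valid: true, query: normalized.slice(prefix.length).trim() };
    }
  }
  return { valid: false, query: null };
}
```

A message like "Find me wireless keyboard" would pass with the query "wireless keyboard", while "hello there" would be routed to the usage-guide response.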
by Hybroht
Source Discovery - Automatically Search More Up-to-Date Information Sources

🎬 Overview
Version: 1.0
This workflow utilizes various nodes to discover and analyze potential sources of information from platforms like Google, Reddit, GitHub, Bluesky, and others. It is designed to streamline the process of finding relevant sources based on specified search themes.

✨ Features
- Automated source discovery from multiple platforms.
- Filtering of existing and undesired sources.
- Error handling for API requests.
- User-friendly configuration options.

👤 Who is this for?
This workflow is ideal for researchers, content marketers, journalists, and anyone looking to efficiently gather and analyze information from various online sources.

💡 What problem does this solve?
This workflow addresses the challenge of manually searching for relevant information sources, saving time and effort while ensuring that users have access to the most pertinent content. Ideal use-cases include:
- Resource compilation for academic and educational purposes
- Journalism and research
- Content marketing
- Competitor analysis

🔍 What this workflow does
The workflow gathers data from selected platforms through search terms. It filters out known and undesired sources, analyzes the content, and provides insights into potential sources relevant to the user's needs.

🔄 Workflow Steps
1. Search Queries: fetches sources using SerpAPI search, DuckDuckGo, and Bluesky; utilizes GitHub repositories to find relevant links; leverages RSS feeds from subreddits to identify potential sources.
2. Filtering Step: removes existing and undesired sources from the results.
3. Source Selection: analyzes the content of the identified sources for relevance.

📌 Expected Input / Configuration
The workflow is primarily configured via the Configure Workflow Args (Manual) node or the Global Variables custom node.
- Search themes: keywords or phrases relevant to the desired content.
- Lists of known sources and undesired sources for filtering.
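As a rough illustration of the filtering step, a Code node could drop known, undesired, and duplicate domains like this. The list contents and the result-item shape are placeholders, not the template's actual Global Variables configuration:

```javascript
// Hypothetical sketch of the filtering step: remove known, undesired,
// and already-seen domains from the discovered results.
const knownSources = ["example-known.com"];
const undesiredSources = ["spam-site.net"];

function filterSources(results) {
  const blocked = new Set([...knownSources, ...undesiredSources]);
  const seen = new Set();
  return results.filter(({ url }) => {
    const domain = new URL(url).hostname.replace(/^www\./, "");
    if (blocked.has(domain) || seen.has(domain)) return false;
    seen.add(domain); // deduplicate by domain within this run
    return true;
  });
}
```

As the notes below mention, duplicate removal of this kind is best-effort: overlaps can still appear when different platforms report the same source under different URLs.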
📦 Expected Output
A curated list of potential sources relevant to the specified search themes, along with insights into their content.

⚙️ n8n Setup Used
- n8n version: 1.105.3
- n8n-nodes-serpapi: 0.1.6
- n8n-nodes-globals: 1.1.0
- n8n-nodes-bluesky-enhanced: 1.6.0
- n8n-nodes-duckduckgo-search: 30.0.4
- LLM Model: mistral-small-latest (API)
- Platform: Podman 4.3.1 on Linux
- Date: 2025-08-06

⚡ Requirements to Use / Setup
- Self-hosted or cloud n8n instance.
- Install the following custom nodes: n8n-nodes-serpapi, n8n-nodes-duckduckgo-search, n8n-nodes-bluesky-enhanced.
- Install the Global Variables node (n8n-nodes-globals) for enhanced configuration, or use an Edit Fields (Set) node instead.
- Provide valid credentials for your preferred LLM model, SerpAPI, and Bluesky. GitHub credentials are recommended.

⚠️ Notes, Assumptions & Warnings
- Ensure compliance with the terms of service of any platforms accessed or discovered in this workflow, particularly concerning data usage and attribution.
- Monitor API usage to avoid hitting rate limits.
- The workflow may encounter errors such as 403 responses; in such cases, it will continue by ignoring the affected substep.
- Duplicate removal is applied, but occasional overlaps might still appear depending on the sources.
- This workflow assumes familiarity with n8n, APIs, and search engines.
- Using AI agents (Mistral or substitute LLMs) requires access to their API services and keys.
- This is not a curator of news. It is designed to find websites that are relevant and useful to your searches. If you are looking for a relevant news selector, please check this workflow.

ℹ️ About Us
This workflow was developed by the Hybroht team. Our goal is to create tools that harness the possibilities of technology and more. We aim to continuously improve and expand functionalities based on community feedback and evolving use cases. For questions, reach out via contact@hybroht.com.
⚖️ Warranty & Legal Notice This free workflow is provided "as-is" without any warranties of any kind, either express or implied, including but not limited to the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. By using this workflow, you acknowledge that you do so at your own risk. We shall not be held responsible for any damages, losses, or liabilities arising from the use or inability to use this workflow, including but not limited to any direct, indirect, incidental, or consequential damages. It is your responsibility to ensure that your use of this workflow complies with all applicable laws and regulations.
by Cheng Siong Chin
How It Works
This workflow automates predictive maintenance for vehicle fleets by combining real-time telemetry analysis with historical pattern recognition to identify potential failures before they occur. Designed for fleet managers, maintenance supervisors, and transportation operations teams, it solves the critical challenge of preventing unexpected vehicle breakdowns while optimizing maintenance scheduling and resource allocation.

The system triggers on a schedule, fetches current vehicle telemetry data alongside historical maintenance records, and merges the datasets for comprehensive analysis. It then deploys specialized AI agents using Anthropic's Claude to detect anomalies and prioritize maintenance interventions. The workflow calculates urgency levels using machine learning models and business rules, formats findings into standardized maintenance records and urgent alerts, generates audit logs for compliance tracking, and routes notifications to the appropriate maintenance teams based on severity.

Setup Steps
1. Configure the Schedule Trigger with the desired monitoring frequency for fleet checks.
2. Set up API credentials for the Fetch Real-Time Vehicle Telemetry node with your fleet management system.
3. Configure the Fetch Historical Vehicle Data node with maintenance database API access.
4. Connect Anthropic API credentials for both the Anomaly Detection and Maintenance Prioritization agents.
5. Update the Anomaly Detection Model with your fleet's baseline performance parameters.
6. Customize the UL Calculation Tool and the Maintenance Prioritization Output Parser.

Prerequisites
Active Anthropic API account, fleet telemetry system with API access, historical maintenance database.

Use Cases
Commercial fleet preventive maintenance, vehicle health monitoring, breakdown prediction.

Customization
Modify anomaly detection thresholds for different vehicle types; adjust prioritization algorithms for your operational priorities.

Benefits
Reduces unexpected breakdowns by up to 80% and decreases maintenance costs through predictive scheduling.
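The urgency-level calculation could look roughly like the following Code-node sketch. The weights, thresholds, and field names here are invented for illustration; they are not the template's actual UL Calculation Tool logic, which you would tune to your fleet's baselines:

```javascript
// Illustrative urgency scoring: blend the AI anomaly score with
// time- and mileage-since-service signals, then bucket by threshold.
// All weights and cutoffs are assumptions for demonstration only.
function urgencyLevel({ anomalyScore, daysSinceService, mileageSinceService }) {
  const score =
    anomalyScore * 0.6 +
    Math.min(daysSinceService / 180, 1) * 0.2 +      // cap at ~6 months
    Math.min(mileageSinceService / 10000, 1) * 0.2;  // cap at ~10k units
  if (score >= 0.75) return "critical";
  if (score >= 0.5) return "high";
  if (score >= 0.25) return "medium";
  return "low";
}
```

The severity returned here is what a routing step could use to decide which maintenance team receives the alert.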
by Nik B.
Automatically fetches daily sales, shifts, and receipts from Loyverse, calculates gross profit, net operating profit, and other key metrics, saves them to a Google Sheet, and sends out a daily report via email.

Who's it for
This template is for any business owner, manager, or analyst using Loyverse POS who needs more advanced financial reporting. If you're a restaurant, bar, or retail owner who wants to automatically track daily net profit, compare sales to historical averages, and build a custom financial dashboard in Google Sheets, this workflow is for you.

How it works / What it does
This workflow runs automatically on a daily schedule. It fetches all sales data and receipts from your Loyverse account for the previous business day, defined by your custom shift times (even past midnight). A powerful Code node then processes all the data to calculate the metrics that Loyverse either doesn't provide at all, or only spreads across several separate reports instead of one consolidated place. Already set up are metrics like:
- Total Revenue, Gross Profit, and Net Operating Profit
- Cash handling differences (over/under)
- Average spend per receipt (ATV)
- 30-day rolling Net Operating Profit (NOP)
- Performance vs. your historical weekday average

Finally, it appends the single, calculated row of daily metrics to a Google Sheet and sends an easily customizable summary report to your email.

How to set up
This workflow includes detailed Sticky Notes to guide you through the setup process. Because every business has a unique POS configuration (different POS devices, categories, and payment types), you'll need to set up a few things manually before executing the workflow. I've tried to make this as easy as possible to follow, and the entire setup should only take about 15 minutes.

Preparations & credential setup
1. Subscribe to the "Integrations" add-on in Loyverse ($9/month) to gain API access.
2. Create an access token in Loyverse.
3. Create credentials: in your n8n instance, create credentials for Loyverse (use "Generic" > "Bearer Auth"), Google Sheets (OAuth2), and your email (SMTP or other).
4. Make a copy of the pre-configured Google Spreadsheet (link in the second Sticky Note inside the workflow).
5. Fill MASTER CONFIG: open the MASTER CONFIG node and follow the comments inside to add your Google Sheet ID, sheet names, business hours, timezone, and Loyverse IDs (for POS devices, payment types, and categories).

Configure Google Sheet nodes
1. Configure Read Historical Data: open this node and follow the instructions in the nearby Sticky Note to paste the expressions for your Document ID and Sheet Name.
2. Configure Save Product List: open this node and paste in the expressions for Document ID and Sheet Name. The column mapper will load; map your sheet columns (e.g., item_name) to the data on the left (e.g., {{ $json.item_name }}).
3. Configure Save Latest Sales Data: open this node, paste in the expressions for Document ID and Sheet Name, then save and run the workflow. After that, the column mapper will load. This is the most important step: map your sheet's column names (e.g., "Total Revenue") to the calculated metrics from the Calculate All Metrics node (e.g., {{ $json.totalGrossRevenue }}).
4. Activate the workflow. 🫡

Requirements
- Loyverse Integrations subscription
- Loyverse access token
- Credentials for Loyverse (Bearer Auth)
- Credentials for Google Sheets (OAuth2)
- Credentials for Email/SMTP sender

How to customize the workflow
This template is designed to be highly flexible.
- Central configuration: almost all customization (POS devices, categories, payment types, sheet names) is done in the MASTER CONFIG node. You don't need to dig through other nodes.
- Add/remove metrics: the Calculate All Metrics node has additional metrics already set up; just add the relevant columns to the SalesData sheet, or even add your own calculations to the node. Any new metric you add (e.g., metrics.myNewMetric = 123) will be available to map in the Save Latest Sales Data node.
- Email body: you can easily edit the Send Email node to change the text or add new metrics from the Calculate All Metrics node.
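The "previous business day, defined by your custom shift times (even past midnight)" idea can be sketched as follows. The function name and the `closeHour` parameter are illustrative, not the template's actual MASTER CONFIG keys:

```javascript
// Hypothetical sketch: compute the start/end of the most recently
// completed business day when the day "closes" after midnight.
// closeHour is the local hour the business day ends (e.g. 3 = 3 AM).
function businessDayWindow(now, closeHour) {
  const end = new Date(now);
  end.setHours(closeHour, 0, 0, 0);
  if (now.getHours() < closeHour) {
    // Still inside yesterday's business day, so the last *completed*
    // day ended a day earlier.
    end.setDate(end.getDate() - 1);
  }
  const start = new Date(end);
  start.setDate(start.getDate() - 1);
  return { start, end }; // 24-hour window [start, end)
}
```

With `closeHour = 3`, a run at noon on Jan 10 reports the window Jan 9 03:00 through Jan 10 03:00, so receipts from just after midnight still land in the correct business day.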
by WeblineIndia
Zoho CRM → AI Sentiment Analysis for Customer Interactions & Automatic Alerts

This workflow analyzes newly created Notes (in any module) in Zoho CRM, detects customer sentiment using an AI model, updates the related CRM record with custom fields (sentiment label and score), and sends an instant alert whenever negative sentiment is detected. It runs on a scheduled interval and gives teams real-time visibility into customer emotions and potential risks.

Quick Implementation Steps
1. Connect Zoho CRM OAuth2 credentials
2. Add custom fields in Zoho CRM: Sentiment_Label and Sentiment_Score
3. Add AI provider credentials
4. Set the Gmail alert recipient
5. Activate the workflow and test by adding a Note

What It Does
This workflow automatically monitors Zoho CRM Notes. When a new Note is detected, the text is extracted and analyzed through an AI-powered sentiment model. The AI classifies the text as Positive, Neutral, or Negative and produces a numeric sentiment score. The workflow updates the related CRM module with these values. If the sentiment is negative, a Gmail alert is triggered so your team can follow up quickly. This automation helps organizations maintain high customer satisfaction and detect potential issues early.

Who's It For
- Support teams
- Sales teams
- CRM administrators
- Customer success managers
- Businesses needing automated customer sentiment tracking

Requirements
- n8n instance
- Zoho CRM OAuth2 credentials
- Gmail OAuth2 credentials
- AI provider key
- Custom fields in Zoho CRM: Sentiment_Label & Sentiment_Score (if you use different field names, adjust the workflow accordingly)

How It Works & Setup
Step 1: Schedule Trigger
Runs periodically to check for new or updated Notes.
Step 2: Fetch Latest Note
Retrieves the most recently modified Note.
Step 3: Extract Details
Extracts the Note text, note_id, parent_id, and module name.
Step 4: AI Sentiment Analysis
Sends the text to the AI (via a LangChain chain) for sentiment classification.
Step 5: Conditional Branching
If negative: sends a Gmail alert and updates the CRM. Otherwise: just updates the CRM.
Step 6: Update CRM
Writes the sentiment data back into the related parent record.

How to Customize Nodes
- Adjust sentiment output by modifying the AI prompt.
- Change field mappings in the Zoho update nodes.
- Customize the Gmail alert message.
- Adjust the Schedule Trigger frequency.
- Add additional metadata (e.g., emotion tags).

Add-Ons
- Slack/Teams alerts for negative sentiment.
- Historical sentiment logging.
- Weekly sentiment reports.
- Auto-task creation for negative interactions.
- Priority-based escalation logic.

Use Case Examples
- Detect unhappy customers in support interactions.
- Monitor sentiment across sales conversations.
- Escalate negative feedback automatically.
- Quality assurance tracking for customer interactions.
- Early detection of churn indicators.

Troubleshooting Guide
| Issue | Possible Cause | Solution |
|------|----------------|----------|
| Sentiment not updating | Missing Zoho fields | Add custom fields in CRM |
| Note not detected | Fetching only latest note | Increase frequency or widen fetch scope |
| AI output invalid | Prompt mismatch | Update prompt and parser |
| Alerts not sending | Gmail OAuth expired | Reconnect Gmail |
| Incorrect sentiment | Weak prompt instructions | Refine prompt wording |

Need Help?
WeblineIndia can help you configure, customize, and extend workflows like this. We specialize in:
- n8n automation
- CRM integrations
- AI/LLM-powered workflows
- Zoho CRM customization

Reach out if you'd like assistance building or enhancing similar n8n automation solutions.
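The "AI output invalid" row in the troubleshooting table usually comes down to parsing. A defensive parser between the AI step and the Zoho update could look like this; the expected JSON shape ({"label": ..., "score": ...}) is an assumption about the prompt's output format, not the template's exact parser:

```javascript
// Hypothetical sketch: validate the AI's sentiment reply before writing
// it to the Sentiment_Label / Sentiment_Score custom fields.
function parseSentiment(raw) {
  let data;
  try {
    data = JSON.parse(raw);
  } catch {
    // Malformed output: fall back to a safe neutral value instead of failing.
    return { label: "Neutral", score: 0 };
  }
  const allowed = ["Positive", "Neutral", "Negative"];
  const label = allowed.includes(data.label) ? data.label : "Neutral";
  // Clamp the numeric score to a sane [-1, 1] range.
  const score = Math.max(-1, Math.min(1, Number(data.score) || 0));
  return { label, score };
}
```

The conditional branch then only needs to check `label === "Negative"` to decide whether to send the Gmail alert.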
by Rahul Joshi
📊 Description
Ensure your GitHub repositories stay configuration-accurate and documentation-compliant with this intelligent AI-powered validation workflow. 🤖 This automation monitors repository updates, compares configuration files against documentation references, detects inconsistencies, and alerts your team instantly, streamlining DevOps and compliance reviews. ⚡

What This Template Does
Step 1: Triggers automatically on GitHub push or pull_request events. 🔄
Step 2: Fetches both configuration files (config/app-config.json and faq-config.json) from the repository. 📂
Step 3: Uses GPT-4o-mini to compare configurations and detect mismatches, missing keys, or deprecated fields. 🧠
Step 4: Categorizes issues by severity (critical, high, medium, or low) and generates actionable recommendations. 🚨
Step 5: Logs all discrepancies to Google Sheets for tracking and audit purposes. 📑
Step 6: Sends Slack alerts summarizing key issues and linking to the full report. 💬

Key Benefits
✅ Prevents production incidents due to config drift
✅ Ensures documentation stays in sync with code changes
✅ Reduces manual review effort with AI-driven validation
✅ Improves team response with Slack-based alerts
✅ Maintains audit logs for compliance and traceability

Features
- Real-time GitHub webhook integration
- AI-powered config comparison using GPT-4o-mini
- Severity-based issue classification
- Automated Google Sheets logging
- Slack alerts with detailed issue context
- Error handling for malformed JSON or parsing issues

Requirements
- GitHub OAuth2 credentials with repo and webhook permissions
- OpenAI API key (GPT-4o-mini or compatible model)
- Google Sheets OAuth2 credentials
- Slack API token with chat:write permissions

Target Audience
- DevOps teams ensuring consistent configuration across environments
- Engineering leads maintaining documentation accuracy
- QA and Compliance teams tracking configuration changes and risks

Setup Instructions
1. Create GitHub OAuth2 credentials and enable webhook access.
2. Connect your OpenAI API key under credentials.
3. Add your Google Sheets and Slack integrations.
4. Update the file paths (config/app-config.json and faq-config.json) if your repo uses different names.
5. Activate the workflow; it will start validating on every push or PR. 🚀
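To make the comparison in Step 3 concrete, the structural core of "detect missing keys" can be sketched as a plain key diff. The actual classification and severity ranking is done by the GPT-4o-mini node; this only illustrates the underlying idea, and the function name is an invention:

```javascript
// Hypothetical sketch: find top-level keys present in one config file
// but missing from the other, as a minimal form of drift detection.
function diffKeys(appConfig, faqConfig) {
  const a = new Set(Object.keys(appConfig));
  const b = new Set(Object.keys(faqConfig));
  return {
    missingInFaq: [...a].filter((k) => !b.has(k)),
    missingInApp: [...b].filter((k) => !a.has(k)),
  };
}
```

A real comparison would recurse into nested objects and also flag value mismatches and deprecated fields, which is where the AI step adds judgment beyond a mechanical diff.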
by Avkash Kakdiya
How it works
This workflow runs daily to review all active deals and evaluate their likelihood of closing successfully. It enriches deal data with recent engagement activity and applies AI-based behavioral scoring to predict conversion probability. High-risk or stalled deals are flagged automatically. Actionable alerts are sent to the sales team, and all analysis is logged for forecasting and tracking.

Step-by-step

**Trigger and fetch deals**
- Schedule Trigger: runs the workflow automatically at a fixed time each day.
- Get Active Deals from HubSpot: retrieves all open, non-closed deals with key properties.
- Formatting Data: normalizes deal fields such as value, stage, age, contacts, and activity dates.

**Enrich deals with engagement data**
- If: filters only active deals for further processing.
- Loop Over Items: processes each deal individually.
- HTTP Request: fetches engagement associations for the current deal.
- Get an Engagement: retrieves detailed engagement records from HubSpot.
- Extracts Data: structures engagement content, timestamps, and metadata for analysis.

**Analyze risk, alert, and store results**
- OpenAI Chat Model: provides the language model used for analysis.
- AI Agent: evaluates behavioral signals, predicts conversion probability, and recommends actions.
- Format Data: parses AI output into structured, machine-readable fields.
- Filter Alerts Needed: identifies deals that need immediate attention.
- Send Slack Alert: sends detailed alerts for high-risk or stalled deals.
- Append or update row in sheet: logs analysis results into Google Sheets for reporting.

Why use this?
- Automatically identify high-risk deals before they stall or fail
- Give sales teams clear, data-driven next actions instead of raw CRM data
- Improve forecasting accuracy with AI-powered probability scoring
- Maintain a historical deal health log for audits and performance reviews
- Reduce manual pipeline reviews while increasing response speed
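The "Filter Alerts Needed" step can be pictured as a simple predicate over the AI's structured output. The field names and thresholds below are illustrative assumptions, not the template's actual criteria:

```javascript
// Hypothetical sketch: flag deals that are either at risk (low predicted
// conversion probability) or stalled (no recent engagement activity).
function needsAlert(deal) {
  const stalled = deal.daysSinceLastActivity > 14; // assumed staleness cutoff
  const atRisk = deal.conversionProbability < 0.3; // assumed risk cutoff
  return stalled || atRisk;
}

// Only deals passing the predicate flow on to the Slack alert node.
const alerts = [
  { id: "A", conversionProbability: 0.8, daysSinceLastActivity: 3 },
  { id: "B", conversionProbability: 0.2, daysSinceLastActivity: 5 },
].filter(needsAlert);
```

Keeping the thresholds in one place like this makes it easy to tune how aggressive the alerting is without touching the AI prompt.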
by Muhammad Ali
Description

How it works
This powerful workflow helps businesses and freelancers automatically manage invoices received on WhatsApp. It detects new messages, downloads attached invoices, extracts key data using OCR (Optical Character Recognition), summarizes the details with AI, updates Google Sheets for record-keeping, saves files to Google Drive, and instantly replies with a clean summary message, all without manual effort.

Perfect for small businesses, agencies, accountants, and freelancers who regularly receive invoices via WhatsApp. Say goodbye to manual data entry and hello to effortless automation.

Set up steps
Setup takes around 10-15 minutes:
1. Connect your WhatsApp Cloud API to trigger on incoming messages.
2. Add your OCR.Space API key to extract invoice text.
3. Link your Google Sheets and Google Drive accounts for data logging and storage.
4. Enter your OpenAI API key for AI-based summarization.
5. Import the template, test once, and you're ready to automate your invoice workflow.

Why use this workflow
- Save hours of manual data entry
- Keep all invoices safely stored and organized in Drive
- Get instant summaries directly in WhatsApp
- Improve efficiency for client billing and expense tracking
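To give a feel for the "extracts key data using OCR" step, here is a minimal sketch of pulling basic fields out of raw OCR text with regular expressions. The patterns are assumptions for illustration; real invoices vary widely, which is why the template layers AI summarization on top of the OCR output:

```javascript
// Hypothetical sketch: extract an invoice number and total from OCR text.
// Regex patterns are illustrative and will not cover every invoice layout.
function extractInvoiceFields(ocrText) {
  const amount = ocrText.match(/total[:\s]*\$?([\d,]+\.?\d*)/i);
  const invoiceNo = ocrText.match(/invoice\s*(?:no\.?|#)[:\s]*([\w-]+)/i);
  return {
    total: amount ? parseFloat(amount[1].replace(/,/g, "")) : null,
    invoiceNumber: invoiceNo ? invoiceNo[1] : null,
  };
}
```

Fields that the regexes miss come back as `null`, which a later step can route to the AI summary or to a manual-review branch.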
by Jay Emp0
Automatically turns trending Reddit posts into punchy, first-person tweets powered by Google Gemini AI, the Reddit and Twitter APIs, and Google Sheets logging.

🧩 Overview
This workflow repurposes Reddit content into original tweets every few hours. It's perfect for creators, marketers, or founders who want to automate content inspiration while keeping tweets sounding human, edgy, and fresh.

Core automation loop:
1. Fetch trending Reddit posts from selected subreddits.
2. Use Gemini AI to write a short, first-person tweet.
3. Check your Google Sheet to avoid reusing the same Reddit post.
4. Publish to Twitter automatically.
5. Log the tweet and Reddit reference in Google Sheets.

🪄 How It Works
1️⃣ Every 2 hours, the workflow triggers automatically.
2️⃣ It picks a subreddit (like r/automation, r/n8n, r/SaaS).
3️⃣ Gemini AI analyzes a rising Reddit post and writes a fresh, short tweet.
4️⃣ The system checks your Google Sheet to ensure it hasn't used that Reddit post before.
5️⃣ Once validated, the tweet is published via the Twitter API and logged.

📊 Logged Data (Google Sheets)
Each tweet is automatically logged for version control and duplication checks.

| Date | Subreddit | Post ID | Tweet Text |
|------|------------|----------|-------------|
| 08/10/2025 | n8n_ai_agents | 1o16ome | Just saw a wild n8n workflow on Reddit... |

⚙️ Key Components

| Node | Function |
|------|-----------|
| Schedule Trigger | Runs every 2 hours to generate a new tweet. |
| Code (Randomly Decide Subreddit) | Picks one subreddit randomly from your preset list. |
| Gemini Chat Model | Generates tweet text in a first-person tone using custom prompt rules. |
| Reddit Tool | Fetches top or rising posts from the chosen subreddit. |
| Google Sheets (read database) | Keeps a record of already-used Reddit posts. |
| Structured Output Parser | Ensures consistent tweet formatting (tweet text, subreddit, post ID). |
| Twitter Node | Publishes the AI-generated tweet. |
| Append Row in Sheet | Logs the tweet with date, subreddit, and post ID. |

🧩 Setup Tutorial

1️⃣ Prerequisites

| Tool | Purpose |
|------|----------|
| n8n Cloud or Self-Host | Workflow execution |
| Google Gemini API Key | For tweet generation |
| Reddit OAuth2 API | To fetch posts |
| Twitter (X) API OAuth2 | To publish tweets |
| Google Sheets API | For logging and duplication tracking |

2️⃣ Import the Workflow
1. Download Reddit Twitter Automation.json.
2. In n8n, click Import Workflow → From File.
3. Connect your credentials: Gemini, Reddit, Twitter (X), and Google Sheets.

3️⃣ Configure Google Sheet
Your sheet must include these columns:

| Column | Description |
|--------|--------------|
| PAST TWEETS | The tweet text |
| Date | Auto-generated date |
| subreddit | Reddit source |
| post_id | Reddit post reference |

4️⃣ Customize Subreddits
In the Code node, update this array to choose which subreddits to monitor:

```javascript
const subreddits = ["n8n", "microsaas", "SaaS", "automation", "n8n_ai_agents"];
```
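The duplication check described in the loop above (step 4) can be sketched in the same Code-node style. The `post_id` column matches the sheet schema documented in this template; the function name is illustrative:

```javascript
// Hypothetical sketch of the duplication check: skip a Reddit post if its
// ID already appears in rows previously read from the Google Sheet log.
function isAlreadyUsed(postId, sheetRows) {
  return sheetRows.some((row) => row.post_id === postId);
}
```

When the check returns true, the workflow would loop back and pick a different rising post rather than tweeting the same source twice.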
by Roshan Ramani
🛒 Smart Telegram Shopping Assistant with AI Product Recommendations

Workflow Overview
Target user roles: e-commerce business owners, affiliate marketers, customer support teams.
Problem solved: businesses need an automated way to help customers find products on Telegram without manual intervention, while providing intelligent recommendations that increase conversion rates.
Opportunity created: transform any Telegram channel into a smart shopping assistant that can handle both product queries and customer conversations automatically.

What This Workflow Does
This workflow creates an intelligent Telegram bot that:
- 🤖 **Automatically detects** whether users are asking about products or just chatting
- 🛒 **Scrapes Amazon** in real time to find the best matching products
- 🎯 **Uses AI to analyze and rank** products based on price, ratings, and user needs
- 📱 **Delivers perfectly formatted** recommendations optimized for Telegram
- 💬 **Handles casual conversations** professionally when users aren't shopping

Real-World Use Cases
- **E-commerce support**: reduce customer service workload by 70%
- **Affiliate marketing**: automatically recommend products with tracking links
- **Telegram communities**: add shopping capabilities to existing channels
- **Product discovery**: help customers find products they didn't know existed

Key Features & Benefits

🧠 Intelligent Intent Detection
- Uses Google Gemini AI to understand user messages
- Automatically routes to product search or conversation mode
- Handles multiple languages and casual typing styles

🛒 Real-Time Product Data
- Integrates with Apify's Amazon scraper for live data
- Fetches prices, ratings, reviews, and product details
- Processes up to 10 products per search instantly

🎯 AI-Powered Recommendations
- Analyzes multiple products simultaneously
- Ranks by relevance, value, and user satisfaction
- Provides the top 5 personalized recommendations with reasoning

📱 Telegram-Optimized Output
- Clean formatting with emojis and markdown
- Respects character limits for mobile viewing
- Includes direct purchase links for easy buying

Setup Requirements

Required credentials
- Telegram Bot Token: free from @BotFather
- Google Gemini API Key: free tier available at AI Studio
- Apify API Token: free tier includes 100 requests/month

Required n8n nodes
- @n8n/n8n-nodes-langchain (for AI functionality)
- Built-in Telegram, HTTP Request, and Code nodes

Quick Setup Guide

Step 1: Telegram bot creation
1. Message @BotFather on Telegram.
2. Create a new bot with the /newbot command.
3. Copy the bot token to your credentials.

Step 2: AI configuration
1. Sign up for Google AI Studio.
2. Generate an API key for Gemini.
3. Add credentials to all three AI model nodes.

Step 3: Product scraping setup
1. Register for a free Apify account.
2. Get the API token from the dashboard.
3. Add the token to the "Amazon Product Scraper" node.

Step 4: Activation
1. Import the workflow JSON.
2. Add your credentials.
3. Activate the Telegram Trigger.
4. Test with a product query!

Workflow Architecture
1. 📱 Message entry point: the Telegram Trigger receives all messages.
2. 🧹 Query preprocessing: cleans and normalizes user input for better search results.
3. 🤖 AI intent classification: determines if the message is product-related or conversational.
4. 🔀 Smart routing: directs to the appropriate workflow path based on intent.
5. 💬 Conversation path: handles greetings, questions, and general support.
6. 🛒 Product search path: scrapes Amazon → processes data → AI analysis → recommendations.
7. 📤 Optimized delivery: formats and sends responses back to Telegram.

Customization Opportunities

Easy modifications
- **Multiple marketplaces**: add eBay, Flipkart, or local stores
- **Product categories**: specialize for electronics, fashion, etc.
- **Language support**: translate for different markets
- **Branding**: customize responses with your brand voice

Advanced extensions
- **Price monitoring**: set up alerts for price drops
- **User preferences**: remember customer preferences
- **Analytics dashboard**: track popular products and queries
- **Affiliate integration**: add commission tracking links

Success Metrics & ROI

Performance benchmarks (author-reported)
- **Response time**: 3-5 seconds for product queries
- **Accuracy**: 90%+ relevant product matches
- **User satisfaction**: 85%+ positive feedback in testing

Business impact
- **Reduced support costs**: automate 70% of product inquiries
- **Increased conversions**: personalized recommendations boost sales
- **24/7 availability**: never miss a customer inquiry
- **Scalability**: handle unlimited concurrent users

Workflow Complexity
Intermediate level: requires API setup but includes detailed instructions. Perfect for users with basic n8n experience who want to create something powerful.
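The query-preprocessing step in the architecture above ("cleans and normalizes user input") can be sketched as a small normalization function. The exact rules here are illustrative assumptions, not the template's actual node code:

```javascript
// Hypothetical sketch of query preprocessing: normalize casual user input
// before intent classification and the Amazon scraper see it.
function preprocessQuery(text) {
  return text
    .trim()
    .toLowerCase()
    .replace(/[^\w\s]/g, " ") // strip punctuation and emojis
    .replace(/\s+/g, " ")     // collapse repeated whitespace
    .trim();
}
```

This keeps messy real-world input like "  Find me a Wireless   Mouse!!! " from degrading search relevance, while leaving the words themselves untouched.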
by Rahul Joshi
Automatically detect, classify, and document GitHub API errors using AI. This workflow connects GitHub, OpenAI (GPT-4o), Airtable, Notion, and Slack to build a real-time, searchable API error knowledge base — helping engineering and support teams respond faster, stay aligned, and maintain clean documentation. ⚙️📘💬

## 🚀 What This Template Does

1️⃣ Triggers on new or updated GitHub issues (API-related). 🪝
2️⃣ Extracts key fields (title, body, repo, and link). 📄
3️⃣ Classifies issues using OpenAI GPT-4o, identifying error type, category, root cause, and severity. 🤖
4️⃣ Validates and parses the AI output into structured JSON format. ✅
5️⃣ Creates or updates organized FAQ-style entries in Airtable for quick lookup. 🗂️
6️⃣ Logs detailed entries into Notion, maintaining an ongoing issue knowledge base. 📘
7️⃣ Notifies the right Slack team channel (DevOps, Backend, API, Support) with concise summaries. 💬
8️⃣ Tracks and prevents duplicates, keeping your error catalog clean and auditable. 🔄

## 💡 Key Benefits

✅ Converts unstructured GitHub issues into AI-analyzed documentation
✅ Centralizes API error intelligence across teams
✅ Reduces time-to-resolution for recurring issues
✅ Maintains synchronized records in Airtable and Notion
✅ Keeps DevOps and Support instantly informed through Slack alerts
✅ Fully automated, scalable, and low-cost using GPT-4o

## ⚙️ Features

- Real-time GitHub trigger for API or backend issues
- GPT-4o-based AI classification (error type, cause, severity, confidence)
- Smart duplicate-prevention logic
- Bi-directional sync to Airtable + Notion
- Slack alerts with contextual AI insights
- Modular design — easy to extend with Jira, Teams, or email integrations

## 🧰 Requirements

- GitHub OAuth2 credentials
- OpenAI API key (GPT-4o recommended)
- Airtable Base and Table IDs (with fields like Error Code, Category, Severity, Root Cause)
- Notion integration with database access
- Slack Bot token with the chat:write scope

## 👥 Target Audience

- Engineering and DevOps teams managing APIs
- Customer support and SRE teams maintaining FAQs
- Product managers tracking recurring API issues
- SaaS orgs automating documentation and error visibility

## 🪜 Step-by-Step Setup Instructions

1️⃣ Connect your GitHub account and enable the "issues" webhook event.
2️⃣ Add OpenAI credentials (GPT-4o model for classification).
3️⃣ Create an Airtable base with the fields Error Code, Category, Root Cause, Severity, and Confidence.
4️⃣ Configure your Notion database with a matching schema and access.
5️⃣ Set up Slack credentials and choose your alert channels.
6️⃣ Test with a sample GitHub issue to validate the AI classification.
7️⃣ Enable the workflow — enjoy continuous AI-powered issue documentation!
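The validate-and-parse step can be sketched as a plain function suitable for an n8n Code node. This is an illustrative sketch only: the field names (`errorType`, `category`, `rootCause`, `severity`) and the fence-stripping behavior are assumptions, not the template's exact schema.

```javascript
// Hypothetical sketch of the "validate & parse AI output" step.
// Field names are illustrative assumptions, not the template's exact schema.
function parseClassification(raw) {
  // Models often wrap JSON replies in a ```json fence; strip it before parsing.
  const cleaned = raw
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/\s*```$/, '')
    .trim();

  let parsed;
  try {
    parsed = JSON.parse(cleaned);
  } catch (err) {
    throw new Error(`AI output is not valid JSON: ${err.message}`);
  }

  // Fail loudly if the model omitted a required field.
  const required = ['errorType', 'category', 'rootCause', 'severity'];
  const missing = required.filter((key) => !(key in parsed));
  if (missing.length > 0) {
    throw new Error(`AI output missing fields: ${missing.join(', ')}`);
  }
  return parsed;
}
```

Throwing on malformed output lets the n8n error workflow (or a retry branch) handle it instead of writing bad rows to Airtable or Notion.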
by vinci-king-01
# Meeting Notes Distributor – Mailchimp and MongoDB

This workflow automatically converts raw meeting recordings or written notes into concise summaries, stores them in MongoDB for future reference, and distributes the summaries to all meeting participants through Mailchimp. It is ideal for teams that want to keep everyone aligned without manual copy-and-paste or email chains.

## Pre-conditions/Requirements

### Prerequisites

- n8n instance (self-hosted or cloud)
- Audio transcription service or written notes available via HTTP endpoint
- MongoDB database (cloud or self-hosted)
- Mailchimp account with an existing Audience list

### Required Credentials

- **MongoDB** – Connection string with insert permission
- **Mailchimp API Key** – To send campaigns
- **(Optional) HTTP Service Auth** – If your transcription/notes endpoint is secured

### Specific Setup Requirements

| Component | Example Value | Notes |
|------------------|--------------------------------------------|-----------------------------------------------------|
| MongoDB Database | meeting_notes | Database in which summaries will be stored |
| Collection Name | summaries | Collection automatically created if it doesn’t exist |
| Mailchimp List | Meeting Participants | Audience list containing participant email addresses |
| Notes Endpoint | https://example.com/api/meetings/{id} | Returns raw transcript or note text (JSON) |

## How it works

Key Steps:

- **Schedule Trigger**: Fires daily (or on-demand) to check for new meeting notes.
- **HTTP Request**: Downloads raw notes or transcript from your endpoint.
- **Code Node**: Uses an AI or custom function to generate a concise summary.
- **If Node**: Skips processing if the summary already exists in MongoDB.
- **MongoDB**: Inserts the new summary document.
- **Split in Batches**: Splits participants into Mailchimp-friendly batch sizes.
- **Mailchimp**: Sends personalized summary emails to each participant.
- **Wait**: Ensures rate limits are respected between Mailchimp calls.
- **Merge**: Consolidates success/failure results for logging or alerting.

## Set up steps

Setup Time: 15–25 minutes

1. **Clone the workflow**: Import or copy the JSON into your n8n instance.
2. **Configure Schedule Trigger**: Set the cron expression (e.g., every weekday at 18:00).
3. **Set HTTP Request URL**: Replace the placeholder with your transcription/notes endpoint. Add auth headers if needed.
4. **Add MongoDB Credentials**: Enter your connection string in the MongoDB node.
5. **Customize Summary Logic**: Open the Code node to tweak summarization length, language, or model.
6. **Mailchimp Credentials**: Supply your API key and select the correct Audience list.
7. **Map Email Fields**: Ensure participant emails are supplied from transcription metadata or an external source.
8. **Test Run**: Execute once manually to verify the MongoDB insert and email delivery.
9. **Activate Workflow**: Enable the workflow so it runs on its defined schedule.

## Node Descriptions

Core Workflow Nodes:

- **Schedule Trigger** – Initiates the workflow at predefined intervals.
- **HTTP Request** – Retrieves the latest meeting data (transcript or notes).
- **Code** – Generates a summarized version of the meeting content.
- **If** – Checks MongoDB for duplicates to avoid re-sending.
- **MongoDB** – Stores finalized summaries for archival and audit.
- **SplitInBatches** – Breaks the participant list into manageable chunks.
- **Mailchimp** – Sends summary emails via campaigns or transactional messages.
- **Wait** – Pauses between batches to honor Mailchimp rate limits.
- **Merge** – Aggregates success/failure responses for logging.
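The If node's duplicate check can be sketched as a plain helper: given the incoming items and the set of meeting IDs already stored in the `summaries` collection (fetched beforehand, e.g. via a MongoDB node), keep only meetings that still need a summary. The helper and its item shape are illustrative assumptions, not the template's exact expressions.

```javascript
// Hypothetical sketch of the duplicate filter behind the If node.
// `items` follows n8n's item shape ({ json: {...} }); `storedIds` is the list
// of meetingIds already present in MongoDB.
function filterNewMeetings(items, storedIds) {
  const seen = new Set(storedIds);
  return items.filter((item) => !seen.has(item.json.meetingId));
}
```

Using the meeting ID as the uniqueness key (rather than, say, the title) is what prevents re-sending when the same meeting's notes are fetched twice.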
Data Flow:

Schedule Trigger → HTTP Request → Code → If
If summary is new: MongoDB → SplitInBatches → Mailchimp → Wait
Merge collates all results

## Customization Examples

### 1. Change Summary Length

```javascript
// Inside the Code node. `summarize` is not built in; define it here (or call
// an external AI service). This naive version keeps the first N sentences.
const summarize = (text, maxSentences) =>
  text.split(/(?<=[.!?])\s+/).slice(0, maxSentences).join(' ');

const rawText = items[0].json.text;
const maxSentences = 5; // adjust to 3, 7, etc.
items[0].json.summary = summarize(rawText, maxSentences);
return items;
```

### 2. Personalize Mailchimp Subject

```javascript
// In the Set node before Mailchimp (note the backticks: a template literal)
items[0].json.subject = `Recap: ${items[0].json.meetingTitle} – ${new Date().toLocaleDateString()}`;
return items;
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "meetingId": "abc123",
  "meetingTitle": "Quarterly Planning",
  "summary": "Key decisions on roadmap, budget approvals...",
  "participants": ["alice@example.com", "bob@example.com"],
  "mongoInsertId": "65d9278fa01e3f94b1234567",
  "mailchimpBatchIds": ["2024-01-01T12:00:00Z#1", "2024-01-01T12:01:00Z#2"]
}
```

## Troubleshooting

Common Issues:

- **Mailchimp rate-limit errors** – Increase the Wait node delay or reduce the batch size.
- **Duplicate summaries** – Ensure the If node correctly queries MongoDB using the meeting ID as a unique key.

Performance Tips:

- Keep batch sizes under 500 to stay well within Mailchimp limits.
- Offload AI summarization to external services if Code node execution time is high.

Pro Tips:

- Store full transcripts in MongoDB GridFS for future reference.
- Use environment variables in n8n for all API keys to simplify workflow export/import.
- Add a notifier (e.g., a Slack node) after Merge to alert admins on failures.

This is a community template provided “as-is” without warranty. Always validate the workflow in a test environment before using it in production.
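As a final note on the SplitInBatches advice: the under-500 batching rule can be sketched as a small helper. The function name and default are illustrative, not part of the template itself.

```javascript
// Hypothetical sketch of SplitInBatches sizing: chunk the participant list so
// each Mailchimp call stays under the suggested 500-recipient ceiling.
function chunkParticipants(emails, batchSize = 500) {
  const batches = [];
  for (let i = 0; i < emails.length; i += batchSize) {
    batches.push(emails.slice(i, i + batchSize));
  }
  return batches;
}
```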