by shae
## How it works

This Lead Capture & Auto-Qualification workflow transforms raw leads into qualified prospects through intelligent automation. Here's the high-level flow:

Lead Intake → Data Validation → Enrichment → Scoring → Smart Routing → CRM Integration & Notifications

The system captures leads from any source, validates the data, enriches it with company intelligence, scores it against your qualification criteria, and automatically routes high-value prospects to sales while nurturing lower-priority leads.

## Set up steps

Time to set up: approximately 30-45 minutes

Prerequisites: active accounts with HubSpot, Clearbit, Apollo, and Slack

### Step 1: Import Workflow (2 minutes)
- Copy the workflow JSON and import it into your n8n instance
- The workflow will appear with all nodes and sticky note documentation

### Step 2: Configure Environment Variables (5 minutes)
Set these in your n8n environment:
- APOLLO_API_URL
- SLACK_SALES_CHANNEL_ID
- SLACK_MARKETING_CHANNEL_ID
- CRM_ASSIGNMENT_URL

### Step 3: Set Up API Credentials (15 minutes)
Create credential connections for:
- Clearbit API (enrichment)
- Apollo API (HTTP Header Auth)
- HubSpot API (CRM integration)
- Slack API (notifications)

### Step 4: Customize Scoring Logic (10 minutes)
- Review the qualification criteria in the Code node (see the sketch below)
- Adjust scoring weights based on your ideal customer profile
- Modify industry targeting and company size thresholds

### Step 5: Test & Activate (8 minutes)
- Send test webhook requests to validate the flow
- Verify CRM contact creation and Slack notifications
- Activate the workflow for live lead processing
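To make Step 4 concrete, here is a minimal sketch of what the scoring Code node might contain. The field names (companySize, industry, engagementScore), the weights, and the 60-point routing threshold are illustrative assumptions, not the template's actual values; replace them with your ideal customer profile.

```javascript
// Lead-scoring sketch for an n8n Code node ("Run Once for All Items").
// Weights, thresholds, and field names below are assumptions.
const TARGET_INDUSTRIES = ['software', 'fintech', 'healthcare'];

return $input.all().map(item => {
  const lead = item.json;
  let score = 0;

  // Company size thresholds: larger companies score higher.
  if (lead.companySize >= 200) score += 30;
  else if (lead.companySize >= 50) score += 15;

  // Industry targeting.
  if (TARGET_INDUSTRIES.includes((lead.industry || '').toLowerCase())) score += 25;

  // Engagement signals, capped so one factor can't dominate.
  score += Math.min(lead.engagementScore || 0, 45);

  // Smart routing: high scores go to sales, the rest into nurturing.
  return { json: { ...lead, score, route: score >= 60 ? 'sales' : 'nurture' } };
});
```

Leads at or above the threshold would follow the sales-routing branch; everything else continues into the nurture path.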
by PDF Vector
## Overview

Transform your accounts payable department with this enterprise-grade invoice processing solution. This workflow automates the entire invoice lifecycle, from document ingestion through payment processing. It handles invoices from multiple sources (Google Drive, email attachments, API submissions), extracts data using AI, validates against purchase orders, routes for appropriate approvals based on amount thresholds, and integrates seamlessly with your ERP system. The solution includes vendor master data management, duplicate invoice detection, real-time spend analytics, and complete audit trails for compliance.

## What You Can Do

This comprehensive workflow creates an intelligent invoice processing pipeline that monitors multiple input channels (Google Drive, email, webhooks) for new invoices and automatically extracts data from PDFs, images, and scanned documents using AI. It validates vendor information against your master database, matches invoices to purchase orders, and detects discrepancies. The workflow implements multi-level approval routing based on invoice amount and department, prevents duplicate payments through intelligent matching algorithms, and integrates with QuickBooks, SAP, or other ERP systems. Additionally, it generates real-time dashboards showing processing metrics and cash flow insights while sending automated reminders for pending approvals.

## Who It's For

Perfect for medium to large businesses, accounting departments, and financial service providers processing more than 100 invoices monthly across multiple vendors. Ideal for organizations that need to enforce approval hierarchies and spending limits, require integration with existing ERP/accounting systems, want to reduce processing time from days to minutes, need audit trails and compliance reporting, and seek to eliminate manual data entry errors and duplicate payments.

## The Problem It Solves

Manual invoice processing creates significant operational challenges: data entry errors (3-5% error rate), processing delays (8-10 days per invoice), duplicate payments (0.1-0.5% of invoices), approval bottlenecks causing late fees, lack of visibility into pending invoices and cash commitments, and compliance issues from missing audit trails. This workflow reduces processing time by 80%, eliminates data entry errors, prevents duplicate payments, and provides complete visibility into your payables process.

## Setup Instructions

1. **Google Drive Setup**: Create dedicated folders for invoice intake and configure access permissions
2. **PDF Vector Configuration**: Set up API credentials with appropriate rate limits for your volume
3. **Database Setup**: Deploy the provided schema for vendor master and invoice tracking tables
4. **Email Integration**: Configure IMAP credentials for invoice email monitoring (optional)
5. **ERP Connection**: Set up API access to your accounting system (QuickBooks, SAP, etc.)
6. **Approval Rules**: Define approval thresholds and routing rules in the configuration node (see the sketch below)
7. **Notification Setup**: Configure Slack/email for approval notifications and alerts

## Key Features

- **Multi-Channel Invoice Ingestion**: Automatically collect invoices from Google Drive, email attachments, and API uploads
- **Advanced OCR and AI Extraction**: Process any invoice format, including handwritten notes and poor-quality scans
- **Vendor Master Integration**: Validate and enrich vendor data, maintaining a clean vendor database
- **3-Way Matching**: Automatically match invoices to purchase orders and goods receipts
- **Dynamic Approval Routing**: Route based on amount, department, vendor, or custom rules
- **Duplicate Detection**: Prevent duplicate payments using fuzzy matching algorithms
- **Real-Time Analytics**: Track KPIs like processing time, approval delays, and early payment discounts
- **Exception Handling**: Intelligent routing of problematic invoices for manual review
- **Audit Trail**: Complete tracking of all actions, approvals, and system modifications
- **Payment Scheduling**: Optimize payment timing to capture discounts and manage cash flow

## Customization Options

This workflow can be customized to add industry-specific extraction fields, implement GL coding rules based on vendor or amount, create department-specific approval workflows, add currency conversion for international invoices, integrate with additional systems (banks, expense management), configure custom dashboards and reporting, set up vendor portals for invoice status inquiries, and implement machine learning for automatic GL coding suggestions.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
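As a concrete illustration of setup step 6, here is a hedged sketch of what the configuration node's amount-based routing could look like. The thresholds and approver roles are assumptions to replace with your organization's real approval matrix.

```javascript
// Approval-routing sketch for an n8n Code node. Amounts and roles
// below are placeholder assumptions, not the template's shipped rules.
const APPROVAL_RULES = [
  { maxAmount: 1000,     approver: 'ap-clerk' },
  { maxAmount: 10000,    approver: 'department-manager' },
  { maxAmount: 100000,   approver: 'finance-director' },
  { maxAmount: Infinity, approver: 'cfo' },
];

return $input.all().map(item => {
  const invoice = item.json;
  // Pick the first rule whose ceiling covers the invoice amount.
  const rule = APPROVAL_RULES.find(r => invoice.amount <= r.maxAmount);
  return { json: { ...invoice, approver: rule.approver } };
});
```

Department- or vendor-specific rules would extend each entry with extra match fields before the `find` call.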
by Rapiwa
## Who Is This For?

This n8n workflow enables automated cross-selling by identifying each WooCommerce customer's most frequently purchased product, finding a related product to recommend, and sending a personalized WhatsApp message using the Rapiwa API. It also verifies whether the customer's number is WhatsApp-enabled before sending, and logs both successful and unsuccessful attempts to Google Sheets for tracking.

## What This Workflow Does

- Retrieves all paying customers from your WooCommerce store
- Identifies each customer's most purchased product
- Finds the latest product in the same category as their most purchased item
- Cleans and verifies customer phone numbers for WhatsApp compatibility
- Sends personalized WhatsApp messages with product recommendations
- Logs all activities to Google Sheets for tracking and analysis
- Handles both verified and unverified numbers appropriately

## Key Features

- **Customer Segmentation**: Automatically identifies paying customers from your WooCommerce store
- **Product Analysis**: Determines each customer's most purchased product
- **Smart Recommendations**: Finds the latest products in the same category as customer favorites
- **WhatsApp Integration**: Uses the Rapiwa API for message delivery
- **Phone Number Validation**: Verifies WhatsApp numbers before sending messages
- **Dual Logging System**: Tracks both successful and failed message attempts in Google Sheets
- **Rate Limiting**: Uses batching and wait nodes to prevent API overload
- **Personalized Messaging**: Includes customer name and product details in messages

## Requirements

- WooCommerce store with API access
- Rapiwa account with API access for WhatsApp verification and messaging
- Google account with Sheets access
- Customer phone numbers in WooCommerce (stored in the billing.phone field)

## How to Use: Step-by-Step Setup

### 1. Credentials Setup
- WooCommerce API: Configure WooCommerce API credentials in n8n (e.g., "WooCommerce (get customer)" and "WooCommerce (get customer data)")
- Rapiwa Bearer Auth: Create an HTTP Bearer credential with your Rapiwa API token
- Google Sheets OAuth2: Set up OAuth2 credentials for Google Sheets access

### 2. Configure Google Sheets
Ensure your sheet has the required columns as specified in the Google Sheet Required Columns section.

### 3. Verify Code Nodes
- Code (get paying_customer): Filters customers to include only those who have made purchases
- Get most buy product id & Clear Number: Identifies the most purchased product and cleans phone numbers
### 4. Configure HTTP Request Nodes
- Get customer data: Verify the WooCommerce API endpoint for retrieving customer orders
- Get specific product data: Verify the WooCommerce API endpoint for product details
- Get specific product recommend latest product: Verify the WooCommerce API endpoint for finding the latest products by category
- Check valid WhatsApp number Using Rapiwa: Verify the Rapiwa endpoint for WhatsApp number validation
- Rapiwa Sender: Verify the Rapiwa endpoint for sending messages

## Google Sheet Required Columns

You'll need two Google Sheets (or two tabs in one spreadsheet), formatted like this ➤ sample. The workflow uses them to log message delivery. Both must have the following headers (match exactly):

| name | number | email | address1 | price | suk | title | product link | validity | staus |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://your_shop_domain/p-img-nike | verified | sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://your_shop_domain/p-img-nike | unverified | not sent |

## Important Notes

- **Phone Number Format**: The workflow cleans phone numbers by removing all non-digit characters (see the sketch below). Ensure your WooCommerce phone numbers are in a compatible format.
- **API Rate Limits**: Rapiwa and WooCommerce APIs have rate limits. Adjust batch sizes and wait times accordingly.
- **Data Privacy**: Ensure compliance with data protection regulations when sending marketing messages.
- **Error Handling**: The workflow logs unverified numbers but doesn't have extensive error handling. Consider adding error notifications for failed API calls.
- **Product Availability**: The workflow recommends the latest product in a category but doesn't check whether it is in stock. Consider adding stock status verification.
- **Testing**: Always test with a small batch before running the workflow on your entire customer list.

## Useful Links

- Dashboard: https://app.rapiwa.com
- Official Website: https://rapiwa.com
- Documentation: https://docs.rapiwa.com

## Support & Help

- WhatsApp: Chat on WhatsApp
- Discord: SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
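To illustrate the phone-number cleaning described above, here is a minimal sketch of the "Clear Number" logic. Stripping non-digits matches the workflow's documented behavior; the country-code fallback is an assumption for numbers stored without a prefix and should be adapted to your market.

```javascript
// Phone-cleaning sketch for an n8n Code node. billing.phone follows the
// WooCommerce customer schema; the "880" default prefix is an assumption.
return $input.all().map(item => {
  const raw = item.json.billing?.phone || '';
  let digits = raw.replace(/\D/g, ''); // "+880 1322-827799" -> "8801322827799"
  if (digits.length === 11 && digits.startsWith('0')) {
    digits = '880' + digits.slice(1); // assumed default country code
  }
  return { json: { ...item.json, cleanNumber: digits } };
});
```

The cleaned number would then be passed to the Rapiwa validation endpoint before any message is sent.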
by Andrey
## Overview

This n8n workflow automates brand monitoring across social media platforms (Reddit, LinkedIn, X, and Instagram) using the AnySite API. It searches for posts mentioning your defined keywords, stores results in n8n Data Tables, analyzes engagement and sentiment, and generates a detailed AI-powered social media report that is automatically sent to your email.

## Key Features

- **Multi-Platform Monitoring**: Reddit, LinkedIn, X (Twitter), and Instagram
- **Automated Post Collection**: Searches for new posts containing tracked keywords
- **Data Persistence**: Saves all posts and comments in structured Data Tables
- **AI-Powered Reporting**: Uses GPT (OpenAI API) to summarize and analyze trends, engagement, and risks
- **Automated Email Delivery**: Sends comprehensive daily/weekly reports via Gmail
- **Comment Extraction**: Collects and formats post comments for deeper sentiment analysis
- **Scheduling Support**: Can be executed manually or automatically (e.g., every night)

## How It Works

### Triggers

The workflow runs:
- Automatically (via Schedule Trigger), e.g., once daily
- Manually (via Manual Trigger), for testing or on-demand analysis

### Data Collection Process

1. Keyword Loading: Reads all keywords from the Data Table "Brand Monitoring Words"
2. Social Media Search: For each keyword, the workflow calls the AnySite API endpoints:
   - api/reddit/search/posts
   - api/linkedin/search/posts
   - api/twitter/search/posts (X)
   - api/instagram/search/posts
3. Deduplication: Before saving, checks whether a post already exists in the "Brand Monitoring Posts" table (see the sketch below)
4. Data Storage: Inserts new posts into the Data Table with fields like type, title, url, vote_count, comment_count, etc.
5. Comments Enrichment: For Reddit and LinkedIn, retrieves and formats comments into JSON strings, then updates the record
6. AI Analysis & Report Generation: The AI Agent (OpenAI GPT model) aggregates posts, analyzes sentiment, engagement, and risks, and generates a structured HTML email report
7. Email Sending: Sends the final report via Gmail using your connected account

## Setup Instructions

### Requirements

- Self-hosted or cloud n8n instance
- AnySite API key (https://AnySite.io)
- OpenAI API key (GPT-4o or later)
- Connected Gmail account (for report delivery)

### Installation Steps

1. Import the workflow: import the provided file Social Media Monitoring.json
2. Configure credentials:
   - AnySite API: Add an access-token header with your API key
   - OpenAI: Add your OpenAI API key in the "OpenAI Chat Model" node
   - Gmail: Connect your Gmail account (OAuth2) in the "Send a message in Gmail" node
3. Create the required Data Tables:

#### 1️⃣ Brand Monitoring Words

| Field | Type | Description |
|-------|------|-------------|
| word | string | Keyword or brand name to monitor |

> Each row represents a single keyword to be tracked.
#### 2️⃣ Brand Monitoring Posts

| Field | Type | Description |
|-------|------|-------------|
| type | string | Platform type (e.g., reddit, linkedin, x, instagram) |
| title | string | Post title or headline |
| url | string | Direct link to the post |
| created_at | string | Post creation date/time |
| subreddit_id | string | (Reddit only) subreddit ID |
| subreddit_alias | string | (Reddit only) subreddit alias |
| subreddit_url | string | (Reddit only) subreddit URL |
| subreddit_description | string | (Reddit only) subreddit description |
| comment_count | number | Number of comments |
| vote_count | number | Votes, likes, or reactions count |
| subreddit_member_count | number | (Reddit only) member count |
| post_id | string | Unique post identifier |
| text | string | Post body text |
| comments | string | Serialized comments (JSON string) |
| word | string | Matched keyword that triggered capture |

## AI Reporting Logic

- Collects all posts gathered during the run
- Aggregates them by keyword and platform
- Evaluates sentiment, engagement, and risk signals
- Summarizes findings with an executive summary and key metrics
- Sends the Social Media Intelligence Report to your configured email

## Customization Options

- **Schedule**: Adjust the trigger frequency (daily, hourly, etc.)
- **Keywords**: Add or remove keywords in the Brand Monitoring Words table
- **Report Depth**: Modify system prompts in the "AI Agent" node to customize tone and analysis focus
- **Email Recipient**: Change the target email address in the "Send a message in Gmail" node

## Troubleshooting

| Issue | Solution |
|-------|----------|
| No posts found | Check the AnySite API key and keyword relevance |
| Duplicate posts | Verify the Data Table deduplication setup |
| Report not sent | Confirm the Gmail OAuth2 connection |
| AI Agent error | Ensure the OpenAI API key and model selection are correct |

## Best Practices

- Use specific brand or product names as keywords for better precision
- Run the workflow daily to maintain fresh insights
- Periodically review and clean the Data Tables
- Adjust AI prompt parameters to refine the analytical tone
- Review AI-generated reports to ensure data quality

## Author Notes

Created for automated cross-platform brand reputation monitoring, enabling real-time insight into how your brand is discussed online.
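For reference, the deduplication step described above could be sketched as a Code node like the following. It assumes a preceding Data Table "Get rows" node (named "Get Existing Posts" here purely for illustration) supplies the already-stored URLs; match the node name and key field to your own setup.

```javascript
// Deduplication sketch: drop any post whose URL is already stored in the
// "Brand Monitoring Posts" Data Table. Node and field names are assumptions.
const known = new Set(
  $('Get Existing Posts').all().map(row => row.json.url)
);

// Keep only posts not seen before; these flow on to the insert step.
return $input.all().filter(item => !known.has(item.json.url));
```

Deduplicating on post_id instead of url would work the same way if your table treats that as the unique key.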
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## WordPress to Blotato Social Publisher

### Overview

This automation monitors your WordPress site for new posts and automatically creates platform-specific social media content using AI, then posts to Twitter, LinkedIn, and Facebook via Blotato.

### What it does

1. Monitors the WordPress site for new posts every 30 minutes
2. Filters posts published in the last hour to avoid duplicates (see the sketch below)
3. Processes each new post individually
4. AI generates optimized content for each social platform (Twitter, LinkedIn, Facebook)
5. Extracts platform-specific content from the AI response
6. Publishes to all three social media platforms via the Blotato API

### Setup Required

1. WordPress Connection
   - Configure WordPress credentials in the "Check New Posts" node
   - Enter your WordPress site URL, username, and password/app password
2. Blotato Social Media API Setup
   - Get your Blotato API key from your Blotato account
   - Configure API credentials in the Blotato connection node
   - Map each platform (Twitter, LinkedIn, Facebook) to the correct Blotato channel
3. AI Configuration
   - Set up Google Gemini API credentials
   - Connect the Gemini model to the "AI Social Content Creator" node

### Features

- Platform-optimized content (hashtags, character limits, professional tone)
- Duplicate prevention system
- Batch processing for multiple posts
- Featured image support
- Customizable posting frequency

### Customization Options

- Posting Frequency: Modify the schedule trigger (default: every 30 minutes)
- Content Tone: Adjust the AI system message for different writing styles
- Post Filtering: Change the time window in the WordPress node (default: last hour)
- Platform Selection: Add or remove any social media platforms you don't want to use
- Hashtag Strategy: Modify the AI prompts to change hashtag usage

### Testing

- Run the workflow manually to test connections
- Verify posts appear correctly on all platforms
- Monitor for API rate limit issues

### Need Help?

Reach out for n8n coaching or one-on-one consultation.
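As a rough sketch of the duplicate-prevention filter, the Code node below keeps only posts published within the last hour. It assumes the WordPress node returns the REST API's date_gmt field (ISO 8601, UTC without a timezone suffix); the one-hour window matches the template's stated default.

```javascript
// Last-hour filter sketch for an n8n Code node. Assumes WordPress REST
// API items carrying date_gmt (e.g., "2024-05-01T09:30:00", UTC).
const cutoff = Date.now() - 60 * 60 * 1000; // one hour ago

return $input.all().filter(item =>
  // Append "Z" because date_gmt omits the timezone designator.
  new Date(item.json.date_gmt + 'Z').getTime() >= cutoff
);
```

With the default 30-minute poll, the one-hour window gives some overlap headroom; the Blotato publishing step downstream would still need to tolerate the occasional repeat.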
by vinci-king-01
## How it works

This workflow automatically analyzes website visitors in real time, enriches their data with company intelligence, and provides lead scoring and sales alerts.

### Key Steps

1. Webhook Trigger: Receives visitor data from your website tracking system
2. AI-Powered Company Intelligence: Uses ScrapeGraphAI to extract comprehensive company information from visitor domains
3. Visitor Enrichment: Combines visitor behavior data with company intelligence to create detailed visitor profiles
4. Lead Scoring: Automatically scores leads based on company size, industry, engagement, and intent signals
5. CRM Integration: Updates your CRM with enriched visitor data and lead scores
6. Sales Alerts: Sends real-time notifications to your sales team for high-priority leads

## Set up steps

Setup time: 10-15 minutes

1. Configure ScrapeGraphAI credentials: Add your ScrapeGraphAI API key for company intelligence gathering
2. Set up the HubSpot connection: Connect your HubSpot CRM to automatically update contact records
3. Configure the Slack integration: Set up your Slack workspace and specify the sales alert channel
4. Customize lead scoring criteria: Adjust the scoring algorithm to match your target customer profile
5. Set up website tracking: Configure your website to send visitor data to the webhook endpoint (see the sketch below)
6. Test the workflow: Verify all integrations are working correctly with a test visitor

## Key Features

- **Real-time visitor analysis** with company intelligence enrichment
- **Automated lead scoring** based on multiple factors (company size, industry, engagement)
- **Intent signal detection** (pricing interest, demo requests, contact intent)
- **Priority-based sales alerts** with recommended actions
- **CRM integration** for seamless lead management
- **Deal size estimation** based on company characteristics
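To show what setup step 5 might look like on the website side, here is a hedged browser-side sketch of the tracking call. The webhook path and payload fields are assumptions; mirror whatever your tracking system already captures (page, referrer, resolved company domain).

```javascript
// Browser-side sketch of the call that feeds the webhook trigger.
// URL and payload shape are placeholder assumptions.
fetch('https://your-n8n-instance/webhook/visitor-intel', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    page: location.pathname,              // current page the visitor is on
    referrer: document.referrer,          // where they came from
    domain: 'visitor-company.com',        // e.g., resolved from email or reverse IP
    timestamp: new Date().toISOString(),
  }),
});
```

The webhook trigger node would then hand this payload to the ScrapeGraphAI enrichment step.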
by Trung Tran
## Automating AWS S3 Operations with n8n: Buckets, Folders, and Files

This tutorial walks you through setting up an automated workflow that generates AI-powered images from prompts and securely stores them in AWS S3. It leverages the new AI Tool Node and OpenAI models for prompt-to-image generation.

### Who's it for

This workflow is ideal for:
- **Designers & marketers** who need quick, on-demand AI-generated visuals
- **Developers & automation builders** exploring AI-driven workflows integrated with cloud storage
- **Educators or trainers** creating tutorials or exercises on AI image generation
- **Businesses** looking to automate image content pipelines with AWS S3 storage

### How it works / What it does

1. Trigger: The workflow starts manually when you click "Execute Workflow"
2. Edit Fields: You provide input fields such as image description, resolution, or naming convention
3. Create AWS S3 Bucket: Automatically creates a new S3 bucket if it doesn't exist
4. Create a Folder: Inside the bucket, a folder is created to organize generated images
5. Prompt Generation Agent: An AI agent generates or refines the image prompt using the OpenAI Chat Model
6. Generate an Image: The refined prompt is used to generate an image with AI
7. Upload File to S3: The generated image is uploaded to the AWS S3 bucket for secure storage

This workflow showcases how to combine AI and cloud storage seamlessly in an automated pipeline.

### How to set up

1. Import the workflow into n8n
2. Configure the following credentials:
   - AWS S3 (Access Key, Secret Key, Region)
   - OpenAI API Key (for Chat + Image models)
3. Update the Edit Fields node with your preferred input fields (e.g., image size, description)
4. Execute the workflow and test by entering a sample image prompt (e.g., "Futuristic city skyline in watercolor style")
5. Check your AWS S3 bucket to verify the uploaded image

### Requirements

- **n8n** (latest version with AI Tool Node support)
- **AWS account** with S3 permissions to create buckets and upload files
- **OpenAI API key** (for prompt refinement and image generation)
- Basic familiarity with AWS S3 structure (buckets, folders, objects)

### How to customize the workflow

- **Custom Buckets**: Replace the auto-create step with an existing S3 bucket
- **Image Variations**: Generate multiple image variations per prompt by looping the image generation step
- **File Naming**: Adjust file naming conventions, e.g., timestamp or user input (see the sketch below)
- **Metadata**: Add metadata such as tags, categories, or owner info when uploading to S3
- **Alternative Storage**: Swap AWS S3 with **Google Cloud Storage, Azure Blob, or Dropbox**
- **Trigger Options**: Replace the manual trigger with a **Webhook, Form Submission, or Scheduler** for automation

✅ This workflow is a hands-on example of combining AI prompt engineering, image generation, and cloud storage automation into a single streamlined process.
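As an example of the File Naming customization, a small Code node like this sketch could build a timestamped S3 object key from the Edit Fields input. The imageDescription field name and the key layout are assumptions to adapt to your own fields.

```javascript
// Filename sketch for an n8n Code node: slugify the description and
// prefix a date folder. Field names are placeholder assumptions.
const input = $input.first().json;
const desc = (input.imageDescription || 'image')
  .toLowerCase()
  .replace(/[^a-z0-9]+/g, '-') // slugify: spaces/symbols -> hyphens
  .slice(0, 40);
const key = `generated/${new Date().toISOString().slice(0, 10)}/${desc}-${Date.now()}.png`;

return [{ json: { ...input, s3Key: key } }];
```

The Upload File to S3 node would then reference the s3Key value for the object name.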
by Mirai
## Icebreaker Generator powered by ChatGPT

This n8n template crawls a company website, distills the content with AI, and produces a short, personalized icebreaker you can drop straight into your cold emails or CRM. Perfect for SDRs, founders, and agencies who want "real research" at scale.

### Good to know

- Works from a Google Sheet of leads (domain + LinkedIn, etc.)
- Handles common scrape failures gracefully and marks the lead's Status as Error
- Uses ChatGPT to summarize pages and craft one concise, non-generic opener
- Output is written back to the same Google Sheet (IceBreaker, Status)
- You'll need Google credentials (for Sheets) and OpenAI credentials (for GPT)

### How it works

**Step 1: Discover internal pages**
- Reads a lead's website from Google Sheets
- Scrapes the home page and extracts all links
- A Code node cleans the list (removes emails/anchors/social/external domains, normalizes paths, de-duplicates) and returns unique internal URLs (see the sketch below)
- If the home page is unreachable or no links are found, the lead is marked Error and the workflow moves on

**Step 2: Convert pages to text**
- Visits each collected URL and converts the response into HTML/Markdown text for analysis
- You can cap depth/amount with the Limit node

**Step 3: Summarize & generate the icebreaker**
- A GPT node produces a two-paragraph abstract for each page (JSON output)
- An Aggregate node merges all abstracts for the company
- Another GPT node turns the merged summary into a personalized, multi-line icebreaker (spartan tone, non-obvious details)
- The result is written back to Google Sheets (IceBreaker = ..., Status = Done), and the workflow loops to the next lead

### How to use

1. Prepare your sheet
   - Include at least organization_website_url, linkedin_url, and any other lead fields you track
   - Keep empty IceBreaker and Status columns for the workflow to fill
2. Connect credentials
   - Google Sheets: use the Google account that owns the sheet and link it in the nodes
   - OpenAI: add your API key to the GPT nodes ("Summarize Website Page", "Generate Multiline Icebreaker")
3. Run the workflow
   - Start with the Manual Trigger (or replace it with a schedule/webhook)
   - Adjust the Limit node if you want fewer or more pages per company
   - Watch Status (Done/Error) and IceBreaker populate in your sheet

### Requirements

- n8n instance
- Google Sheets account with access to the leads sheet
- OpenAI API key (for summarization + icebreaker generation)

### Customizing this workflow

- Tone & format: tweak the prompts (both GPT nodes) to match your brand voice and structure
- Depth: change the Limit node to scan more or fewer pages; add simple rules to prioritize certain paths (e.g., /about, /blog/*)
- Fields: write additional outputs (e.g., Company Summary, Key Products, Recent News) back to new sheet columns
- Lead selection: filter rows by Status = "" (or custom flags) to process only untouched leads
- Error handling: expand the Error branch to retry with www. or HTTP→HTTPS variants, or to log diagnostics in a separate tab

### Tips

- Keep icebreakers short, specific, and free of clichés; small, non-obvious details from the site convert best
- Start with a small batch to validate quality, then scale up
- Consider adding a rate limit if target sites throttle requests

In short: Sheet → crawl internal pages → AI abstracts → single tailored icebreaker → write back to the sheet, then repeat for the next lead. This automation pairs well with our automation for automated cold emailing.
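For orientation, the link-cleaning Code node from Step 1 could be sketched roughly like this. The links and domain field names are assumptions, so match them to your scraper node's output.

```javascript
// Link-cleaning sketch for an n8n Code node: keep unique internal URLs.
// Assumes item.json.links holds extracted hrefs and item.json.domain the
// lead's domain -- both names are placeholders for illustration.
const { links = [], domain } = $input.first().json;
const SOCIAL = /(facebook|twitter|x\.com|linkedin|instagram|youtube)\./i;

const internal = new Set();
for (const href of links) {
  if (!href || href.startsWith('mailto:') || href.startsWith('#')) continue;
  try {
    const url = new URL(href, `https://${domain}`); // normalize relative paths
    const sameDomain =
      url.hostname === domain || url.hostname.endsWith('.' + domain);
    if (!sameDomain || SOCIAL.test(url.hostname)) continue; // drop external/social
    internal.add(url.origin + url.pathname); // drop anchors, de-duplicate
  } catch { /* skip malformed links */ }
}

return [...internal].map(u => ({ json: { url: u } }));
```

Each resulting URL then flows into Step 2, where the page is fetched and converted to text.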
by Shahzaib Anwar
## 📌 Overview

This workflow automatically processes incoming Shopify/Gmail leads and pushes them into HubSpot as both Contacts and Deals. It helps sales and marketing teams capture leads instantly, enrich CRM data, and avoid missed opportunities.

## ⚡ How it works

1. Trigger: Watches for new emails in Gmail
2. Extract Data: Parses the email body (Name, Email, City, Phone, Message, Product URL/Title)
3. Condition: Checks whether the sender is Shopify before processing
4. HubSpot:
   - Creates/updates a Contact with customer details
   - Creates a Deal associated with that contact

## 🎯 Benefits

- 📥 Automates lead capture → CRM
- 🚫 Eliminates manual copy-paste from Gmail
- 🔄 Real-time sync between Gmail and HubSpot
- 📈 Improves sales follow-up speed and accuracy

## 🛠 Setup Steps

1. Import this workflow into your n8n instance
2. Connect your Gmail and HubSpot credentials
3. Replace the HubSpot Deal Stage ID with your own pipeline stage
4. (Optional) Adjust the Code Node regex if your email format differs (see the sketch below)
5. Activate the workflow and test with a sample lead email

## 📝 Example Email Format

Name: John Doe
Email: john@example.com
City: London
Phone: +44 7000 000000
Body: Interested in product
Product Url: https://example.com/product
Product Title: Sample Product

## Sticky Notes

- Gmail Trigger: 📧 Watches for new emails in Gmail. Polls every minute and passes email data into the flow.
- Get a Message: 📩 Fetches the full Gmail message content (body + metadata) for parsing.
- Extract From Email: 🔍 Extracts the sender's email address from Gmail to identify the source.
- If Sender is Shopify: ✅ Condition node that ensures only Shopify-originated emails/leads are processed.
- Code Node (Regex Parser): 🧾 Parses the email body using regex to extract Name, Email, City, Phone, Message, Product URL, and Title.
- Edit Fields (Set Node): 📝 Cleans and structures the extracted fields into proper JSON before sending to HubSpot.
- HubSpot → Create/Update Contact: 👤 Creates or updates a HubSpot Contact with the extracted lead details.
- HubSpot → Create Deal: 💼 Creates a HubSpot Deal linked to the Contact, including campaign/product information.
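To illustrate setup step 4, here is a minimal sketch of the regex parser written against the example email format above. If your Shopify notification template differs, adjust the labels accordingly; the incoming text field name is an assumption based on Gmail node output.

```javascript
// Regex-parser sketch for an n8n Code node, matching the labeled lines
// of the example email ("Name: ...", "Email: ...", etc.).
const body = $input.first().json.text || ''; // plain-text email body (assumed field)

// Capture everything after "<label>:" up to the end of the line.
const grab = label =>
  (body.match(new RegExp(`${label}:\\s*(.+)`, 'i')) || [])[1]?.trim() || '';

return [{
  json: {
    name:         grab('Name'),
    email:        grab('Email'),
    city:         grab('City'),
    phone:        grab('Phone'),
    message:      grab('Body'),
    productUrl:   grab('Product Url'),
    productTitle: grab('Product Title'),
  },
}];
```

The Set node downstream would then map these fields onto the HubSpot Contact and Deal properties.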
by Abdullah Alshiekh
## What Problem Does It Solve?

Brands and marketers spend hours manually searching Google for product reviews. Reading through multiple websites to gauge overall sentiment is tedious and inefficient, and it is difficult to spot recurring customer complaints or praise without aggregating the data. This workflow solves these problems by:

- Instantly searching and scraping review content from the web
- Using AI to read and score the sentiment of every review found
- Generating a consolidated "Executive Summary" with key quotes and actionable advice

## How to Configure It

1. Telegram Setup
   - Connect your Telegram Bot credentials in n8n
   - Set the Get Message node to watch for text messages
2. Search & Scraping (Decodo)
   - Connect your Decodo credentials (requires a Web Scraping API plan)
   - This handles both the Google search and the content extraction
3. AI Setup
   - Add your Google Gemini API key
   - The prompts are pre-configured to act as a "Strict Data Analyst," but you can edit the system prompt in the AI Agent node to match your preferred tone

## How It Works

1. **Trigger**: You send a company or product name (e.g., "XQ Pharma") to your Telegram bot
2. **Search**: The workflow uses Decodo to Google search for "[Name] reviews" and extracts the top URL results (see the sketch below)
3. **Scrape**: It visits the review pages and strips away the HTML code to get clean text
4. **Analyze (Loop)**: The first AI Agent reads the text and determines the sentiment (Positive/Neutral/Negative) and key topics
5. **Report**: A second AI Agent collects all the analysis pieces and writes a final summary containing a **Sentiment Score**, **Customer Voice** (direct quotes), and an **Actionable Verdict**
6. **Delivery**: The final report is sent back to you as a Telegram message

## Customization Ideas

- **Change the Source**: Modify the search query to target specific platforms (e.g., "site:reddit.com [Product] reviews")
- **Change the Output**: Send the final report to a **Slack channel** or **email** for your team to see
- **Database Logging**: Save the "Actionable Verdict" and sentiment scores into **Notion** or **Airtable** to track brand reputation over time
- **Competitor Analysis**: Use it to research competitor products instead of your own to find their weaknesses

If you need any help: Get in Touch
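As a small illustration of the Search step and the "Change the Source" idea, a Code node building the query from the Telegram message might look like this sketch; the optional site: filter line is the platform-targeting variant mentioned above.

```javascript
// Query-builder sketch for an n8n Code node, fed by the Telegram trigger
// (message.text is the standard Telegram update field).
const name = $input.first().json.message?.text?.trim() || '';
let query = `${name} reviews`;
// query += ' site:reddit.com'; // uncomment to target a single platform

return [{ json: { query } }];
```

The query string would then be passed to the Decodo search request before scraping begins.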
by Abdullah Alshiekh
## What Problem Does It Solve?

We've all been there: you want to check whether a product is cheaper on Amazon or Jumia, but opening a dozen tabs is a pain. Building a bot to do this usually fails because big e-commerce sites love to block scrapers with CAPTCHAs. This workflow fixes that headache by:

- Taking a product name from a chat message
- Using Decodo to handle the hard part: searching Google and scraping the product pages without getting blocked
- Using AI to read the messy HTML and pull out just the price and product name
- Sending a clean "Best Price" summary back to the user instantly

## How to Configure It

1. Telegram Setup
   - Create a bot with BotFather and paste your token into the Telegram node
   - Make sure your webhook is set up so the bot actually "hears" the messages
2. Decodo
   - This is the engine that makes the workflow reliable; add your Decodo API key in the credentials
   - We used Decodo here specifically because it handles the proxies and browser fingerprinting for you, so your Amazon requests actually go through instead of failing
3. AI Setup
   - Plug in your OpenAI API key (or swap the node for Claude/Gemini if you prefer)
   - The system prompt is already set up to ignore ads and find the real price, but feel free to tweak the tone

## How It Works

1. **Trigger**: You text the bot a product name (e.g., "Sony XM5")
2. **Search**: The workflow asks Decodo to Google that specific term on sites like Amazon.eg
3. **Scrape**: It grabs the URLs and passes them back to Decodo to fetch the page content safely
4. **Extract**: The AI reads through the text, finds the lowest price, and ignores the clutter (see the sketch below)
5. **Reply**: The bot texts you back with the best deal found

## Customization Ideas

- **Go wider**: Edit the search query to check other stores like Noon or Carrefour
- **Track trends**: Connect a Google Sheet to log what people are searching for; great for market research

If you need any help: Get In Touch
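To make the Extract and Reply steps concrete, here is a hedged sketch of a post-processing Code node that picks the cheapest offer. It assumes the AI extraction step has already emitted items with store, title, and a numeric price field; those names are placeholders.

```javascript
// Best-price sketch for an n8n Code node. Assumes upstream items like
// { store: "Amazon.eg", title: "Sony XM5", price: 8999 }.
const offers = $input.all()
  .map(i => i.json)
  .filter(o => typeof o.price === 'number' && o.price > 0);

offers.sort((a, b) => a.price - b.price); // cheapest first
const best = offers[0];

return [{
  json: {
    reply: best
      ? `Best price: ${best.title} for ${best.price} at ${best.store}`
      : 'No prices found for that product.',
  },
}];
```

The reply string would feed straight into the Telegram send-message node.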
by Takumi Oku
## Who's it for

This template is designed for Print-on-Demand (POD) business owners, independent artists, and e-commerce managers who want to automate the process of turning raw design files into listed products without manual data entry.

## How it works

This workflow acts as an automated merchandise factory that handles everything from image processing to marketing.

1. Trigger: The workflow starts when a new design file is uploaded to a specific Google Drive folder
2. Analyze: OpenAI Vision analyzes the image to determine the subject, mood, and color palette, and assesses copyright risk
3. Process: The image background is removed using Remove.bg, and the clean asset is uploaded to Cloudinary
4. Mockup: The workflow generates realistic product mockups (e.g., T-shirts, tote bags) by overlaying the design onto base product images using Cloudinary transformations (see the sketch below)
5. Copywriting: OpenAI writes an SEO-friendly product title, description, and tags based on the visual analysis
6. Draft: A draft product is created in Shopify with the generated details and mockup image
7. Approval: A message is sent to Slack with the product details and mockup; the workflow pauses and waits for a human to click "Approve" or "Reject"
8. Publish & Promote: If approved, the product is published to Shopify and automatically posted to Instagram and Pinterest; if rejected, a notification is sent to Slack

## How to set up

1. Base Images: Upload your blank product images (e.g., a white t-shirt, a tote bag) to your Cloudinary account and note their Public IDs
2. Configuration: Open the Workflow Configuration node and fill in all required fields, including your API keys and the Cloudinary Public IDs for your base products
3. Credentials: Configure the credentials for Google Drive, OpenAI, Shopify, Slack, Instagram, and Pinterest in their respective nodes
4. Folder ID: Update the Google Drive Trigger node with the ID of the folder you want to watch

## Requirements

- n8n (self-hosted or Cloud)
- Google Drive account
- OpenAI API key (access to the GPT-4o model recommended for Vision capabilities)
- Remove.bg API key
- Cloudinary account
- Shopify store
- Slack workspace
- Instagram Business account
- Pinterest account

## How to customize

- Mockups: Modify the Code - Generate Mockup URLs node to add more product types (e.g., hoodies, mugs) by adding their Cloudinary Public IDs
- Prompt Engineering: Adjust the system prompt in the OpenAI - SEO Copywriting node to match your brand voice or language style
- Social Channels: Add or remove nodes to support other platforms like Twitter (X) or Facebook Pages
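As a sketch of the Mockups customization, the Code - Generate Mockup URLs node might assemble Cloudinary layer-transformation URLs like this. The cloud name, Public IDs, and transformation values are placeholders; Cloudinary's l_ layer syntax (with slashes in a layer's Public ID replaced by colons) is the real mechanism the step relies on.

```javascript
// Mockup-URL sketch for an n8n Code node: overlay the uploaded design
// onto each base product image via Cloudinary transformations.
const CLOUD = 'your-cloud-name'; // placeholder
const BASES = [
  { product: 'tshirt',   publicId: 'bases/white-tshirt' },
  { product: 'tote-bag', publicId: 'bases/tote-bag' },
  // Add more product types here (e.g., hoodies, mugs) with their Public IDs.
];
const design = $input.first().json.designPublicId; // from the Cloudinary upload step

return BASES.map(base => ({
  json: {
    product: base.product,
    mockupUrl:
      `https://res.cloudinary.com/${CLOUD}/image/upload` +
      `/l_${design.replace(/\//g, ':')},w_600,g_center` + // overlay the design layer
      `/${base.publicId}.jpg`,
  },
}));
```

Each generated URL can then be attached to the Shopify draft and included in the Slack approval message.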