by Avkash Kakdiya
## How it works

This workflow captures new subscribers from a Mailchimp list and extracts their key details. It then enriches the subscriber data using AI, predicting professional attributes and assigning a lead score. Based on the score, high-value leads are identified, and all leads are synced into HubSpot and Pipedrive. For top-priority leads, the workflow automatically creates new deals in Pipedrive for sales follow-up.

## Step-by-step

### 1. Capture subscriber data

- **Mailchimp Subscriber Trigger** – Detects new signups in a Mailchimp list.
- **Extract Subscriber Data** – Normalizes the payload into clean fields such as name, email, and timestamp.

### 2. Enrich with AI

- **Lead Enrichment AI** – Uses AI to infer company, role, industry, intent, LinkedIn profile, and a lead score.
- **Parse & Merge Enrichment** – Merges the AI output with the subscriber info and sets defaults if parsing fails (sketched at the end of this description).

### 3. Qualify leads

- **High-Value Lead Check** – Filters leads with a score ≥ 70 and flags them as priority.
- **Create High-Value Deal** – Opens a new deal in Pipedrive for high-value leads.

### 4. Sync to CRMs

- **HubSpot Contact Sync** – Updates the enriched lead data in HubSpot CRM.
- **Pipedrive Person Create** – Adds the lead as a person in Pipedrive.

## Why use this?

- Automatically enrich raw Mailchimp subscribers with valuable professional insights.
- Score and qualify leads instantly for prioritization.
- Keep HubSpot and Pipedrive updated with enriched lead records.
- Automate deal creation for high-value opportunities, saving your sales team effort.
- Build a seamless pipeline from marketing signups to CRM-ready opportunities.
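A minimal sketch of what the **Parse & Merge Enrichment** step could look like in an n8n Code node. It assumes the AI node returns its JSON as a string in an `output` field; the field name and the default values are illustrative, not taken from the template:

```javascript
// n8n Code node – merge AI enrichment with subscriber data, with safe defaults.
const defaults = {
  company: 'Unknown',
  role: 'Unknown',
  industry: 'Unknown',
  intent: 'low',
  linkedin: '',
  lead_score: 0,
};

const results = [];
for (const item of $input.all()) {
  let enrichment = {};
  try {
    // 'output' is an assumed field name for the raw LLM response.
    enrichment = JSON.parse(item.json.output ?? '{}');
  } catch (e) {
    // Parsing failed – fall back to defaults so the lead still reaches the CRMs.
    enrichment = {};
  }
  results.push({
    json: {
      ...defaults,
      ...item.json, // keep name, email, timestamp from the trigger
      ...enrichment,
      is_high_value: Number(enrichment.lead_score ?? 0) >= 70,
    },
  });
}
return results;
```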
by Yves Tkaczyk
Use cases Monitor Google Drive folder, parsing PDF, DOCX and image file into a destination folder, ready for further processing (e.g. RAG ingestion, translation, etc.) Keep processing log in Google Sheet and send Slack notifications. How it works Trigger: Watch Google Drive folder for new and updated files. Create a uniquely named destination folder, copying the input file. Parse the file using Mistral Document, extracting content and handling non-OCRable images separately. Save the data returned by Mistral Document into the destination Google Drive folder (raw JSON file, Markdown files, and images) for further processing. How to use Google Drive and Google Sheets nodes: Create Google credentials with access to Google Drive and Google Sheets. Read more about Google Credentials. Update all Google Drive and Google Sheets nodes (14 nodes total) to use the credentials Mistral node: Create Mistral Cloud API credentials. Read more about Mistral Cloud Credentials. Update the OCR Document node to use the Mistral Cloud credentials. Slack nodes: Create Slack OAuth2 credentials. Read more about Slack OAuth2 credentials Update the two Slack nodes: Send Success Message and Send Error Message: Set the credentials Select the channel where you want to send the notifications (channels can be different for success and errors). Create a Google Sheets spreadsheet following the steps in Google Sheets Configuration. Ensure the spreadsheet can be accessed as Editor by the account used by the Google Credentials above. Create a directory for input files and a directory for output folders/files. Ensure the directories can be accessed by the account used by the Google Credentials. Update the File Created, File Updated and Workflow Configuration node following the steps in the green Notes. Requirements Google account with Google API access Mistral Cloud account access to Mistral API key. Slack account with access to Slack client ID and secret ID. Basic n8n knowledge: understanding of triggers, expressions, and credential management Who’s it for Anyone building a data pipeline ingesting files to be OCRed for further processing. 🔒 Security All credentials are stored as n8n credentials. The only information stored in this workflow that could be considered sensitive are the Google Drive Directory and Sheet IDs. These directories and the spreadsheet should be secured according to your needs. Need Help? Reach out on LinkedIn or Ask in the Forum!
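A minimal sketch of step 4, splitting an OCR response into one binary item per artifact so downstream Google Drive nodes can upload each file. The response shape `{ pages: [{ index, markdown }] }` is an assumption; adjust the field names to the actual payload your OCR Document node returns:

```javascript
// n8n Code node – turn an OCR response into binary files for Google Drive.
const response = $input.first().json;
const out = [];

// Keep the raw JSON for downstream processing / auditing.
out.push({
  json: { fileName: 'result.json' },
  binary: {
    data: {
      data: Buffer.from(JSON.stringify(response, null, 2)).toString('base64'),
      mimeType: 'application/json',
      fileName: 'result.json',
    },
  },
});

// One Markdown file per parsed page (assumed response shape).
for (const page of response.pages ?? []) {
  const name = `page-${page.index}.md`;
  out.push({
    json: { fileName: name },
    binary: {
      data: {
        data: Buffer.from(page.markdown ?? '').toString('base64'),
        mimeType: 'text/markdown',
        fileName: name,
      },
    },
  });
}
return out;
```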
by Corentin Ribeyre
This template can be used for real-time listening and processing of search results with Icypeas. Be sure to have an active account to use this template.

## How it works

This workflow can be divided into two steps:

1. A Webhook node to link your Icypeas account with your n8n workflow.
2. A Set node to retrieve the relevant information (a Code-node equivalent is sketched below).

## Set up steps

You will need a working Icypeas account to run the workflow, and you will have to paste the production URL provided by the n8n Webhook node into your Icypeas account.
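For reference, the Set node's extraction could also be done in a Code node. The field names below are purely illustrative; check the actual JSON Icypeas posts to your webhook and adjust accordingly:

```javascript
// n8n Code node – pull the relevant fields out of the webhook payload.
const body = $input.first().json.body ?? {};
return [
  {
    json: {
      searchId: body.id,        // assumed field
      status: body.status,      // assumed field
      results: body.results,    // assumed field
      receivedAt: new Date().toISOString(),
    },
  },
];
```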
by Rahul Joshi
## Description

Automate B2B order invoicing by fetching orders from Airtable, validating paid B2B entries, creating Stripe customers and invoices, finalizing invoices, and logging structured invoice data into Google Sheets. This workflow ensures seamless B2B billing and centralized record-keeping, and reduces manual errors in financial operations. ⚡💳📊

## What This Template Does

- Triggers hourly to check for new B2B orders. ⏱️
- Fetches order data from Airtable (Orders table). 📥
- Filters only paid orders with the “B2B” tag. ✅
- Creates a corresponding Stripe customer from the order details. 👤
- Processes order line items for invoicing. 📦
- Creates a Stripe invoice with a due date and payment terms. 🧾
- Finalizes the invoice automatically. ✔️
- Formats invoice details (totals, due dates, customer info, links). 🔄
- Logs structured invoice data into Google Sheets for tracking. 📊

## Key Benefits

- Fully automates the B2B invoicing workflow from orders to finalized invoices. 🔄
- Ensures all invoices are linked, structured, and logged in Sheets. 🧾
- Reduces manual effort and eliminates data entry errors. ⚡
- Maintains centralized invoice tracking for finance teams. 📂
- Creates a consistent billing flow integrated with Stripe. 💳

## Features

- **Hourly Trigger** – Continuously monitors Airtable for new/updated orders.
- **Airtable Integration** – Fetches order details automatically.
- **Conditional Filter** – Processes only “B2B” paid orders.
- **Stripe Customer Creation** – Automatically creates customers in Stripe.
- **Line Item Processor** – Handles Shopify/order line items or test data (sketched at the end of this description).
- **Stripe Invoice Creation** – Generates draft invoices with due dates.
- **Invoice Finalization** – Auto-finalizes and prepares invoices for payment.
- **Data Formatter** – Structures invoice info (totals, links, dates, status).
- **Google Sheets Integration** – Logs all invoice data for reporting.

## Requirements

- n8n instance (cloud or self-hosted).
- Airtable Personal Access Token with read access to the Orders table.
- Stripe API credentials with customer + invoice permissions.
- Google Sheets OAuth2 credentials with read/write access.

## Target Audience

- Finance/ops teams handling B2B customer invoicing. 💼
- SaaS or eCommerce businesses with B2B order flows. 🛍️
- Startups needing automated billing + centralized reporting. 🚀
- Teams tracking Stripe invoices inside Google Sheets. 📊

## Step-by-Step Setup Instructions

1. Connect Airtable credentials and replace the base/table IDs with your own. 🔑
2. Configure Stripe API credentials for invoice + customer creation. 💳
3. Link Google Sheets credentials and update the target sheet ID. 📊
4. Adjust the order filtering conditions (tags, payment status) as needed. ⚙️
5. Test with sample data to validate that invoices are created and logged. ✅
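A minimal sketch of the **Line Item Processor** step in an n8n Code node. The incoming field names (`lineItems`, `price`, `quantity`, etc.) are assumptions; map them to your actual Airtable column names:

```javascript
// n8n Code node – normalize order line items into Stripe invoice-item payloads.
const order = $input.first().json;
const lineItems = order.lineItems ?? []; // assumed field name

return lineItems.map((li) => ({
  json: {
    customerEmail: order.email,
    description: li.title ?? 'Line item',
    quantity: Number(li.quantity ?? 1),
    // Stripe expects amounts in the smallest currency unit (e.g. cents).
    unitAmount: Math.round(Number(li.price ?? 0) * 100),
    currency: (order.currency ?? 'usd').toLowerCase(),
  },
}));
```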
by Grace Gbadamosi
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works

This workflow automatically synchronizes contact data from multiple CRM systems (HubSpot, Pipedrive, and Salesforce) into a unified Google Sheets database. The system runs on a daily schedule or can be triggered manually via webhook. It uses AI-powered data processing to intelligently deduplicate records, calculate data quality scores, and merge similar contacts. The workflow generates comprehensive quality reports and sends notifications via Slack, ensuring your master database stays clean and up-to-date.

## Who is this for

This template is designed for revenue operations teams, data managers, and business analysts who need to consolidate customer data from multiple CRM platforms. It's particularly valuable for organizations using multiple sales tools simultaneously or during CRM migration projects. The workflow helps maintain data integrity while providing insights into data quality across different systems.

## Requirements

- **Google Sheets Account**: For storing the master database and quality reports
- **CRM Platform Access**: API credentials for HubSpot, Pipedrive, and/or Salesforce
- **OpenAI API Key**: For AI-powered data processing and deduplication
- **MCP Server**: GitHub: marlonluo2018/pandas-mcp-server (optional – can be replaced with a Code node, as sketched below)
- **Slack Webhook**: For receiving sync completion and error notifications

## How to set up

1. **Configure Environment Variables** – Set up secure credential storage for all API keys: HUBSPOT_API_KEY, PIPEDRIVE_API_KEY, SALESFORCE_ACCESS_TOKEN, OPENAI_API_KEY, and SLACK_WEBHOOK_URL.
2. **Create Google Sheets Structure** – Create a master Google Sheet with two tabs: "Master_CRM_Data" for the unified contact database and "Quality_Reports" for tracking sync statistics and data quality metrics.
3. **Set Up MCP Server** – Install and configure marlonluo2018/pandas-mcp-server.
4. **Update Configuration Center** – Replace all placeholder values in the Configuration Center node with your actual Google Sheet IDs, quality thresholds, deduplication keys, and batch processing settings.
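If you prefer not to run the MCP server, a Code node can cover basic deduplication and quality scoring. A minimal sketch, assuming contacts arrive as one item each with illustrative field names (`email`, `name`, `phone`, `company`):

```javascript
// n8n Code node – deduplicate contacts by email and score data completeness.
const byEmail = new Map();

for (const item of $input.all()) {
  const c = item.json;
  const key = (c.email ?? '').trim().toLowerCase();
  if (!key) continue; // skip records without a deduplication key

  // Score = share of filled core fields, a crude data quality metric.
  const fields = [c.name, c.email, c.phone, c.company];
  const score = fields.filter(Boolean).length / fields.length;

  // Keep the most complete record per email address.
  const existing = byEmail.get(key);
  if (!existing || score > existing.json.quality_score) {
    byEmail.set(key, { json: { ...c, quality_score: score } });
  }
}

return [...byEmail.values()];
```

This handles exact-match dedup only; the AI step in the workflow remains responsible for fuzzy merging of similar contacts.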
by Omer Fayyaz
# Efficient Loop-less n8n Workflow Backup Automation to Google Drive

This workflow eliminates traditional loop-based processing entirely, delivering consistent performance and reliability even when the number of workflows to be processed is large.

## What Makes This Different

- **No SplitInBatches node** – Traditional workflows use loops to process workflows one by one
- **No individual file uploads** – No more multiple Google Drive API calls
- **No batch error handling** – Eliminates complex loop-iteration error management
- **All workflows processed simultaneously** – A single-operation approach
- **Single compressed archive** – One ZIP file containing all workflows
- **One Google Drive upload** – Maximum efficiency, minimum API usage

## Key Benefits of the Non-Loop Architecture

- **3–5x faster execution** – Eliminates loop overhead and multiple API calls
- **Reduced API costs** – A single upload instead of dozens of individual operations
- **Higher reliability** – Fewer failure points, centralized error handling
- **Better scalability** – Performance doesn't degrade with workflow count
- **Large workflow support** – Efficiently handles hundreds of workflows without performance degradation
- **Easier maintenance** – Simpler workflow structure, easier debugging
- **Cleaner monitoring** – A single success/failure point instead of loop iterations

## Who's it for

This template is designed for n8n users, DevOps engineers, system administrators, and IT professionals who need reliable, automated backup solutions for their n8n workflows. It's perfect for businesses and individuals who want to ensure their workflow automation investments are safely backed up with intelligent scheduling, compression, and cloud storage integration.

## How it works / What it does

This workflow creates an intelligent, automated backup system that turns n8n workflow backups from inefficient multi-file operations into streamlined single-archive automation. The system:

1. Triggers automatically every 4 hours, or manually on demand
2. Creates timestamped folders in Google Drive for organized backup storage
3. Retrieves all n8n workflows via the n8n API in a single operation
4. Converts the workflows to JSON and aggregates the binary data efficiently (a Code-node sketch of this step appears at the end of this description)
5. Compresses all workflows into a single ZIP archive (eliminating the need for loops)
6. Uploads the compressed backup to Google Drive in one operation
7. Provides real-time Slack notifications for monitoring and alerting

**Key innovation: no loops required.** Unlike traditional backup workflows that use SplitInBatches or loops to process workflows individually, this system processes all workflows simultaneously and creates a single compressed archive, dramatically improving performance and reliability.

## How to set up

1. **Configure Google Drive API credentials**
   - Set up Google Drive OAuth2 API credentials
   - Ensure the service account has access to create folders and upload files
   - Update the parent folder ID where backup folders will be created
2. **Configure n8n API access**
   - Set up internal n8n API credentials for workflow retrieval
   - Ensure the API has permission to read all workflows
   - Configure retry settings for reliability
3. **Set up Slack notifications**
   - Configure Slack API credentials for the notification channel
   - Update the channel ID where backup notifications will be sent
   - Customize the notification messages as needed
4. **Schedule configuration**
   - The workflow automatically runs every 4 hours
   - Manual execution is available for immediate backups
   - Adjust the schedule in the Schedule Trigger node as needed
5. **Test the integration**
   - Run a manual backup to verify all components work correctly
   - Check Google Drive for the created backup folder and ZIP file
   - Verify that Slack notifications are received

## Requirements

- **n8n instance** (self-hosted or cloud) with API access
- **Google Drive account** with API access and sufficient storage
- **Slack workspace** for notifications (optional but recommended)
- **n8n workflows** that need regular backup protection

## How to customize the workflow

- **Modify backup frequency**
  - Adjust the Schedule Trigger node for different intervals (hourly, daily, weekly)
  - Add multiple schedule triggers for different backup types
  - Implement conditional scheduling based on workflow changes
- **Enhance the storage strategy**
  - Add multiple Google Drive accounts for redundancy
  - Implement backup rotation and retention policies
  - Add compression options (ZIP, TAR, 7Z) for different use cases
- **Expand the notification system**
  - Add email notifications for critical backup events
  - Integrate with monitoring systems (PagerDuty, OpsGenie)
  - Add backup success/failure metrics and reporting
- **Security enhancements**
  - Implement backup encryption before upload
  - Add backup verification and integrity checks
  - Set up access logging and audit trails
- **Performance optimizations**
  - Add parallel processing for large workflow collections
  - Implement incremental backup strategies
  - Add backup size monitoring and alerts

## Key Features

- **Zero-loop architecture** – Processes all workflows simultaneously without batch processing
- **Intelligent compression** – A single ZIP archive instead of multiple individual files
- **Automated scheduling** – Runs every 4 hours with manual override capability
- **Organized storage** – Timestamped folders with clear naming conventions
- **Real-time monitoring** – Slack notifications for all backup events
- **Error handling** – Centralized error management with graceful failure handling
- **Scalable design** – Handles any number of workflows efficiently

## Technical Architecture Highlights

**Eliminated inefficiencies**

- **No SplitInBatches node** – Replaced with direct workflow processing
- **No individual file uploads** – A single compressed archive upload
- **No loop iterations** – All workflows processed in one operation
- **No batch error handling** – Centralized error management

**Performance improvements**

- **Faster execution** – Eliminates loop overhead and multiple API calls
- **Reduced API quota usage** – A single Google Drive upload per backup
- **Better resource utilization** – Efficient memory and processing usage
- **Improved reliability** – Fewer points of failure in the workflow

**Data flow optimization**

- **Parallel processing** – Folder creation and workflow retrieval happen simultaneously
- **Efficient aggregation** – A Code node processes all binaries at once
- **Smart compression** – A single ZIP with all workflows included
- **Streamlined upload** – One file transfer instead of multiple operations

## Use Cases

- **Production n8n instances** requiring reliable backup protection
- **Development teams** needing workflow version control and recovery
- **DevOps automation** requiring disaster recovery capabilities
- **Business continuity** planning for critical automation workflows
- **Compliance requirements** for workflow documentation and backup
- **Team collaboration** with shared workflow backup access

## Business Value

- **Risk mitigation** – Protects valuable automation investments
- **Operational efficiency** – Faster, more reliable backup processes
- **Cost reduction** – Lower storage costs and API usage
- **Compliance support** – Organized backup records for audits
- **Team productivity** – Reduced backup management overhead
- **Scalability** – Handles growth without performance degradation

This template streamlines n8n workflow backup by eliminating the complexity and inefficiency of traditional loop-based approaches, providing a robust, scalable solution that grows with your automation needs while maintaining a high level of reliability and performance.
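A minimal sketch of the aggregation step referenced above, assuming the n8n API node outputs one item per workflow. It packs every workflow as a JSON file onto a single item so one Compression node can zip them all in one pass (property names like `workflow_0` are illustrative):

```javascript
// n8n Code node – collect all workflows into binary properties on one item.
const binary = {};

$input.all().forEach((item, i) => {
  const wf = item.json;
  // Build a filesystem-safe file name from the workflow name.
  const safeName = (wf.name ?? `workflow-${i}`).replace(/[^\w.-]+/g, '_');
  binary[`workflow_${i}`] = {
    data: Buffer.from(JSON.stringify(wf, null, 2)).toString('base64'),
    mimeType: 'application/json',
    fileName: `${safeName}.json`,
  };
});

// A single item carrying all files – feed it into a Compression node
// (operation: compress) to produce one ZIP archive for the single upload.
return [{ json: { fileCount: Object.keys(binary).length }, binary }];
```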
by Alexandra Spalato
## Who's it for

This workflow is for community builders, marketers, consultants, coaches, and thought leaders who want to grow their presence in Skool communities through strategic, value-driven engagement. It's especially useful for professionals who want to:

- Build authority in their niche by providing helpful insights
- Scale their community engagement without spending hours manually browsing posts
- Identify high-value conversation opportunities that align with their expertise
- Maintain an authentic, helpful presence across multiple Skool communities

## What problem is this workflow solving

Many professionals struggle to consistently engage meaningfully in online communities due to:

- **Time constraints**: Manually browsing multiple communities daily is time-consuming
- **Missed opportunities**: Important discussions happen when you're not online
- **Inconsistent engagement**: Sporadic participation reduces visibility and relationship building
- **Generic responses**: Quick replies often lack the depth needed to showcase expertise

This workflow solves these problems by automatically monitoring your target Skool communities, using AI to identify posts where your expertise could add genuine value, generating thoughtful contextual comment suggestions, and organizing opportunities for efficient manual review and engagement.

## How it works

**Scheduled Community Monitoring**
Runs daily at 7 PM to scan your configured Skool communities for new posts and discussions from the last 24 hours.

**Intelligent Configuration Management**

- Pulls settings from Airtable, including target communities, your domain expertise, and preferred tools
- Supports several configurations at once
- Filters for active configurations only
- Processes multiple community URLs efficiently

**Comprehensive Data Extraction**
Uses the Apify Skool Scraper to collect:

- Post content and metadata
- Comments over 50 characters (quality filter)
- Direct links for easy access

**AI-Powered Opportunity Analysis**
Leverages OpenAI GPT-4.1 to:

- Analyze each post for engagement opportunities based on your expertise
- Identify specific trigger sentences that indicate a need you can address
- Generate contextual, helpful comment suggestions
- Maintain an authentic tone without being promotional

**Smart Filtering and Organization**

- Only surfaces genuine opportunities where you can add value (a validation sketch appears at the end of this description)
- Stores results in Airtable with detailed reasoning
- Provides suggested comments ready for review and posting
- Tracks engagement history to avoid duplicate responses

**Quality Control and Review**
All opportunities are saved to Airtable where you can:

- Review the AI reasoning and suggested responses
- Edit comments before posting
- Track which opportunities you've acted on
- Monitor success patterns over time

## How to set up

### Required credentials

- **OpenAI API key** – For GPT-4.1-powered opportunity analysis
- **Airtable Personal Access Token** – For configuration and results storage
- **Apify API token** – For Skool community scraping

### Airtable base setup

Create an Airtable base with two tables:

**Config Table (config):**

- Name (Single line text): Your configuration name
- Skool URLs (Long text): Comma-separated list of Skool community URLs
- cookies (Long text): Your Skool session cookies for authenticated access
- Domain of Activity (Single line text): Your area of expertise (e.g., "AI automation", "Digital marketing")
- Tools Used (Single line text): Your preferred tools to recommend (e.g., "n8n", "Zapier")
- active (Checkbox): Whether this configuration is currently active

**Results Table (Table 1):**

- title (Single line text): Post title/author
- url (URL): Direct link to the post
- reason (Long text): The AI's reasoning for the opportunity
- trigger (Long text): The specific sentence that triggered the opportunity
- suggested answer (Long text): The AI-generated comment suggestion
- config (Link to another record): Reference to the config used
- date (Date): When the opportunity was found
- Select (Single select): Status tracking (not commented/commented)

### Skool cookies setup

To access private Skool communities, you'll need to:

1. **Install Cookie Editor**: Go to the Chrome Web Store and install the "Cookie Editor" extension
2. **Login to Skool**: Navigate to any Skool community you want to monitor and log in
3. **Open Cookie Editor**: Click the Cookie Editor extension icon in your browser toolbar
4. **Export cookies**: Click the "Export" button in the extension and copy the exported text
5. **Add to Airtable**: Paste the cookie string into the cookies field in your Airtable config

### Trigger configuration

- Ensure the Schedule Trigger is set to your preferred monitoring time
- The default is 7 PM daily, but adjust based on your target communities' peak activity

## Requirements

- **Self-hosted n8n or n8n Cloud account**
- **Active Skool community memberships** – You must be a legitimate member of the communities you want to monitor
- **OpenAI API credits**
- **Apify subscription** – For reliable Skool data scraping (free tier available)
- **Airtable account** – The free tier is sufficient for most use cases

## How to customize the workflow

**Modify AI analysis criteria**
Edit the EvaluateOpportunities And Generate Comments node to:

- Adjust the opportunity detection sensitivity
- Modify the comment tone and style
- Add industry-specific keywords or phrases

**Change monitoring frequency**
Update the Schedule Trigger to:

- Multiple times per day for highly active communities
- Weekly for slower-moving professional groups
- Custom intervals based on community activity patterns

**Customize data collection**
Modify the Apify scraper settings to:

- Adjust the time window (currently 24 hours)
- Change the comment length filter (currently >50 characters)
- Include/exclude media content
- Modify the number of comments per post

**Add additional filters**
Insert Filter nodes to:

- Skip posts from specific users
- Focus on posts with minimum engagement levels
- Exclude certain post types or keywords
- Prioritize posts from influential community members

**Enhance output options**
Add nodes after Record Results to:

- Send Slack/Discord notifications for high-priority opportunities
- Create calendar events for engagement tasks
- Export daily summaries to Google Sheets
- Integrate with CRM systems for lead tracking

## Example outputs

Opportunity analysis result:

```json
{
  "opportunity": true,
  "reason": "The user is struggling with manual social media management tasks that could be automated using n8n workflows.",
  "trigger_sentence": "I'm spending 3+ hours daily just scheduling posts and responding to comments across all my social accounts.",
  "suggested_comment": "That sounds exhausting! Have you considered setting up automation workflows? Tools like n8n can handle the scheduling and even help with response suggestions, potentially saving you 80% of that time. The initial setup takes a day but pays dividends long-term."
}
```

Airtable record example:

- Title: "Sarah Johnson - Social Media Burnout"
- URL: https://www.skool.com/community/post/123456
- Reason: "User expressing pain point with manual social media management - perfect fit for automation solutions"
- Trigger: "I'm spending 3+ hours daily just scheduling posts..."
- Suggested Answer: "That sounds exhausting! Have you considered setting up automation workflows?..."
- Config: [Your Config Name]
- Date: 2024-12-09 19:00:00
- Status: "not commented"

## Best practices

**Authentic engagement**

- Always review and personalize AI suggestions before posting
- Focus on being genuinely helpful rather than promotional
- Share experiences and ask follow-up questions
- Engage in the subsequent conversation when people respond

**Community guidelines**

- Respect each community's rules and culture
- Avoid over-promoting your tools or services
- Build relationships before introducing solutions
- Contribute value consistently, not just when selling

**Optimization tips**

- Monitor which types of opportunities convert best
- A/B test different comment styles and approaches
- Track engagement metrics on your actual comments
- Adjust the AI prompts based on community feedback
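A minimal sketch of validating the AI's verdict (the JSON shown under Example outputs) before it reaches the Record Results step. The JSON keys match the example above; the input field names (`output`, `postTitle`, `postUrl`) are assumptions:

```javascript
// n8n Code node – keep only well-formed, positive opportunities.
const results = [];
for (const item of $input.all()) {
  let verdict;
  try {
    verdict = JSON.parse(item.json.output); // raw LLM response (assumed field)
  } catch (e) {
    continue; // skip malformed model output rather than polluting the table
  }
  if (verdict.opportunity !== true) continue; // genuine opportunities only

  results.push({
    json: {
      title: item.json.postTitle,            // assumed field
      url: item.json.postUrl,                // assumed field
      reason: verdict.reason,
      trigger: verdict.trigger_sentence,
      'suggested answer': verdict.suggested_comment,
      date: new Date().toISOString(),
    },
  });
}
return results;
```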
by Rahul Joshi
## 📊 Description

Streamline sales prioritization by automatically identifying, scoring, and routing high-value leads from GoHighLevel CRM to your sales team. This workflow scores contacts daily, flags top prospects, alerts sales reps in Slack, logs data to Google Sheets, and schedules instant follow-ups in Google Calendar, ensuring no valuable lead slips through the cracks. 🚀📈

## What This Template Does

- Triggers daily at 8:00 AM to fetch all contacts from GoHighLevel CRM. ⏰
- Processes lead data and extracts key details from custom fields (sketched at the end of this description). 🧩
- Calculates lead scores using your predefined CRM field mappings. 🔢
- Filters out incomplete or invalid contacts to ensure a clean data flow. 🧼
- Identifies high-value leads with a score above 80 for immediate attention. 🎯
- Sends real-time Slack alerts to sales teams with contact and lead score details. 💬
- Logs high-priority leads into a dedicated Google Sheet for tracking and analytics. 📊
- Creates automatic Google Calendar follow-up events within 1 hour of detection. 📅

## Key Benefits

- ✅ Automatically surfaces top leads for faster follow-up
- ✅ Keeps sales teams aligned through instant Slack alerts
- ✅ Eliminates manual data review and prioritization
- ✅ Centralizes performance tracking via Google Sheets
- ✅ Ensures consistent follow-up with Google Calendar scheduling
- ✅ Fully customizable lead score threshold and timing

## Features

- Daily scheduled trigger (8:00 AM)
- GoHighLevel CRM integration for contact retrieval
- Smart lead scoring via custom field mapping
- Conditional filtering for high-value leads
- Slack alert system for real-time engagement
- Google Sheets logging for transparency and analytics
- Auto-created Google Calendar events for follow-ups

## Requirements

- GoHighLevel API credentials with contact read permissions
- Slack Bot token with chat:write access
- Google Sheets OAuth2 credentials
- Google Calendar OAuth2 credentials
- Defined custom fields for Lead Score and Assigned Representative in GoHighLevel

## Target Audience

- Sales and business development teams tracking high-value leads
- Marketing teams optimizing lead qualification and follow-up
- Agencies using GoHighLevel for CRM and lead management
- Operations teams centralizing sales activity and analytics

## Step-by-Step Setup Instructions

1. Connect your GoHighLevel OAuth2 credentials and ensure contact API access.
2. Replace the placeholder custom field IDs (Lead Score & Assigned Rep) in the Code node.
3. Add your Slack channel ID for team notifications.
4. Connect your Google Sheets document and replace its Sheet ID in the workflow.
5. Link Google Calendar for automatic follow-up event creation.
6. Adjust the lead score threshold (default: 80) if needed.
7. Run a manual test to verify the data flow, then enable the daily trigger for automation.
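A minimal sketch of the Code node that extracts and scores leads from custom fields. The `customFields` array shape is an assumption (GoHighLevel's payload differs between API versions), and the field IDs are the placeholders the setup steps ask you to replace:

```javascript
// n8n Code node – extract lead score and assigned rep from GHL custom fields.
const LEAD_SCORE_FIELD_ID = 'YOUR_LEAD_SCORE_FIELD_ID';     // placeholder
const ASSIGNED_REP_FIELD_ID = 'YOUR_ASSIGNED_REP_FIELD_ID'; // placeholder
const THRESHOLD = 80; // lead score threshold, adjustable per the setup steps

const results = [];
for (const item of $input.all()) {
  const contact = item.json;
  const custom = contact.customFields ?? []; // assumed payload shape
  const field = (id) => custom.find((f) => f.id === id)?.value;

  const leadScore = Number(field(LEAD_SCORE_FIELD_ID) ?? 0);
  if (!contact.email || Number.isNaN(leadScore)) continue; // drop incomplete contacts

  results.push({
    json: {
      name: [contact.firstName, contact.lastName].filter(Boolean).join(' '),
      email: contact.email,
      leadScore,
      assignedRep: field(ASSIGNED_REP_FIELD_ID) ?? 'Unassigned',
      isHighValue: leadScore > THRESHOLD,
    },
  });
}
return results;
```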
by Aryan Shinde
# Instagram Reel Downloader & Logger

Automate Instagram Reel downloads, storage, and activity logging.

## What does this workflow do?

- Handles incoming webhook requests (ideal for Instagram/Facebook API triggers).
- Validates the webhook via challenge-response and a custom verify token.
- Checks for messages from yourself (filtering out automated/self-triggered runs).
- Downloads Instagram Reels from URLs posted to the webhook.
- Uploads the reel to Google Drive and retrieves the download URL.
- Logs reel details (status, URL, and timestamp) to a Google Sheet for record-keeping.
- Notifies you on Telegram with the download details and the Google Drive link.

## How does it work?

1. **Webhook**: Listens for new messages/events (custom webhook endpoint for Meta).
2. **Validation**: Confirms the webhook subscribe/challenge and verify token from the Meta API.
3. **Sender Check**: Ignores messages unless they match your configured sender/recipient.
4. **Download Reel**: Fetches the reel/attachment from Instagram using the received URLs.
5. **Timestamp Gen**: Adds a precise timestamp and an ISO-based unique ID to the activity log (sketched at the end of this description).
6. **Upload to Drive**: Saves the downloaded reel in a preset Google Drive folder.
7. **Log to Sheet**: Updates a Google Sheet with the reel’s status, URL, and timestamp.
8. **Telegram Alert**: Instantly notifies you when a new reel is downloaded and logged.

## What do I need to make this work?

- A registered webhook endpoint (from your Meta/Instagram app configuration).
- A Google Drive and Google Sheets account (OAuth2-connected to n8n).
- A Telegram bot and chat ID set up to receive download completion messages.
- The verify_token in your webhook event source must match your template (‘youtube-automation-n8n-token’ by default).
- Drive/Sheet/bot credentials updated for your n8n instance’s environment.

## Why use this?

- Fully automates the collection and archival of Instagram Reels.
- Centralizes content download, backup, and activity records for your automation flows.
- Provides instant monitoring and archival of each event.

## Setup Tips

- Make sure your webhook path and Meta app configuration match (/n8n-template-insta-webhook).
- Double-check the Google credentials and the sheet’s tab IDs/names.
- Replace the Telegram and Google connection credentials with your own securely.
- Use this as a foundation for any Instagram/Facebook-based automations in n8n, and customize it as your automation stack evolves.

This template saves time, automates digital content management, and notifies you in real time.
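A minimal sketch of the **Timestamp Gen** step in an n8n Code node. The output field names (`timestamp`, `logId`) are illustrative:

```javascript
// n8n Code node – stamp each item with an ISO timestamp and a unique ID
// before it is logged to the sheet.
return $input.all().map((item) => {
  const iso = new Date().toISOString();
  return {
    json: {
      ...item.json,
      timestamp: iso,
      // e.g. "reel-2024-12-09T19-00-00-000Z" – sortable and unique per run
      logId: `reel-${iso.replace(/[:.]/g, '-')}`,
    },
  };
});
```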
by takuma
This workflow automates reputation management for physical stores (restaurants, retail, clinics) by monitoring Google Maps reviews, analyzing them with AI, and drafting professional replies. It acts as a 24/7 customer support assistant, ensuring you never miss a negative review and saving hours of manual writing time.

## Who is this for?

- **Store Managers & Owners:** Keep track of customer sentiment without manually checking Google Maps every day.
- **Marketing Agencies:** Automate local SEO reporting and response drafting for multiple clients.
- **Customer Support Teams:** Get instant alerts for negative feedback to resolve issues quickly.

## How it works

1. **Schedule:** Runs every 24 hours (customizable) to fetch the latest data.
2. **Scrape:** Uses Apify to retrieve the latest reviews from a specific Google Maps URL.
3. **Filter:** Checks the Google Sheet database to identify only new reviews and avoid duplicates (sketched at the end of this description).
4. **AI Analysis:** An AI Agent (via OpenRouter/OpenAI) analyzes the review text to:
   - Generate a short summary.
   - Draft a polite, context-aware reply based on the star rating (e.g., apologies for low stars, gratitude for high stars).
5. **Alert:** Sends a Slack notification.
   - Low rating (<4 stars): Alerts a specific channel (e.g., #customer-support) with a warning.
   - High rating: Alerts a general channel (e.g., #wins) to celebrate.
6. **Save:** Appends the review details, AI summary, and draft reply to the Google Sheet.

## Requirements

- **n8n:** Cloud or self-hosted (v1.0+).
- **Apify account:** To run the Google Maps Reviews Scraper.
- **Google Cloud Platform:** Google Sheets API enabled.
- **Slack workspace:** A webhook URL or OAuth connection.
- **OpenRouter (or OpenAI) API key:** For the LLM generation.

## How to set up

1. **Google Sheets:** Create a new sheet with the following headers in the first row: reviewId, publishedAt, reviewerName, stars, text, ai_summary, ai_reply, reviewUrl, output, publishedAt date.
2. **Configure credentials:** Set up your accounts for Google Sheets, Apify, Slack, and OpenRouter within n8n.
3. **Edit the "CONFIG" node:**
   - MAPS_URL: Paste the full Google Maps link to your store.
   - SHEET_ID: Paste the ID found in your Google Sheet URL.
   - SHOP_NAME: Your store's name.
4. **Slack nodes:** Select the appropriate channels for positive and negative alerts.

## How to customize

- **Change the AI persona:** Open the **AI Agent** node and modify the "System Message" to match your brand's tone of voice (e.g., casual, formal, or witty).
- **Adjust alert thresholds:** Edit the **If Rating < 4** node to change the criteria for what constitutes a "negative" review (e.g., strictly < 3 stars).
- **Multi-store support:** Loop this workflow over a list of URLs to manage multiple locations in a single execution.
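A minimal sketch of the duplicate filter in an n8n Code node, assuming an upstream Google Sheets node named `Get Sheet Rows` (node name illustrative) has already read the existing rows with their `reviewId` column:

```javascript
// n8n Code node – keep only reviews not already logged in the Google Sheet.
const knownIds = new Set(
  $('Get Sheet Rows').all().map((row) => String(row.json.reviewId)),
);

return $input.all().filter(
  (item) => !knownIds.has(String(item.json.reviewId)),
);
```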
by Madame AI
# Product Review Analysis with BrowserAct & Gemini-Powered Recommendations

This n8n template demonstrates how to perform product review sentiment analysis and generate improvement recommendations using an AI Agent. This workflow is perfect for e-commerce store owners, product managers, or marketing teams who want to automate the process of collecting feedback and turning it into actionable insights.

## How it works

1. The workflow is triggered manually.
2. An HTTP Request node initiates a web scraping task with the BrowserAct API to collect product reviews.
3. A series of If and Wait nodes check the status of the scraping task. If the task is not yet complete, the workflow pauses and retries until it receives the full dataset (a sketch of the status check follows this description).
4. An AI Agent node, powered by Google Gemini, then processes the scraped review summaries. It analyzes the sentiment of each review and generates actionable improvement recommendations.
5. Finally, the workflow sends these detailed recommendations via a Telegram message and an email to the relevant stakeholders.

## Requirements

- **BrowserAct** API account for web scraping
- **BrowserAct** "Product Review Sentiment Analysis" template
- **Gemini** account for the AI Agent
- **Telegram** and **SMTP** credentials for sending messages

## Need help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase

How to INSTANTLY Get Product Improvement Ideas from Amazon Reviews | BrowserAct + n8n + Gemini
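For reference, the completion check that drives the If node can be expressed as a small Code node. The `status` field and its values are assumptions; check the actual response of your BrowserAct task-status endpoint:

```javascript
// n8n Code node – flag whether the scraping task is finished, so a
// downstream If node can either continue or route back through a Wait node.
const task = $input.first().json;
const done = ['completed', 'finished', 'success'].includes(
  String(task.status ?? '').toLowerCase(), // assumed field and values
);
return [{ json: { ...task, done } }];
```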
by Trung Tran
# AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)

Watch the demo video below:

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging, just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video’s SEO boost.

## Who’s it for

- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

## How it works / What it does

1. **Weekly Schedule Trigger** – Runs the workflow once per week.
2. **Get all videos uploaded last week** – Queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail** – Retrieves each video’s title, description, and ID.
4. **YouTube Video Auto Tagging Agent (LLM)** – Takes video.title, video.description, and channelName as inputs, and uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags** – Writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message** – Posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

## How to set up

1. **YouTube connection**
   - Create a Google Cloud project and enable YouTube Data API v3.
   - Configure an OAuth client (Web app / Desktop as required).
   - Authorize with the Google account that manages the channel.
   - In your automation platform, add the YouTube credential and grant the scopes (see Requirements).
2. **Slack connection**
   - Create or use an existing Slack app/bot.
   - Install it to your workspace and capture the bot token.
   - Add the Slack credential in your automation platform.
3. **LLM / Chat Model**
   - Select your model (e.g., OpenAI GPT).
   - Paste the system prompt (SEO expert) and the user prompt template.
   - Inputs: {{video_title}}, {{video_description}}, {{channel_name}}.
   - Output: a comma-separated list of 15–20 tags (no #, no duplicates).
4. **Node configuration**
   - Weekly Schedule Trigger: choose the day/time (e.g., Mondays 09:00 local).
   - Get all videos uploaded last week: date filter = now() - 7 days.
   - Get video detail: map each video ID from the previous node.
   - Agent node: map the fields to the prompt variables.
   - Update video: map the agent’s tag string to the YouTube tags field.
   - Slack message: The video "{{video_title}} - {{video_id}}" has been auto-tagged successfully. Tags: {{tags}}
5. **Test run**
   - Manually run the workflow with one recent video.
   - Verify the tags appear in YouTube Studio and the Slack message posts.

## Requirements

**APIs & scopes**

- **YouTube Data API v3**
  - youtube.readonly (to list videos / details)
  - youtube or youtube.force-ssl (to update video metadata, including tags)
- **Slack bot token scopes**
  - chat:write (post messages)
  - channels:read or groups:read if selecting channels dynamically (optional)

**Platform**

- Access to a chat/LLM provider (e.g., OpenAI).
- Outbound HTTPS allowed.

**Rate limits & quotas**

- YouTube updates consume quota, and tag updates are write operations, so avoid re-writing unchanged tags.
- Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

## How to customize the workflow

- **Schedule:** Switch to daily, or run on publish events instead of weekly.
- **Filtering:** Process only videos matching rules (e.g., title contains “tutorial”, or missing tags).
- **Prompt tuning:**
  - Add brand keywords to always include (e.g., “WiseStack AI”).
  - Constrain to a language (e.g., “Vietnamese tags only”).
  - Enforce a max of 500 characters total for tags if you want a stricter cap.
- **Safety guardrails:**
  - Validate the model output: split by comma, trim whitespace, dedupe, drop empty/over-long tags (see the sketch below).
  - If the agent fails, fall back to a heuristic generator (title/keyword extraction).
- **Change log:** Write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** Send tags to Slack as buttons (“Apply / Edit / Skip”) before updating YouTube.
- **Multi-channel support:** Loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** Add error Slack messages for failed API calls; summarize weekly results.

Tip: Keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
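A minimal sketch of the tag sanitizer described in the guardrails, as an n8n Code node placed right after the agent. The input field name `tags` and the per-tag length cap are assumptions; the 500-character total cap follows the stricter option suggested above:

```javascript
// n8n Code node – sanitize the agent's comma-separated tag string before
// writing it back to YouTube: split, trim, strip '#', dedupe, drop
// empty/over-long tags, and enforce an optional total-length cap.
const raw = $input.first().json.tags ?? ''; // assumed field name
const MAX_TAG_LEN = 100; // per-tag cap (illustrative)
const MAX_TOTAL = 500;   // optional stricter total cap from the notes above

const seen = new Set();
const tags = [];
let total = 0;

for (const t of raw.split(',')) {
  const tag = t.trim().replace(/^#/, '');
  const key = tag.toLowerCase();
  if (!tag || tag.length > MAX_TAG_LEN || seen.has(key)) continue;
  if (total + tag.length > MAX_TOTAL) break;
  seen.add(key);
  tags.push(tag);
  total += tag.length;
}

return [{ json: { tags } }];
```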