by CustomJS
n8n Workflow: Automating Website Screenshots from Google Sheets

This n8n workflow captures screenshots of websites listed in a Google Sheet and saves them to Google Drive using the CustomJS PDF Toolkit (@custom-js/n8n-nodes-pdf-toolkit).

Features
- **Monitors** a Google Sheet for new rows with website URLs.
- **Captures** screenshots of the websites using the CustomJS PDF Toolkit.
- **Uploads** the screenshots to a specified Google Drive folder.

Notice
Community nodes can only be installed on self-hosted instances of n8n.

Requirements
- **Self-hosted** n8n instance
- A Google Sheets document containing website URLs and titles
- A Google Drive folder to store the screenshots
- A CustomJS API key for website screenshots
- **n8n credentials** for Google Sheets and Google Drive

Workflow Steps
1. Google Sheets Trigger
   - Monitors a specified sheet for new rows.
   - Extracts the URL and Title from the row.
2. Website Screenshot Node
   - Uses the CustomJS PDF Toolkit to take a screenshot of the given URL.
3. Google Drive Upload
   - Saves the screenshot to a specific Google Drive folder.
   - Uses the Title column as the filename.

Setup Guide
1. Connect Google Sheets
   - Ensure your Google Sheet has a column named Url for website URLs and Name for website names.
   - Set up Google Sheets credentials in n8n.
2. Configure CustomJS API
   - Sign up at CustomJS.
   - Retrieve your API key from the profile page.
   - Add your API key as n8n credentials.
3. Set Up Google Drive
   - Create a folder in Google Drive to store screenshots.
   - Copy the folder ID and set it in the Google Drive node in n8n.

Perfect for:
- **Website monitoring**
- **Generating visual archives of web pages**
- **Automating content curation**

This workflow streamlines the process of capturing and organizing website screenshots.
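If you want more control over the Drive filename than the raw Title column gives you, an optional Code node between the trigger and the upload can sanitize it. Below is a minimal sketch for n8n's Code node ("Run Once for All Items" mode); the Title/Url field names and the .png extension are assumptions based on the sheet layout described above, not part of the original template.

```typescript
// Derive a safe Drive filename from each row's Title, falling back to the URL host.
const items = $input.all();
for (const item of items) {
  const title = String(item.json.Title ?? "").trim();
  const url = String(item.json.Url ?? "").trim();
  let fallback = "screenshot";
  try {
    fallback = new URL(url).hostname; // e.g. "example.com"
  } catch {
    // keep the generic fallback if the URL is missing or malformed
  }
  const base = title || fallback;
  // Remove characters that are awkward in filenames and cap the length.
  item.json.fileName = base.replace(/[^\w\- ]+/g, "").slice(0, 100) + ".png";
}
return items;
```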
by Kanaka Kishore Kandregula
Boost Sales with Automated Magento 2 Product and Coupon Notifications

This n8n workflow automatically posts new Magento products & coupons to Telegram while preventing duplicates.

Key benefits:
✅ Increase conversions with time-sensitive alerts (creates urgency)
✅ Reduce missed opportunities with 24/7 monitoring
✅ Improve customer engagement through rich media posts
✅ Save hours per week by automating manual posting

Why This Works:
- Triggers impulse buys with real-time notifications
- Eliminates human error in duplicate posting
- Scales effortlessly as your catalog grows
- Provides analytics through database tracking

Perfect for e-commerce stores wanting to:
- Announce new arrivals instantly
- Promote limited-time offers effectively
- Maintain a consistent social presence
- Track performance through MySQL

This workflow automatically:
✅ Detects new products AND coupons in Magento
✅ Prevents duplicate postings with MySQL tracking
✅ Posts rich formatted alerts to Telegram
✅ Runs on a customizable schedule

✨ Key Features

For Products:
- Product name, price, and image
- Direct store link
- Media gallery support

For Coupons:
- Coupon code and status
- Usage limits (times used/available)
- Active/inactive status indicator

Core System:
- 🔒 MySQL duplicate prevention
- ⏰ 1-hour schedule (customizable)
- 📱 Telegram notifications with Markdown

🛠️ Configuration Guide

Database Setup

CREATE TABLE IF NOT EXISTS posted_items (
  item_id INT PRIMARY KEY,
  item_type ENUM('product', 'coupon') NOT NULL,
  item_value VARCHAR(255),
  posted BOOLEAN DEFAULT FALSE,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
  updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP
);

Required Credentials
- Magento API (HTTP Header Auth)
- MySQL Database
- Telegram Bot

Sticky Notes

❗ IMPORTANT SETUP NOTES ❗
- For products: ensure 'url_key' exists in custom_attributes
- For coupons: the Magento REST API must expose coupon rules
- The MySQL user needs INSERT/SELECT privileges
- The Telegram bot must be added to your channel first

🔄 SCHEDULING:
- Default: checks every hour at :00
- Adjust in the Schedule Trigger node

⚙️ Technical Details

Workflow Logic:
1. Checks for new products/coupons via the Magento API
2. Verifies against the MySQL database
3. Only posts if the record doesn't exist
4. Updates the database after a successful post

Error Handling:
- Automatic skip if the product/coupon already exists
- Empty result handling
- Connection timeout protection

🌟 Why This Template?
- **Complete Solution**: Handles both products AND coupons
- **Battle-Tested**: Prevents duplicates reliably
- **Ready-to-Use**: Just add your credentials
- **Fully Customizable**: Easy to modify for different needs

Perfect for e-commerce stores using Magento 2 that want automated, duplicate-free social notifications!
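The duplicate prevention described above boils down to a lookup in posted_items before posting and an insert afterwards. Here is a hedged TypeScript sketch of that logic using the mysql2 driver; the driver choice, function names, and exact query shape are illustrative assumptions, since the workflow itself performs these steps with n8n's MySQL node.

```typescript
import mysql from "mysql2/promise";

// Sketch of the duplicate-prevention logic around each Telegram post.
// Table and column names follow the CREATE TABLE statement above.
async function postIfNew(
  conn: mysql.Connection,
  itemId: number,
  itemType: "product" | "coupon",
  itemValue: string,
  sendToTelegram: () => Promise<void>
): Promise<boolean> {
  // 1. Skip anything that was already posted.
  const [rows] = await conn.execute(
    "SELECT item_id FROM posted_items WHERE item_id = ? AND item_type = ?",
    [itemId, itemType]
  );
  if ((rows as unknown[]).length > 0) return false;

  // 2. Post to Telegram, then record the item so the next run skips it.
  await sendToTelegram();
  await conn.execute(
    "INSERT INTO posted_items (item_id, item_type, item_value, posted) VALUES (?, ?, ?, TRUE)",
    [itemId, itemType, itemValue]
  );
  return true;
}
```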
by PUQcloud
Setting up the n8n workflow

Overview
The Docker n8n WHMCS module uses a specially designed workflow for n8n to automate deployment processes. The workflow provides an API interface for the module, receives specific commands, and connects via SSH to a server with Docker installed to perform predefined actions.

Prerequisites
You must have your own n8n server. Alternatively, you can use the official n8n cloud installations available at: n8n Official Site

Installation Steps

Install the Required Workflow on n8n
You have two options:
- Option 1: Use the Latest Version from the n8n Marketplace. The latest workflow templates for our modules are available on the official n8n marketplace. Visit our profile to access all available templates: PUQcloud on n8n
- Option 2: Manual Installation. Each module version comes with a workflow template file. You need to manually import this template into your n8n server.

n8n Workflow API Backend Setup for WHMCS/WISECP

Configure API Webhook and SSH Access
- Create a Basic Auth credential for the Webhook API block in n8n.
- Create an SSH credential for accessing a server with Docker installed.

Modify Template Parameters
In the Parameters block of the template, update the following settings:
- server_domain – must match the domain of the WHMCS/WISECP Docker server.
- clients_dir – directory where user data related to Docker and disks will be stored.
- mount_dir – default mount point for the container disk (recommended not to change).

Do not modify the following technical parameters:
- screen_left
- screen_right

Deploy-docker-compose
In the Deploy-docker-compose element, you have the ability to modify the Docker Compose configuration, which will be generated in the following scenarios:
- When the service is created
- When the service is unlocked
- When the service is updated

nginx
In the nginx element, you can modify the configuration parameters of the web interface proxy server.
- The main section allows you to add custom parameters to the server block in the proxy server configuration file.
- The main_location section contains settings that will be added to the location / block of the proxy server configuration. Here, you can define custom headers and other parameters specific to the root location.

Bash Scripts
Management of Docker containers and all related procedures on the server is carried out by executing Bash scripts generated in n8n. These scripts return either a JSON response or a string. All scripts are located in elements directly connected to the SSH element. You have full control over any script and can modify or execute it as needed.
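Because the Bash scripts return either JSON or a plain string, any step that consumes the SSH output has to handle both shapes. The following is a minimal TypeScript sketch of that normalization; the idea of a dedicated parsing step after the SSH element, and the field names used, are illustrative assumptions rather than part of the module itself.

```typescript
// Normalize the output of an SSH-executed Bash script: some scripts return
// JSON, others return a plain status string, as described above.
interface ScriptResult {
  ok: boolean;
  data?: unknown;   // parsed JSON payload, when the script returned JSON
  message?: string; // raw string output otherwise
}

function parseScriptOutput(stdout: string): ScriptResult {
  const trimmed = stdout.trim();
  try {
    // Scripts that report structured results emit a single JSON document.
    return { ok: true, data: JSON.parse(trimmed) };
  } catch {
    // Fall back to treating the output as a plain status/message string.
    return { ok: trimmed.length > 0, message: trimmed };
  }
}

// Example: parseScriptOutput('{"status":"created","container":"web1"}')
// Example: parseScriptOutput('container already exists')
```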
by Budi SJ
Automated Financial Reporting Using Google Vision OCR, Telegram & Google Sheets

This workflow automates the recording of financial transactions from photos of receipts or shopping receipts. Users simply send an image of the receipt via Telegram. The image is processed with the Google Vision API to detect text, which is then extracted and structured by an LLM via OpenRouter. The final result is saved to Google Sheets and also sent back to the user via the Telegram bot.

🧾 Google Sheets Template
Create a Google Sheet using this template: Financial Reporting

🛠️ Key Features
- The workflow starts when a user sends a photo of a receipt to the Telegram bot.
- The image is converted to text using the Google Vision API's OCR.
- Data processing with an LLM (OpenRouter) identifies and structures transaction elements such as: date, vendor name & address, receipt/invoice number, item list (product name, quantity, unit price, total), and transaction category.
- Cleaned and structured data is automatically recorded to Google Sheets, one row per item.
- The system also sends a summary of the recorded results in an easy-to-read text format.
- Users can also send text messages to the bot to query stored transaction data, which is answered by a Google Sheets-based AI Agent.

🔧 Requirements
- Active Telegram Bot + API token
- Google Vision API key
- OpenRouter account + API key
- Google Sheets connected to n8n

🧩 Setup Instructions
1. Replace all API keys and tokens with your own in the relevant nodes.
   - Google Vision API key: set in the 'Set Vision API' node.
   - Telegram Bot token: set in the 'Set Telegram Token' node and all Telegram nodes.
   - OpenRouter API key: set in all OpenRouter nodes.
   - Google Sheets: connect your own Google Sheets credential.
2. Use the provided Google Sheets template or your own.
3. Activate the workflow after configuration.
4. (Optional) Review the sticky notes for step-by-step explanations.
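The OCR step described above is a single call to the Vision API's images:annotate endpoint with a TEXT_DETECTION feature. A hedged TypeScript sketch of roughly what the workflow's HTTP Request node sends follows; the API-key query parameter and base64 image input are the standard form for this endpoint, but the exact node configuration may differ.

```typescript
// Minimal sketch of the Google Vision OCR call that turns a receipt photo into text.
async function ocrReceipt(imageBase64: string, apiKey: string): Promise<string> {
  const resp = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        requests: [
          {
            image: { content: imageBase64 },         // base64 image from Telegram
            features: [{ type: "TEXT_DETECTION" }],  // OCR feature
          },
        ],
      }),
    }
  );
  const data = await resp.json();
  // fullTextAnnotation.text holds the whole detected text block.
  return data.responses?.[0]?.fullTextAnnotation?.text ?? "";
}
```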
by Polina Medvedieva
This workflow automates the process of discovering and extracting APIs from various services, followed by generating custom schemas. It works in three distinct stages – research, extraction, and schema generation – with each stage tracking progress in a Google Sheet.

🙏 Jim Le deserves major kudos for helping to build this sophisticated three-stage workflow that cleverly automates API documentation processing using a smart combination of web scraping, vector search, and LLM technologies.

How it works

Stage 1 - Research:
- Fetches pending services from a Google Sheet
- Uses Google search to find API documentation
- Employs Apify for web scraping to filter relevant pages
- Stores webpage contents and metadata in Qdrant (vector database)
- Updates progress status in the Google Sheet (pending, ok, or error)

Stage 2 - Extraction:
- Processes services that completed research successfully
- Queries the vector store to identify products and offerings
- Further queries for relevant API documentation
- Uses Gemini (LLM) to extract API operations
- Records extracted operations in the Google Sheet
- Updates progress status (pending, ok, or error)

Stage 3 - Generation:
- Takes services with successful extraction
- Retrieves all API operations from the database
- Combines and groups operations into a custom schema
- Uploads the final schema to Google Drive
- Updates the final status in the sheet with the file location

Ideal for:
- Development teams needing to catalog multiple APIs
- API documentation initiatives
- Creating standardized API schema collections
- Automating API discovery and documentation

Accounts required:
- Google account (for Sheets and Drive access)
- Apify account (for web scraping)
- Qdrant database
- Gemini API access

Set up instructions:
- Prepare your Google Sheets document with the services information. Here's an example of a Google Sheet – you can copy it and change or remove the values under the columns. Also, make sure to update the Google Sheets nodes with the correct Google Sheet ID.
- Configure Google Sheets OAuth2 credentials, the required third-party services (Apify, Qdrant) and Gemini.
- Ensure proper permissions for Google Drive access.
by Nick Saraev
AI Ad Scraper & Image Generator with Facebook Ad Library

Categories: PPC Automation, Creative Generation, Competitive Intelligence

This workflow creates an end-to-end ad library scraper and AI image spinner system that automatically discovers competitor ads, analyzes their design elements, and generates multiple unique variations ready for your own campaigns. Built to eliminate 60-70% of manual creative work for PPC agencies, this system transforms competitor research into actionable ad variants in minutes.

Benefits
- **Automated Competitor Research** - Scrapes the Facebook Ad Library for active competitor campaigns automatically
- **AI-Powered Creative Analysis** - Uses OpenAI vision to comprehensively analyze ad design elements and copy
- **Intelligent Image Generation** - Creates 3+ unique variations per source ad while maintaining effective layouts
- **Complete Asset Organization** - Automatically organizes source ads and generated variations in structured Google Drive folders
- **Campaign-Ready Output** - Generates a Google Sheets database with direct links to all assets for immediate campaign deployment
- **Massive Time Savings** - Replaces hours of manual creative work with automated competitive intelligence and generation

How It Works

Facebook Ad Library Scraping:
- Connects to Facebook's Ad Library through an Apify scraper integration
- Searches active ads based on keywords, industries, or competitor targeting
- Filters for image-based ads and removes video-only content for processing

Intelligent Asset Organization:
- Creates a unique Google Drive folder structure for each scraped ad campaign
- Separates source competitor ads from AI-generated variations
- Maintains an organized asset library for easy campaign management and iteration

AI-Powered Creative Analysis (see the sketch after this section):
- Uses OpenAI's vision model to comprehensively describe each competitor ad
- Identifies design elements, color schemes, layout patterns, and messaging approaches
- Generates detailed creative briefs for intelligent variation generation

Smart Image Variation System:
- Creates 3 unique style variations per source ad using advanced AI prompting
- Maintains effective layout structures while changing colors, fonts, and styling
- Customizes messaging and branding to match your business requirements

Campaign Database Integration:
- Logs all source ads and generated variations in organized Google Sheets
- Provides direct links to all assets for immediate campaign deployment
- Tracks performance data and creative iterations for ongoing optimization

Required Setup Configuration

Google Drive Structure: The workflow automatically creates this folder organization:

PPC Thievery (Parent Folder)
├── [Ad Archive ID] (Per Campaign)
│   ├── 1. Source Assets (Original competitor ads)
│   └── 2. Spun Assets (AI-generated variations)

Google Sheets Database Columns:
- timestamp - Unique record identifier
- ad_archive_id - Facebook's internal ad identifier
- page_id - Advertiser's Facebook page ID
- original_image_url - Direct link to the source competitor ad
- page_name - Advertiser's business name
- ad_body - Original ad copy text
- date_scraped - When the ad was discovered
- spun_prompts - AI-generated variation instructions
- asset_folder - Link to the campaign's Google Drive folder
- source_folder - Link to the original ads folder
- spun_folder - Link to the generated variations folder
- direct_spun_image_link - Direct link to the generated ad image

Set Variables Configuration: Update these values in the "Set Variables" node:
- googleDriveFolderId - Your parent Google Drive folder ID
- changeRequest - Your brand-specific variation instructions
- spreadsheetId - Your Google Sheets database ID

Apify API Setup:
- Create an Apify account and obtain an API key
- Replace <your-apify-api-key-here> with actual credentials
- Customize search terms in the JSON body for your target competitors
- Adjust the scraping count (default: 20 ads per run)

Business Use Cases
- **PPC Agencies** - Automate competitive research and creative generation for client campaigns
- **E-commerce Brands** - Monitor competitor advertising strategies and generate response campaigns
- **Marketing Teams** - Scale creative production with AI-powered competitive intelligence
- **Freelance Marketers** - Offer advanced competitive analysis and creative services to clients
- **SaaS Companies** - Track competitor messaging and generate differentiated ad variations
- **Agency Teams** - Replace manual creative research with automated competitive intelligence systems

Revenue Potential
This system revolutionizes PPC agency economics:
- **60-70% reduction** in manual creative work and competitive research time
- **3-5x faster** campaign launch times with ready-to-use creative assets
- **$2,000-$5,000 service value** for comprehensive competitive intelligence and creative generation
- **Scalable competitive advantage** through automated monitoring of competitor campaigns
- **Premium positioning** offering AI-powered creative intelligence that competitors can't match manually

Difficulty Level: Advanced
Estimated Build Time: 2-3 hours
Monthly Operating Cost: ~$100 (Apify + OpenAI + Google APIs)

Watch My Complete Live Build
Want to see me build this entire system from scratch? I walk through every component live - including the ad library integration, AI analysis setup, image generation pipeline, and all the debugging that goes into creating a production-ready competitive intelligence system.

🎥 See My Live Build Process: "Ad Library Scraper & AI Image Spinner System (N8N Build)"

This comprehensive tutorial shows the real development process - including advanced AI prompting for image generation, competitive analysis strategies, and the organizational systems that make this scalable for agency use.
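The creative-analysis step referenced above is essentially one OpenAI vision call per scraped ad. A hedged TypeScript sketch of what that request looks like follows; the model name (gpt-4o) and the exact prompt text are assumptions, and in the workflow itself the OpenAI node handles this call.

```typescript
// Sketch of the "describe this competitor ad" vision call used in the analysis stage.
async function describeAd(imageUrl: string, openAiKey: string): Promise<string> {
  const resp = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${openAiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o", // assumption: any vision-capable model works here
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text: "Describe this ad's layout, colors, fonts, imagery, and copy in detail.",
            },
            { type: "image_url", image_url: { url: imageUrl } },
          ],
        },
      ],
    }),
  });
  const data = await resp.json();
  return data.choices?.[0]?.message?.content ?? "";
}
```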
Set Up Steps

Initial Database Setup:
- Run the initialization flow once to create your Google Drive folder and Sheets database
- Copy the generated folder ID and spreadsheet ID into the "Set Variables" node
- Configure your brand-specific change request template for consistent output

Apify Integration:
- Set up an Apify account with Facebook Ad Library scraper access
- Configure API credentials and test with small ad batches
- Customize search parameters for your target competitors and industries

AI Service Configuration:
- Connect the OpenAI API for vision analysis and image generation
- Set up appropriate rate limiting to control processing costs
- Test the complete AI pipeline with sample competitor ads

Google Services Setup:
- Configure Google Drive API credentials for automated folder creation
- Set up Google Sheets integration for campaign database management
- Test the complete asset organization and tracking workflow

Campaign Customization:
- Define your brand guidelines and messaging requirements in the change request
- Set up variation templates for different campaign types and industries
- Configure batch processing limits based on your API usage requirements

Production Optimization:
- Remove the limit node for full-scale competitive monitoring
- Set up automated scheduling for regular competitive intelligence gathering
- Monitor and optimize AI prompts based on generated creative quality

Advanced Optimizations
Scale the system with:
- **Multi-Platform Scraping:** Extend to LinkedIn, Twitter, and Google Ads for comprehensive competitive intelligence
- **Performance Tracking:** Integrate with ad platforms to track the performance of generated variations
- **Style Guide Automation:** Create industry-specific variation templates for consistent brand application
- **A/B Testing Integration:** Automatically test generated variations against source ads for performance optimization
- **CRM Integration:** Connect competitive intelligence data with sales and marketing systems

Important Considerations
- **API Rate Limits:** Built-in delays prevent service overload and ensure reliable operation
- **Creative Quality:** The system generates multiple variations to account for AI generation variability
- **Legal Compliance:** Use generated variations as inspiration while respecting intellectual property rights
- **Cost Management:** Monitor OpenAI image generation costs and adjust batch sizes accordingly
- **Competitive Ethics:** Focus on learning from successful patterns rather than direct copying

Why This System Works
The competitive advantage lies in speed and scale:
- **Minutes vs. Hours:** Generate campaign-ready creative variations in minutes instead of hours of manual work
- **Systematic Analysis:** AI vision provides consistent, comprehensive analysis that humans might miss
- **Organized Intelligence:** Structured asset management enables rapid campaign deployment and iteration
- **Scalable Monitoring:** Automated competitive research that scales beyond manual capacity
- **Quality Variations:** Multiple AI-generated options ensure high-quality creative output

Check Out My Channel
For more advanced automation systems and proven agency-building strategies that generate real revenue, explore my YouTube channel where I share the exact methodologies used to scale automation agencies to $72K+ monthly revenue.
by Ranjan Dailata
Notice
Community nodes can only be installed on self-hosted instances of n8n.

Who this is for
This n8n-powered automation uses Bright Data's MCP Client to extract real-time data from a price-drop site listing Amazon products, including price changes and related product details. The extracted data is enriched with structured data transformation, content summarization, and sentiment analysis using the Google Gemini LLM.

The Amazon Price Drop Intelligence Engine is designed for:
- **Ecommerce Analysts** who need timely updates on competitor pricing trends
- **Brand Managers** seeking to understand consumer sentiment around pricing
- **Data Scientists** building pricing models or enrichment pipelines
- **Affiliate Marketers** looking to optimize campaigns based on dynamic pricing
- **AI Developers** automating product intelligence pipelines

What problem is this workflow solving?
This workflow solves several key pain points:
- Reliable Scraping: Uses Bright Data MCP, a managed crawling platform that handles proxies, captchas, and site structure changes automatically.
- Insight Generation: Transforms unstructured HTML into structured data and then into human-readable summaries using the Google Gemini LLM.
- Sentiment Context: Goes beyond raw pricing data to reveal how customers feel about the price change, helping businesses and researchers measure consumer reaction.
- Automated Reporting: Aggregates and stores data for easy access and downstream automation (e.g., dashboards, notifications, pricing models).

What this workflow does

Scrape the price-drop site with Bright Data MCP
The workflow begins by scraping the targeted price-drop site for Amazon listings using Bright Data's Model Context Protocol (MCP). You can configure this to target:

Structured Data Extraction
Once the HTML content is retrieved, Google Gemini is employed to parse and structure the product information (title, price, discount, brand, ratings).

Summarization & Sentiment Analysis
The extracted data is passed through an LLM chain to:
- Generate a concise summary of the product and its recent price movement
- Perform sentiment analysis on user reviews and public perception

Store the Results
- Save to disk for archiving or bulk processing
- Update a Google Sheet, making the data instantly shareable with your team or integrable into a BI dashboard

Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post - model-context-protocol
- You need a Bright Data account and must complete the steps in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install the n8n-nodes-mcp community node

Setup
- Make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp.
- Make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Create a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
- In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server as shown below. Make sure to copy the Bright Data API_TOKEN into the Environments textbox above as API_TOKEN=<your-token>

How to customize this workflow to your needs
- **Target different platforms**: Swap Amazon for Walmart, eBay, or any ecommerce source using Bright Data's flexible scraping infrastructure.
- **Enrich with more LLM tasks**: Add brand tone analysis, category classification, or competitive benchmarking using Gemini prompts.
- **Visualize output**: Pipe the Google Sheet to Looker Studio, Tableau, or Power BI.
- **Notification integrations**: Add Slack, Discord, or email notifications for price drop alerts.
by Gede Suparsa
This template demonstrates how to provide an interactive chatbot for your work history based on your CV. Unanswered questions and follow-up email contacts are sent to you via Telegram.

Use case: link to it on your profile to not only show off your AI workflow skills but also to provide an interactive chatbot about your work history for prospective employers.

Good to Know
It requires access to an OpenAI API key (free for low usage) and setting up a bot in Telegram (free).

How it Works
- The n8n built-in chat node is hosted on n8n services to provide the chat interface.
- You upload your CV (either exported from LinkedIn or exported yourself) to Microsoft OneDrive, along with a simple text file explaining some other information about you.
- On each chat interaction, the PDF and text file are used as tools to provide context for the chatbot's response.
- If a question cannot be answered reliably, a sub-workflow is called to capture that question and send it to you as a Telegram message.
- If the person chatting supplies their email address, it is sent to you via a Telegram message along with any other information the user provides.

How to use
- After importing the template, create the sub-workflows so that they can be used as Tools by the AI node. Don't forget to add the Execute Sub-workflow trigger.
- Set up credentials for OpenAI, OneDrive and Telegram.
- Upload your CV & text-file summary to OneDrive and replace the document IDs in the get_documents sub-workflow.
- Activate the workflow so that the publicly available chat is generated on n8n.
by pavith
📄 Description
This automation workflow enables users to upload files via an n8n form, automatically analyzes the content using Google Gemini agents, and delivers the analyzed results via email along with a chatbot link. The system leverages the Llama Cloud API, Google Gemini LLM, the Pinecone vector database, and Gmail to provide a seamless, multilingual content-analysis experience.

✅ Prerequisites
Before setting up this workflow, ensure the following are in place:
- An active n8n instance.
- Access to the Llama Cloud API.
- Google Gemini LLM API keys (for the Translator & Analyzer agents).
- A Pinecone account with an active index.
- A Gmail account with API access configured.
- Basic knowledge of n8n workflow setup.

⚙️ Setup Instructions
1. Deploy the n8n Form
   - Create a public-facing form using n8n.
   - Configure it to accept file uploads and user email input.
2. File Preprocessing
   - Store the uploaded files temporarily.
   - Organize and preprocess them as needed.
3. Content Extraction using the Llama Cloud API
   - Feed the files into the Llama Cloud API.
   - Extract and parse the content for further processing.
4. Translation (if required)
   - Use a Translator Agent (Google Gemini).
   - Check if the content is in English. If not, translate it (see the sketch after this list).
5. Content Analysis
   - Forward the (translated) content to the Analyzer Agent (Google Gemini).
   - Perform deep analysis to extract insights.
6. Vector Storage in Pinecone
   - Store both the parsed/translated content and the analyzed content.
   - Use Pinecone to store the content as embeddings for chatbot use.
7. User Notification via Gmail
   - Send the analyzed content and chatbot link to the user's provided email using the Gmail API.

🧩 Customization Guidance
- To add more languages: update the translation logic to include additional language support.
- To modify analysis depth: adjust the prompts sent to the Gemini Analyzer Agent.
- To change the chatbot behavior: retrain or reconfigure the chatbot to use the new Pinecone index contextually.

🔁 Workflow Summary
1. User uploads files and email via the n8n form.
2. Files are parsed using the Llama Cloud API.
3. Content is translated (if needed) using the Gemini Translator Agent.
4. Translated content is analyzed by the Gemini Analyzer Agent.
5. Parsed and analyzed data is stored in Pinecone.
6. User receives an email with the analyzed results and a chatbot link.
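The "translate only if not English" branch in step 4 can be expressed as a simple routing decision. Below is a hedged TypeScript sketch of that decision; the detectedLanguage field is a hypothetical output of an upstream language-detection prompt, and the real workflow may implement the same branch with an IF node instead of code.

```typescript
// Sketch of the "translate only if not English" branch described in step 4.
interface ExtractedDoc {
  text: string;
  detectedLanguage: string; // hypothetical field, e.g. "en", "id", "fr"
}

function needsTranslation(doc: ExtractedDoc): boolean {
  return doc.detectedLanguage.toLowerCase() !== "en";
}

function routeDocument(doc: ExtractedDoc): "translator-agent" | "analyzer-agent" {
  // Non-English content goes through the Gemini Translator Agent first;
  // English content is passed straight to the Analyzer Agent.
  return needsTranslation(doc) ? "translator-agent" : "analyzer-agent";
}

// Example: routeDocument({ text: "Laporan keuangan...", detectedLanguage: "id" })
// returns "translator-agent".
```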
by Garri
Description
This workflow automates security reputation checks of domains and IP addresses using multiple APIs, including VirusTotal, AbuseIPDB, and Google DNS. It assesses potential threats, including malicious and suspicious scores, as well as email security configurations (SPF, DKIM, DMARC). The analysis results are processed by AI to produce a concise assessment, then automatically written back to Google Sheets for documentation and follow-up.

How It Works
1. Automatic Trigger – The workflow runs periodically via a Schedule Trigger.
2. Data Retrieval – Fetches a list of domains from Google Sheets with status "To do".
3. Domain Analysis – Uses the VirusTotal API to get the domain report, perform a rescan, and check IP resolutions.
4. IP Analysis – Checks IP reputation using AbuseIPDB.
5. Email Security Validation – Verifies SPF, DKIM, and DMARC configurations via Google DNS.
6. AI Assessment – Analysis data is processed by AI to produce a short summary in Indonesian.
7. Data Update – The results are automatically written to Google Sheets, changing the status to "Done" or adding notes if potential threats are found.

How to Setup
1. Prepare API Keys
   - Sign up and obtain API keys from VirusTotal and AbuseIPDB.
   - Set up access to the Google Sheets API.
2. Configure Credentials in n8n
   - Add VirusTotal API, AbuseIPDB API, and Google Sheets OAuth credentials in n8n.
3. Prepare Google Sheets
   - Create a sheet with the columns No, Domain, Customer, Keterangan, Status.
   - Ensure the initial data has the status "To do".
4. Import Workflow
   - Upload the workflow JSON file into n8n.
5. Set Schedule Trigger
   - Define the checking interval as needed (e.g., every 1 hour).
6. Test Run
   - Run the workflow manually to ensure all API connections and the Google Sheets output work properly.
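The SPF and DMARC part of step 5 can be done with plain TXT lookups against Google's DNS-over-HTTPS resolver. A minimal TypeScript sketch follows; DKIM is left out here because DKIM records live under a selector (such as default._domainkey) that the workflow would need to know, so treat this as an illustrative subset rather than the full check.

```typescript
// Check SPF and DMARC records for a domain via Google DNS-over-HTTPS.
async function lookupTxt(name: string): Promise<string[]> {
  const resp = await fetch(
    `https://dns.google/resolve?name=${encodeURIComponent(name)}&type=TXT`
  );
  const data = await resp.json();
  // The Answer array is absent when the record does not exist.
  return (data.Answer ?? []).map((a: { data: string }) => a.data.replace(/"/g, ""));
}

async function checkEmailSecurity(domain: string) {
  const spf = (await lookupTxt(domain)).filter((r) => r.startsWith("v=spf1"));
  const dmarc = (await lookupTxt(`_dmarc.${domain}`)).filter((r) =>
    r.startsWith("v=DMARC1")
  );
  return {
    spfConfigured: spf.length > 0,
    dmarcConfigured: dmarc.length > 0,
    records: { spf, dmarc },
  };
}

// Example: await checkEmailSecurity("example.com")
// → { spfConfigured: true, dmarcConfigured: true, records: { ... } }
```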
by Cyril Nicko Gaspar
📌 AI Agent via GoHighLevel SMS with Website-Based Knowledgebase

This n8n workflow enables an AI agent to interact with users through GoHighLevel SMS, leveraging a knowledgebase dynamically built by scraping the company's website.

❓ Problem It Solves
Traditional customer support systems often require manual data entry and lack real-time updates from the company's website. This workflow automates the process by:
- Scraping the company's website at set intervals to update the knowledgebase.
- Integrating with GoHighLevel SMS to provide users with timely and accurate information.
- Utilizing AI to interpret user queries and fetch relevant information from the updated knowledgebase.

🧰 Pre-requisites
Before deploying this workflow, ensure you have:
- An active n8n instance (self-hosted or cloud).
- A valid OpenAI API key (or any compatible AI model).
- A Bright Data account with Web Unlocker set up.
- A GoHighLevel SMS LeadConnector account.
- A GoHighLevel Marketplace App configured with the necessary scopes.
- The n8n-nodes-brightdata community node installed for Bright Data integration (if self-hosted).

⚙️ Setup Instructions

1. Install the Bright Data Community Node in n8n
For self-hosted n8n instances:
- Navigate to Settings → Community Nodes.
- Click Install.
- In the search bar, enter n8n-nodes-brightdata.
- Select the node from the list and click Install.
Docs: https://docs.n8n.io/integrations/community-nodes/installation/gui-install

2. Configure Bright Data Credentials
- Obtain your API key from Bright Data.
- In n8n, go to Credentials → New, select HTTP Request.
- Set authentication to Header Auth.
- In Name, enter Authorization.
- In Value, enter Bearer <your_api_key_from_Bright_Data>.
- Save the credentials.

3. Configure OpenAI Credentials
- Add your OpenAI API key to the relevant nodes.
- If you want to use a different model, replace all OpenAI nodes accordingly.

4. Set Up GoHighLevel Integration

a. Create a GoHighLevel Marketplace App
- Go to https://marketplace.gohighlevel.com
- Click My Apps → Create App
- Set Distribution Type to Sub-Account
- Add the following scopes: locations.readonly, contacts.readonly, contacts.write, opportunities.readonly, opportunities.write, users.readonly, conversations/message.readonly, conversations/message.write
- Add your n8n OAuth Redirect URL as a redirect URI in the app settings.
- Save and copy the Client ID and Client Secret.

b. Configure GoHighLevel Credentials in n8n
- Go to Credentials → New
- Choose OAuth2 API
- Input:
  - Client ID
  - Client Secret
  - Authorization URL: https://auth.gohighlevel.com/oauth/authorize
  - Access Token URL: https://auth.gohighlevel.com/oauth/token
  - Scopes: locations.readonly contacts.readonly contacts.write opportunities.readonly opportunities.write users.readonly conversations/message.readonly conversations/message.write
- Save and authenticate to complete setup.
Docs: https://docs.n8n.io/integrations/builtin/credentials/highlevel

🔄 Workflow Functionality (Summary)
- **Scheduled Scraping**: Scrapes the website at user-defined intervals.
- **Edit Fields node**: The user defines the homepage or site to scrape.
- **Bright Data node** (self-hosted) or **HTTP Request node** (cloud users) is used to perform the scraping.
- **Knowledgebase Update**: The scraped content is stored or indexed.
- **GoHighLevel SMS**: Incoming user queries are received through SMS.
- **AI Processing**: AI matches queries to relevant content.
- **Response Delivery**: AI-generated answers are sent back via SMS.

🧩 Use Cases
- **Customer Support Automation**: Provide instant, accurate responses.
- **Lead Qualification**: Automatically answer potential customer inquiries.
- **Internal Knowledge Distribution**: Keep staff updated via SMS based on website info.

🛠️ Customization
- **Scraping URLs**: Adjust targets in the Edit Fields node.
- **Model Swap**: Replace OpenAI nodes to use a different LLM.
- **Format Response**: Customize output to match your tone or brand.
- **Other Channels**: Expand to include chat apps or email responses.
- **Vector Databases**: It is advisable to store the data in a third-party vector database service such as Pinecone or Supabase.
- **Chat Memory Node**: This workflow uses Redis as chat memory, but you can use n8n's built-in chat memory.

✅ Summary
This n8n workflow combines Bright Data's scraping tools and GoHighLevel's SMS interface with AI query handling to deliver a real-time, conversational support experience. Ideal for businesses that want to turn their website into a live knowledge source via SMS, this agent keeps itself updated, smart, and customer-ready.
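For context on what the OAuth2 credential in step 4b does behind the scenes, here is a hedged TypeScript sketch of a standard authorization-code exchange against the token URL listed above. You normally never call this yourself: clicking "Connect" on the n8n credential performs it, and GoHighLevel may expect additional fields beyond the standard parameters shown here.

```typescript
// Illustration only: standard OAuth2 authorization-code exchange against the
// GoHighLevel token endpoint from the setup instructions above.
async function exchangeAuthCode(
  clientId: string,
  clientSecret: string,
  code: string,
  redirectUri: string
) {
  const resp = await fetch("https://auth.gohighlevel.com/oauth/token", {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "authorization_code",
      client_id: clientId,
      client_secret: clientSecret,
      code,
      redirect_uri: redirectUri,
    }),
  });
  return resp.json(); // typically { access_token, refresh_token, expires_in, ... }
}
```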
by SuperAgent
Who is this template for?
This template is ideal for small businesses, agencies, and solo professionals who want to automate appointment scheduling and caller follow-up through a voice-based AI receptionist. If you're using tools like Google Calendar, Airtable, and Vapi (Twilio), this setup is for you.

What problem does this workflow solve?
Manual call handling, appointment booking, and email coordination can be time-consuming and prone to errors. This workflow solves that by automating the receptionist role: answering calls, checking calendar availability, managing appointments, and storing call summaries, all without human intervention.

What this workflow does
This Agent Receptionist manages inbound voice calls and scheduling tasks using Vapi and Google Calendar. It checks availability, books or updates calendar events, sends email confirmations, and logs call details into Airtable. The workflow includes built-in logic for slot management, email triggers, and storing call transcripts.

Setup Instructions
1. Duplicate Airtable Base: Use this Airtable base template (BASE LINK).
2. Import Workflow: Load the provided JSON into your n8n instance.
3. Credentials: Connect your Google Calendar and Airtable credentials in n8n.
4. Activate Workflow: Enable the workflow to get live webhook URLs.
5. Vapi Configuration: Paste the provided system prompt into the Vapi Assistant and link the appropriate webhook URLs from n8n (GetSlots, BookSlots, UpdateSlots, CancelSlots, and the end-of-call report).

Disclaimer
Optimized for cloud-hosted n8n instances. Self-hosted users should verify webhook and credential setups.
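The slot-management logic behind the GetSlots webhook is essentially subtracting busy calendar intervals from the business's working hours. Below is a hedged TypeScript sketch of that computation; the 30-minute slot length and the 9:00-17:00 window are assumptions, and in the workflow the busy intervals come from the Google Calendar lookup.

```typescript
// Compute free appointment slots for one day by removing busy intervals
// (as returned by a calendar lookup) from a fixed working-hours window.
interface Interval { start: Date; end: Date }

function freeSlots(day: Date, busy: Interval[], slotMinutes = 30): Interval[] {
  const open = new Date(day); open.setHours(9, 0, 0, 0);    // assumed opening time
  const close = new Date(day); close.setHours(17, 0, 0, 0); // assumed closing time
  const slots: Interval[] = [];
  for (let t = new Date(open); t < close; t = new Date(t.getTime() + slotMinutes * 60000)) {
    const end = new Date(t.getTime() + slotMinutes * 60000);
    if (end > close) break;
    // A slot is free only if it overlaps no busy interval.
    const overlaps = busy.some((b) => b.start < end && b.end > t);
    if (!overlaps) slots.push({ start: new Date(t), end });
  }
  return slots;
}

// Example: freeSlots(new Date("2024-05-01"), [
//   { start: new Date("2024-05-01T10:00:00"), end: new Date("2024-05-01T11:00:00") },
// ]) → every 30-minute slot between 09:00 and 17:00 except 10:00-11:00.
```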