by Joseph
(Image Generation → Hosting → Video Generation)

This workflow is designed for creators, automation enthusiasts, and indie hackers who want to generate image-based videos automatically using AI tools — at a low cost.

⚙️ Workflow Overview

This automation performs the following steps:
1. Trigger (schedule or manual)
2. Generate an image using Flux (choose between two APIs)
3. Upload the image to Kraken.io to get a public URL
4. Send the image to Runway ML (choose between two APIs) to generate a video
5. Receive the video as a URL — ready for posting, download, or further automation

🛠️ Step-by-Step Setup

🖼️ Flux (Image Generation)

You can use either of the following providers:

Option 1: Flux by Black Forest Labs (Direct API)
- 🔑 Get your API key here: https://docs.bfl.ml/
- Paste your API key in the HTTP Request node named Flux (Blackforest)
- You can customize prompts or styles inside the JSON body

Option 2: Flux via RapidAPI
- 🔑 Subscribe and get your key here: https://rapidapi.com/poorav925/api/ai-text-to-image-generator-flux-free-api/playground/apiendpoint_e38039ee-1912-4ef9-b4d4-270d72fca851
- Enter your RapidAPI key in the X-RapidAPI-Key header
- Optional: tweak prompts, style, or resolution inside the JSON body

🐙 Kraken.io (Hosting the Image Publicly)

Runway ML requires the image to be publicly accessible, so we use Kraken.io to host the generated image and return a public URL.

🔑 Get your API credentials: https://kraken.io/account/api-credentials

Setup:
1. Copy your API Key and API Secret
2. Open the Kraken Upload node in n8n
3. Replace the placeholders with your credentials

The node uploads your image and returns a public image URL for Runway to use.

🎬 Runway ML (Video Generation)

You also have two options here:

Option 1: Runway Official API
- 🔑 Get your credentials at: https://dev.runwayml.com/
- Use the public image URL from Kraken in the JSON body (a request sketch follows at the end of this template)
- Paste your Bearer token in the Authorization header
- Customize other settings like video length, style, FPS, etc.

Option 2: Runway via RapidAPI
- 🔑 Subscribe and get your key here: https://rapidapi.com/fortunehoppers/api/runwayml/playground/apiendpoint_93c8554d-8097-40cd-8252-3d4dec9c0e68
- Paste your RapidAPI key in the request header
- Customize the prompt and generation options in the body
- Use the Kraken-generated image URL as the input source

📤 What to Do with the Video

Once the video is generated, you'll get a direct video URL. You can:
- Save it to Google Sheets or Notion
- Send it via email
- Trigger a YouTube upload automation
- Or download it manually for editing and reposting

💡 Optional Tips & Notes
- You can schedule this workflow to generate AI videos daily or weekly
- Combine it with a Google Sheet of prompts for bulk automation
- Try using a consistent visual style or theme for better branding
- This workflow is lightweight and affordable — perfect for indie projects or experimental content generation
- Great for shorts, quote visuals, music loops, AI art promos, etc.

🔗 Resources
- Flux (Blackforest) Docs
- Flux on RapidAPI
- RunwayML Official Docs
- Runway on RapidAPI
- Kraken.io API Dashboard

🙋 Need Help?

Feel free to reach out:
- 🐦 Twitter: @juppfy
- 📧 Email: joseph@uppfy.com

If you'd like to hire me for custom n8n workflows or product automations, don't hesitate to get in touch.
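For orientation, here's a hedged sketch of what the Runway official API call (Option 1 above) might look like as a raw request. The endpoint, version header, and body field names are assumptions based on Runway's public developer docs; verify them at https://dev.runwayml.com/ before copying anything into the HTTP Request node:

```bash
# Hypothetical image-to-video request; IMAGE_URL is the public URL returned by Kraken.io.
curl -X POST "https://api.dev.runwayml.com/v1/image_to_video" \
  -H "Authorization: Bearer $RUNWAY_API_KEY" \
  -H "X-Runway-Version: 2024-11-06" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gen3a_turbo",
    "promptImage": "'"$IMAGE_URL"'",
    "promptText": "slow cinematic camera pan across the scene"
  }'
```

The response contains a task ID that is polled until the finished video URL is available, which is the URL the workflow passes downstream.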
by Max aka Mosheh
How it works

The n8n flow grabs the needed IDs, fetches the current links, adds your new one, and sends a single HTTP request to NocoDB to update the record's linked entries (see the sketch after this description).

Set up steps

Plan for about 10 minutes of setup if you're already running n8n and NocoDB. You'll need to copy/paste table IDs, set up your HTTP node, and test once. No coding, just copy IDs.
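For reference, here is a hedged sketch of that single HTTP request. The v2 "links" endpoint shape is drawn from NocoDB's public API docs, but the host, IDs, and record numbers below are placeholders; copy the real values from your own instance:

```bash
# Hypothetical example: link record 42 of the related table to record 7.
# TABLE_ID and LINK_FIELD_ID are the IDs you copy/paste during setup.
curl -X POST "https://your-nocodb-host/api/v2/tables/$TABLE_ID/links/$LINK_FIELD_ID/records/7" \
  -H "xc-token: $NOCODB_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '[{"Id": 42}]'
```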
by scrapeless official
AI-Powered Web Data Pipeline with n8n

How It Works

This n8n workflow builds an AI-powered web data pipeline that automates the entire process of:
- Extraction
- Structuring
- Vectorization
- Storage

It integrates multiple advanced tools to transform messy web pages into clean, searchable vector databases.

Integrated Tools
- Scrapeless: Bypasses JavaScript-heavy websites and anti-bot protections to reliably extract HTML content.
- Claude AI: Uses LLMs to analyze unstructured HTML and generate clean, structured JSON data.
- Ollama Embeddings: Generates local vector embeddings from structured text using the all-minilm model.
- Qdrant Vector DB: Stores semantic vector data for fast and meaningful search capabilities.
- Webhook Notifications: Sends real-time updates when workflows complete or errors occur.

From messy webpages to structured vector data — this pipeline is perfect for building intelligent agents, knowledge bases, or research automation tools.

Setup Steps

1. Install n8n

> Requires Node.js v18 / v20 / v22

npm install -g n8n
n8n

After installation, access the n8n interface at http://localhost:5678

2. Set Up Scrapeless
- Register at Scrapeless
- Copy your API token
- Paste the token into the HTTP Request node labeled "Scrapeless Web Request"

3. Set Up Claude API (Anthropic)
- Sign up at the Anthropic Console
- Generate your Claude API key
- Add the API key to the following nodes: Claude Extractor, AI Data Checker, Claude AI Agent

4. Install and Run Ollama

macOS:
brew install ollama

Linux:
curl -fsSL https://ollama.com/install.sh | sh

Windows: download the installer from https://ollama.com

Start the Ollama server:
ollama serve

Pull the embedding model:
ollama pull all-minilm

5. Install Qdrant (via Docker)

docker pull qdrant/qdrant
docker run -d \
  --name qdrant-server \
  -p 6333:6333 -p 6334:6334 \
  -v $(pwd)/qdrant_storage:/qdrant/storage \
  qdrant/qdrant

Test that Qdrant is running:
curl http://localhost:6333/healthz

6. Configure the n8n Workflow
- Modify the trigger (manual or scheduled)
- Input your target URLs and collection name in the designated nodes (an optional collection-creation sketch follows at the end of this template)
- Paste all required API tokens/keys into their corresponding nodes
- Ensure your Qdrant and Ollama services are running

Ideal Use Cases
- Custom AI Chatbots
- Private Search Engines
- Research Tools
- Internal Knowledge Bases
- Content Monitoring Pipelines
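If you prefer to create the Qdrant collection up front instead of letting the workflow create it, a minimal sketch is below. The collection name "web_data" is a placeholder; the vector size of 384 matches the all-minilm embedding model, and the call uses Qdrant's standard collections REST API:

```bash
# Create a collection sized for all-minilm embeddings (384 dimensions, cosine distance).
curl -X PUT "http://localhost:6333/collections/web_data" \
  -H "Content-Type: application/json" \
  -d '{"vectors": {"size": 384, "distance": "Cosine"}}'
```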
by InfraNodus
This template can be used to upload the files in your Google Drive to an InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via the GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Extracting documents from a Google Drive folder
- Text extraction
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: If you want to sync your Google Drive to an InfraNodus graph, check out our other workflow.

How it works

Here's a description of this workflow step by step:
1. Find all the files in a specific Google Drive folder
2. For each file found, reiterate the workflow and identify the type of the file (TXT, PDF, Markdown)
3. For TXT and Markdown files, extract the text data
4. For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
5. Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the new graph, the text field with the text data, the text settings, and doNotSave=false to create a new graph (a hedged request sketch follows below)
6. Reiterate through the next file

How to use

You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes
3. Use that API key to set up authorization for the InfraNodus tool in the workflow
4. If you want to upload the files to an existing graph, copy its name from InfraNodus; otherwise you can specify any name you want

Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node)

Customizing this workflow

You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it sync with a Google Drive folder when new files appear in it.

Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
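A hedged sketch of the HTTP call behind step 5. The graphAndStatements endpoint name and the doNotSave flag come from the description above, but the base path and body field names are assumptions; verify them against the API reference at https://infranodus.com/api-access:

```bash
# Hypothetical sketch: push extracted text into a graph named "my-drive-docs".
# The base path and field names are assumptions; check the InfraNodus API docs.
curl -X POST "https://infranodus.com/api/v1/graphAndStatements?doNotSave=false" \
  -H "Authorization: Bearer $INFRANODUS_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-drive-docs", "text": "Extracted document text goes here."}'
```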
by InfraNodus
This template can be used to sync the files in your Google Drive to a new or existing InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via the GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Syncing documents from a Google Drive folder and extracting them
- Text extraction from files
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: If you want to upload files from your Google Drive to an InfraNodus graph, check out our other workflow.

How it works

Here's a description of this workflow step by step:
1. Wait for new file(s) to appear in the Google Drive folder
2. Reiterate through each file
3. Retrieve the new file from Google Drive and identify its type (TXT, PDF, Markdown)
4. For TXT and Markdown files, extract the text data
5. For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
6. Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the new graph, the text field with the text data, the text settings, and doNotSave=false to create a new graph (see the request sketch under the companion upload template above)
7. Reiterate through the next file

How to use

You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes
3. Use that API key to set up authorization for the InfraNodus tool in the workflow
4. If you want to upload the files to an existing graph, copy its name from InfraNodus; otherwise you can specify any name you want

Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node)

Customizing this workflow

You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it upload the files from a Google Drive folder when new files appear in it.

Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
by Dvir Sharon
🛒 Monitor Google Shopping Prices with Bright Data & Email Alerts

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that monitors product prices daily using Bright Data's Google Shopping dataset and sends smart email alerts when price conditions are met.

📋 Overview

This workflow provides an automated price monitoring solution that tracks product prices from Google Shopping daily and sends intelligent email notifications. Perfect for e-commerce monitoring, competitor analysis, deal hunting, and inventory management.

✨ Key Features
- 🕘 Scheduled Monitoring: Daily automated price checks at 9 AM
- 🛍️ Google Shopping Integration: Uses Bright Data's dataset for accurate pricing
- 📊 Smart Price Comparison: Compares current prices with historical data
- 📧 Intelligent Alerts: Sends emails only when prices meet criteria
- 📈 Data Storage: Updates Google Sheets with the latest pricing data
- 🔄 Batch Processing: Handles multiple products with rate limiting
- ⚡ Fast & Reliable: Built-in error handling
- 🎯 Customizable Filters: Advanced price comparison logic

🎯 What This Workflow Does
1. Schedule Trigger: Runs daily at 9 AM
2. Data Retrieval: Fetches the product list from Google Sheets
3. Price Extraction: Scrapes current prices using Bright Data
4. Data Update: Updates Google Sheets with new prices
5. Price Comparison: Compares new vs. old prices
6. Smart Filtering: Filters products that meet alert criteria
7. Email Notifications: Sends alerts for qualifying changes
8. Rate Limiting: Adds a delay between emails

Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Product URL | Original Google Shopping URL | https://shopping.google.com/product/... |
| Product Name | Product title | iPhone 15 Pro Max 256GB |
| Ratings | Product rating score | 4.5 |
| Reviews | Number of reviews | 1,247 |
| Old Price | Previous price | $1,199.00 |
| New Price | Current scraped price | $1,199.00 |
| Timestamp | When the check occurred | 2025-05-30T09:00:00Z |

🚀 Setup Instructions

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Bright Data account with Google Shopping dataset access
- Gmail account for notifications

Steps
1. Import the workflow JSON into n8n
2. Configure Bright Data credentials and dataset access
3. Set up Google Sheets with the required columns
4. Configure Gmail OAuth2 credentials
5. Update sheet IDs and schedule settings
6. Test with sample products and activate

📖 Usage Guide

Google Sheet Structure

Your Google Sheet should have the following columns to ensure the workflow functions correctly:
- Product URL (Text): The direct URL to the Google Shopping product page. This is the primary identifier for the product.
- Product Name (Text): The name of the product, automatically populated or updated by the workflow.
- Old Price (Number/Currency): The price of the product from the previous check. This column is crucial for price comparison.
- New Price (Number/Currency): The most recently scraped price of the product.
- Ratings (Number): The star rating of the product.
- Reviews (Number): The total number of reviews for the product.
- Timestamp (Datetime): The date and time when the price check was performed.

Adding Products

Add Google Shopping URLs to your Google Sheet. The workflow will fetch product details and track prices. Historical price data builds over time.

Understanding Price Alerts

The default setting for this workflow is to send an email alert when the new price equals the old price. This might seem counterintuitive, but it's useful for specific scenarios, such as:
- Monitoring stable pricing: If you are tracking a product and want to be notified when its price has remained consistent over time, indicating a potential stable buying opportunity or a benchmark.
- Verifying data consistency: To confirm that the scraping process is working correctly and consistently retrieving the same price when no changes are expected.

You can easily customize the alert logic to trigger on different conditions, as described below.

Customizing Alert Logic
- Price drops: new_price < old_price
- Significant drops: new_price < (old_price * 0.9) (price dropped by more than 10%)
- Price increases: new_price > old_price
- Any change: new_price != old_price

Reading the Results
- Real-time pricing data
- Historical tracking
- Product metadata
- Timestamps for each check

🔧 Customization Options
- Add More Data: Descriptions, availability, seller info, shipping, images
- Modify Email Templates: Customize subject and body
- Multiple Recipients: Duplicate the email node and change recipients
- Webhook Integration: Add real-time triggers or Slack alerts

🚨 Troubleshooting
- Bright Data connection failed: Check API credentials and dataset access
- No price data extracted: Verify URLs and test with different products
- Google Sheets permission denied: Re-authenticate and check sharing
- Emails not sending: Re-authenticate Gmail OAuth and verify recipients
- Filter not working: Check price formats and logic
- Workflow failed: Check logs, retry logic, and network status

📊 Use Cases & Examples
- E-commerce Monitoring: Track competitor pricing and trends
- Deal Hunting: Get alerts for price drops on wishlist items
- Inventory Management: Monitor supplier pricing for procurement
- Market Research: Analyze pricing trends and generate reports

⚙️ Advanced Configuration
- Batch Processing: Increase batch size, add delays, use parallel processing
- Price History: Store historical data, calculate averages, forecast trends
- Tool Integration: CRM, Slack, databases, BI tools (Tableau, Power BI)

📈 Performance & Limits
- Single URL: 2–5 seconds
- Concurrent Requests: 3–5 (depends on Bright Data plan)
- Data Accuracy: 95%+
- Success Rate: 90%+
- Daily Capacity: 100–500 products
- Memory: ~100MB per execution
- API Calls: 1 Bright Data + 2 Google Sheets calls per product

🤝 Support & Community
- n8n Forum: https://community.n8n.io
- Documentation: https://docs.n8n.io
- Bright Data Support: Via your Bright Data dashboard
- GitHub Issues: Report bugs and request features

🎯 Ready to Use!

This workflow provides a solid foundation for automated price monitoring. Customize it to fit your specific needs and use cases for maximum effectiveness in tracking Google Shopping prices with intelligent email notifications.

Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
by Yaron Been
Workflow Overview

This sophisticated n8n automation is a powerful lead generation and outreach tool designed to transform YouTube channel research into actionable marketing opportunities. By intelligently connecting multiple services and APIs, this workflow:

1. Discovers targeted channels: Scrapes YouTube channels based on specific keywords, extracts comprehensive channel metadata, and identifies potential business opportunities
2. Qualifies leads intelligently: Filters channels with contact emails, validates email authenticity, and ensures high-quality lead generation
3. Personalizes outreach: Sends customized cold emails, leverages channel-specific personalization, and automates the initial contact process

Key Benefits
- 🕵️ Automated Lead Discovery: Find potential collaborators or clients
- 🧠 Smart Filtering: Eliminate invalid or irrelevant leads
- 📧 Personalized Outreach: Contextual, channel-specific communication
- ⏱️ Time-Saving: Eliminate manual research and email hunting

Workflow Architecture

🔍 Stage 1: Channel Scraping
- Apify Integration: Scrapes YouTube channels
- Keyword-Based Search: Target specific niches
- Metadata Extraction: Collect channel details and emails

🧩 Stage 2: Lead Qualification
- Email Existence Check: Filter channels with contact info
- ZeroBounce Verification: Validate email authenticity (see the sketch after this template)
- Quality Control: Ensure only valid leads proceed

📬 Stage 3: Personalized Outreach
- Gmail Integration: Send customized cold emails
- Dynamic Personalization: Use channel-specific details
- Automated Communication: Streamline initial contact

Potential Use Cases
- Marketing Agencies: Find potential clients
- Influencer Marketers: Discover collaboration opportunities
- Content Creators: Network and expand professional connections
- Sales Teams: Generate targeted lead lists
- Recruitment Specialists: Identify industry professionals

Setup Requirements
- Apify account: API token, YouTube Scraper Actor, configured search keywords
- ZeroBounce account: Email verification API, validation credits
- Gmail account: OAuth2 authentication, configured sending profile
- n8n installation: Cloud or self-hosted instance; import the workflow configuration and configure API credentials

Future Enhancement Suggestions
- 🤖 AI-powered email personalization
- 📊 Advanced lead scoring mechanisms
- 🔄 Automated follow-up sequences
- 📈 Integration with CRM platforms
- 🌐 Multi-platform lead generation

Ethical Considerations
- Respect email communication guidelines
- Comply with anti-spam regulations
- Provide clear opt-out mechanisms
- Maintain professional, value-driven outreach

Connect With Me

Ready to supercharge your lead generation?
- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your outreach strategy with intelligent, automated workflows!
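As referenced in Stage 2, here is a minimal sketch of a ZeroBounce validation call using their v2 validate endpoint; the email address is a placeholder, and ip_address may be left empty:

```bash
# Returns JSON whose "status" field is "valid", "invalid", "catch-all", "unknown", etc.
curl "https://api.zerobounce.net/v2/validate?api_key=$ZEROBOUNCE_API_KEY&email=creator@example.com&ip_address="
```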
by Sergey Skorobogatov
Accept YooKassa payments and log transactions in Google Sheets

🧾 Summary

This workflow allows you to accept online payments via YooKassa and log both orders and transactions in Google Sheets — all without writing a single line of code. It supports the full payment flow: product selection, payment initiation, webhook processing, refund updates, and payment status checks.

👥 Who is this for?

This template is ideal for:
- Online stores with simple checkout flows
- Sellers of digital products or info-courses
- Entrepreneurs using Telegram bots or web forms
- Anyone needing quick payment integration with Google Sheets tracking

🎯 What problem does this workflow solve?

Setting up online payments usually requires backend infrastructure. This no-code solution automates the entire payment flow:
- Handles product listing and price retrieval
- Initiates payments with email and return URL
- Listens for payment.succeeded and refund.succeeded events
- Records every action into structured Google Sheets

⚙️ What this workflow does

1. GET /products
Returns a sorted list of products from a Google Sheet (products).

2. POST /payment
- Validates required fields (product_id, email, return_url)
- Checks the email format
- Fetches product data from products
- Generates a unique idempotence key
- Sends a request to the YooKassa API
- Saves the order into the orders sheet
- Returns a payment confirmation link

3. POST /yoomoney
Webhook to process payment/refund events:
- On payment.succeeded, adds an entry to transactions
- On refund.succeeded, updates the transaction status

4. GET /status/:id
Returns the real-time payment status from YooKassa.

🚀 Setup

1. Connect credentials:
- Google Sheets (OAuth2)
- YooKassa (Basic Auth using shopId and secretKey)

2. Set up the following Google Sheets:
- products: should contain product_id, title, price
- orders: for saving confirmed purchases
- transactions: for logging all successful or refunded payments

3. Test the endpoints using any HTTP client (a ready-to-run curl version follows after this template). Example payload for /payment:

{
  "product_id": "abc123",
  "email": "user@example.com",
  "return_url": "https://your.site/success"
}

🔧 How to customize this workflow
- Add delivery logic (e.g., email with a product link after successful payment)
- Replace Google Sheets with a database (e.g., PostgreSQL)
- Connect Telegram or other messengers for post-payment notifications
- Add promo codes, discounts, or subscription logic

💼 Use cases
- Simple online checkouts
- Telegram bots selling access
- Educational product sales
- MVP e-commerce flows
- Donation or membership payments

📎 Notes
- ✅ Includes Sticky Notes for sections
- ✅ Includes error handling and validation
- ✅ No custom code needed except UUID generation
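To exercise the flow end to end, you can POST the example payload to the workflow's webhook from the command line. The host and webhook path below are placeholders; use the production URL shown on your n8n Webhook node:

```bash
# Hypothetical test call to the /payment endpoint exposed by the Webhook node.
curl -X POST "https://your-n8n-instance.com/webhook/payment" \
  -H "Content-Type: application/json" \
  -d '{"product_id": "abc123", "email": "user@example.com", "return_url": "https://your.site/success"}'
```

A successful run should respond with the YooKassa confirmation link and add a row to the orders sheet.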
by KPendic
How it works

This workflow exports all your Cloudflare domains to a Google Sheet to give you a high-level overview of all your settings. This can help with easy debugging, searching, or similar needs.

Because the domain list can be huge, the flow uses simple paging nodes to iterate over all your domains (a minimal zones-request sketch follows at the end of this template). For each host, we merge DNS records and settings and transform them into columns, one row per domain.

Requirements

For storing and processing the data in this flow you will need:
- A Cloudflare API key/token for retrieving your data (https://dash.cloudflare.com/:account/api-tokens) (needs full access)
- A Google Sheets credential connected in your n8n instance
- The Google Sheets template: you can copy my sheet as a starting point; start by copying it to your account
- Match the Sheet ID in the 'Export' node to your newly created sheet

Official Cloudflare API documentation

For full details and specifications, please use the API documentation at https://developers.cloudflare.com/api/

Potential API timeouts

If you encounter Cloudflare API timeouts, add a simple sleep/wait node of a couple of seconds somewhere in the loop; that should resolve them.

Google Sheet

I've used Google Sheets' conditional formatting feature to visually distinguish the on/off toggles I was interested in, making it easy to get a high-level overview when debugging settings on my hosts. Feel free to apply your own logic or change it completely.
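For reference, a minimal sketch of the paged zones request that the flow's HTTP nodes revolve around, using Cloudflare's standard v4 API (the token is assumed to have zone read access):

```bash
# Fetch one page of zones; increase "page" until result_info.total_pages is reached.
curl "https://api.cloudflare.com/client/v4/zones?page=1&per_page=50" \
  -H "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  -H "Content-Type: application/json"
```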
by Incrementors
🛒 Google Maps Business Phone Number Scraper Using Bright Data API & Google Sheets Integration

This template requires a self-hosted n8n instance to run.

An automated workflow that extracts business information, including phone numbers, from Google Maps using Bright Data's API and saves the data to Google Sheets for easy access and analysis.

📋 Overview

This workflow provides an automated solution for extracting business contact information from Google Maps based on location and keyword searches. Perfect for lead generation, market research, competitor analysis, and business directory creation.

✨ Key Features
- 🎯 Form-Based Input: Easy-to-use form for location and keyword submission
- 🗺️ Google Maps Integration: Uses Bright Data's Google Maps dataset for accurate business data
- 📊 Comprehensive Data Extraction: Extracts business names, addresses, phone numbers, ratings, and more
- 📧 Automated Processing: Handles the entire scraping process automatically
- 📈 Google Sheets Storage: Automatically saves extracted data to organized spreadsheets
- 🔄 Smart Status Checking: Monitors scraping progress with automatic retry logic
- ⚡ Fast & Reliable: Professional scraping with built-in error handling
- 🎯 Customizable Output: Configurable data fields for specific business needs

🎯 What This Workflow Does

Input
- Location: Geographic area to search (city, state, country)
- Keywords: Business type or industry keywords

Processing
1. Form Submission: User submits location and keywords through a web form
2. API Request: Sends a scraping request to Bright Data's Google Maps dataset (see the request sketch at the end of this template)
3. Status Monitoring: Continuously checks scraping progress
4. Data Retrieval: Fetches completed business data when ready
5. Data Storage: Saves extracted information to Google Sheets
6. Error Handling: Implements retry logic for failed requests

Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Business Name | Official business name from Google Maps | "Joe's Pizza Restaurant" |
| Phone Number | Contact phone number | "+1-555-123-4567" |
| Address | Complete business address | "123 Main St, New York, NY 10001" |
| Rating | Google Maps rating score | 4.5 |
| URL | Google Maps listing URL | "https://maps.google.com/..." |

🚀 Setup Instructions

Prerequisites
- n8n instance (self-hosted)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- 5–10 minutes for setup

Step 1: Import the Workflow
1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

Step 2: Configure Bright Data
1. Set up Bright Data credentials: in n8n, go to Credentials → + Add credential → HTTP Request Auth, enter your Bright Data API key, and test the connection
2. Configure the dataset: ensure you have access to the Google Maps dataset (gd_m8ebnr0q2qlklc02fz) and verify dataset permissions in the Bright Data dashboard

Step 3: Configure Google Sheets Integration
1. Create a Google Sheet: go to Google Sheets, create a new spreadsheet named "Business Data" or similar, and copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
2. Set up Google Sheets credentials: in n8n, go to Credentials → + Add credential → Google Sheets OAuth2 API, then complete the OAuth setup and test the connection
3. Prepare your data sheet with columns:
   - Column A: Name
   - Column B: Address
   - Column C: Rating
   - Column D: Phone Number
   - Column E: URL

Step 4: Update Workflow Settings
1. Update the Google Sheets node: open the "Save to Google Sheets" node, replace the document ID with your Sheet ID, select your Google Sheets credential, and choose the correct sheet/tab name
2. Update the Bright Data nodes: open the HTTP Request nodes, replace BRIGHT_DATA_API_KEY with your actual API key, and verify the dataset ID matches your subscription

Step 5: Test & Activate
1. Activate the workflow (toggle switch)
2. Submit a test form with location "New York" and keywords "restaurants"
3. Verify data appears in the Google Sheet
4. Check for proper phone number extraction

📖 Usage Guide

Submitting Search Requests
1. Access the form URL provided by n8n
2. Enter the desired location (city, state, or country)
3. Enter relevant keywords (business type, industry, etc.)
4. Submit the form and wait for processing

Understanding the Results

Your Google Sheet will populate with business data including:
- Complete business contact information
- Verified phone numbers from Google Maps
- Accurate addresses and ratings
- Direct links to Google Maps listings

🔧 Customization Options

Adding More Data Points

Edit the "Bright Data API - Request Business Data" node to capture additional fields:
- Business descriptions
- Operating hours
- Review counts
- Website URLs
- Photos and videos

Modifying Search Parameters

Customize the search behavior:
- Adjust "limit_per_input" for more or fewer results
- Modify the search type and discovery method
- Add geographical coordinates for precise targeting

🚨 Troubleshooting

Common Issues & Solutions

1. "Bright Data connection failed"
- Cause: Invalid API credentials or dataset access
- Solution: Verify credentials in the Bright Data dashboard and check dataset permissions

2. "No business data extracted"
- Cause: Invalid search parameters or no results found
- Solution: Try broader keywords or different locations; verify dataset availability

3. "Google Sheets permission denied"
- Cause: Incorrect credentials or sheet permissions
- Solution: Re-authenticate Google Sheets and check sheet sharing settings

4. "Workflow execution timeout"
- Cause: Large search results or slow API response
- Solution: Reduce the search scope, increase timeout settings, and check your internet connection

📊 Use Cases & Examples

1. Lead Generation
Goal: Find potential customers in specific areas
- Search for businesses by industry and location
- Extract contact information for outreach campaigns
- Build targeted prospect lists

2. Market Research
Goal: Analyze the local business landscape
- Study competitor density in target markets
- Identify market gaps and opportunities
- Gather business intelligence for strategic planning

3. Directory Creation
Goal: Build comprehensive business directories
- Create industry-specific business listings
- Maintain updated contact databases
- Support local business communities

📈 Performance & Limits

Expected Performance
- Processing time: 1–5 minutes per search, depending on results
- Data accuracy: 95%+ for active Google Maps listings
- Success rate: 90%+ for accessible businesses
- Concurrent requests: Depends on Bright Data plan limits

Resource Usage
- Memory: ~50MB per execution
- Storage: Minimal (data stored in Google Sheets)
- API calls: 2–3 Bright Data calls + 1 Google Sheets call per search
- Bandwidth: ~1–2MB per search request
- Execution time: 2–5 minutes for typical searches

Scaling Considerations
- Rate limiting: Respect Bright Data API limits
- Error handling: Implement retry logic for failed requests
- Data validation: Add checks for incomplete business data
- Cost optimization: Monitor API usage to control expenses
- Batch processing: Group multiple searches for efficiency

🤝 Support & Community

Getting Help
- n8n Community Forum: community.n8n.io
- Documentation: docs.n8n.io
- Bright Data Support: Contact through your dashboard
- GitHub Issues: Report bugs and feature requests

Contributing
- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

🎯 Ready to Use!

This workflow provides a solid foundation for automated Google Maps business data extraction. Customize it to fit your specific needs and use cases.

Your workflow URL: https://your-n8n-instance.com/workflow/google-maps-scraper

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
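As mentioned in the processing steps, here is a hedged sketch of the trigger request that starts a Google Maps scrape. The endpoint and dataset ID come from the setup above; the input field names (location, keyword) are illustrative, so match them to what your Bright Data dataset actually expects:

```bash
# Hypothetical trigger call; the JSON response contains a snapshot_id used for status polling.
curl -X POST "https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_m8ebnr0q2qlklc02fz&include_errors=true" \
  -H "Authorization: Bearer $BRIGHT_DATA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[{"location": "New York", "keyword": "restaurants"}]'
```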
by Anandkumar C
🛠 Website Downtime Monitoring with Scheduled Checks and Email Alerts

Easily monitor your website's uptime and receive instant email alerts when it becomes unreachable — using this no-code template powered by n8n, a free and flexible workflow automation tool. This ready-to-use workflow periodically checks your website's status and sends an alert email if it's down.

⚙️ How it Works
1. Schedule Website Check: Triggers the workflow at regular intervals (every 8 hours by default).
2. Check Website Status: Sends an HTTP GET request to your site.
3. Evaluate Response: Determines if the site is reachable (expects HTTP status 200). A command-line version of this check is sketched after this template.
4. Send Downtime Alert: If the site is down, an alert email is sent to the specified address.

🔧 Steps to Customize
1. HTTP Request node: Replace https://yourdomain.com with your actual website URL.
2. Send Email node: Update the To Email and From Email fields with your addresses.
3. Adjust monitoring frequency: Modify the Schedule Trigger node to run every 5 minutes, hourly, or as needed.

✅ SMTP Configuration Instructions

Before emails can be sent, you need to configure SMTP credentials in n8n.

📨 Option 1: Gmail SMTP Setup

> Note: Gmail requires App Passwords (not your regular Gmail password) and 2FA to be enabled.

Steps:
1. Go to Google Account Security Settings.
2. Enable 2-Step Verification.
3. Go to App Passwords.
4. Create a new app password (choose Mail and Other, and name it n8n).
5. In n8n, go to Credentials → Create New → SMTP and use the following values:
   - Host: smtp.gmail.com
   - Port: 465 (SSL) or 587 (TLS)
   - User: your Gmail address (e.g., you@gmail.com)
   - Password: the App Password you generated

✉️ Option 2: Generic SMTP Setup

Use this if you're using your hosting provider's or business email SMTP server.

Example values:
- Host: smtp.yourdomain.com or provider-specific (e.g., smtp.sendgrid.net)
- Port: 587 (TLS) or 465 (SSL)
- User: your email address (e.g., alerts@yourdomain.com)
- Password: your email/SMTP password
- Secure: Yes (if using 465 or TLS-enabled 587)

Then, in the workflow's Send Email node, select the SMTP credentials you created.

📌 Requirements
- A running instance of n8n (self-hosted or n8n.cloud)
- SMTP credentials configured in n8n for email delivery
- Basic familiarity with the n8n visual editor

🧠 Pro Tips
- Rename nodes: Use clear, descriptive names for maintainability.
- Sticky notes: Use stickies on the canvas to explain the logic for others.
- Expand alerts: Integrate with Slack, Discord, or Telegram for multi-channel alerts.
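To reproduce step 3's check outside n8n, for example while deciding which URL to monitor, a one-liner like this prints only the HTTP status code (the domain is a placeholder):

```bash
# Prints e.g. "200" when the site is up; connection failures print "000".
curl -s -o /dev/null -w "%{http_code}\n" --max-time 10 "https://yourdomain.com"
```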
by Incrementors
Google Play Review Intelligence with Bright Data & Telegram Alerts

Overview

This n8n workflow automates the process of scraping Google Play Store reviews, analyzing app performance, and sending alerts for low-rated applications. It integrates with Bright Data for web scraping, Google Sheets for data storage, and Telegram for notifications.

Workflow Components

1. ✅ Trigger Input Form
- Type: Form Trigger
- Purpose: Initiates the workflow with user input
- Input fields: URL (Google Play Store app URL) and number of reviews to fetch
- Function: Captures user requirements to start the scraping process

2. 🚀 Start Scraping Request
- Type: HTTP Request (POST)
- Purpose: Sends the scraping request to the Bright Data API
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Parameters: Dataset ID gd_m6zagkt024uwvvwuyu, include errors: true, limit multiple results: 5
- Custom output fields: url, review_id, reviewer_name, review_date, review_rating, review, app_url, app_title, app_developer, app_images, app_rating, app_number_of_reviews, app_what_new, app_content_rating, app_country, num_of_reviews

3. 🔄 Check Scrape Status
- Type: HTTP Request (GET)
- Purpose: Monitors the progress of the scraping job
- Endpoint: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- Function: Checks whether the dataset scraping is complete

4. ⏱️ Wait for Response (45 sec)
- Type: Wait Node
- Purpose: Implements the polling mechanism
- Duration: 45 seconds
- Function: Pauses the workflow before checking the status again

5. 🧩 Verify Completion
- Type: IF Condition
- Purpose: Evaluates the scraping completion status
- Condition: status === "ready"
- Logic: true proceeds to fetch the data; false loops back to the status check
- A hedged command-line sketch of this poll-and-fetch loop appears at the end of this template

6. 📥 Fetch Scraped Data
- Type: HTTP Request (GET)
- Purpose: Retrieves the final scraped data
- Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- Format: JSON
- Function: Downloads the completed review and app data

7. 📊 Save to Google Sheet
- Type: Google Sheets Node
- Purpose: Stores the scraped data for analysis
- Operation: Append rows
- Target: The specified Google Sheet document
- Data mapping: URL, Review ID, Reviewer Name, Review Date, Review Rating, Review Text, App Rating, App Number of Reviews, App What's New, App Country

8. ⚠️ Check Low Ratings
- Type: IF Condition
- Purpose: Identifies poor-performing apps
- Condition: review_rating < 4
- Logic: true triggers an alert notification; false takes no action

9. 📣 Send Alert to Telegram
- Type: Telegram Node
- Purpose: Sends performance alerts
- Message format:
  ⚠️ Low App Performance Alert
  📱 App: {app_title}
  🧑‍💻 Developer: {app_developer}
  ⭐ Rating: {app_rating}
  📝 Reviews: {app_number_of_reviews}
  🔗 View on Play Store

Workflow Flow

Input Form → Start Scraping → Check Status → Wait 45s → Verify Completion
                                   ↑                          ↓
                                   └───────── Loop ───────────┘
                                                              ↓
                              Fetch Data → Save to Sheet & Check Ratings
                                                              ↓
                                                  Send Telegram Alert

Configuration Requirements

API Keys & Credentials
- Bright Data API key: Required for web scraping
- Google Sheets OAuth2: For data storage access
- Telegram bot token: For alert notifications

Setup Parameters
- Google Sheet ID: Target spreadsheet identifier
- Telegram chat ID: Destination for alerts
- n8n instance ID: Workflow instance identifier

Key Features

Data Collection
- Comprehensive app metadata extraction
- Review content and rating analysis
- Developer and country information
- App store performance metrics

Quality Monitoring
- Automated low-rating detection
- Real-time performance alerts
- Continuous data archiving

Integration Capabilities
- Bright Data web scraping service
- Google Sheets data persistence
- Telegram instant notifications
- Polling-based status monitoring

Use Cases

App Performance Monitoring
- Track rating trends over time
- Identify user sentiment patterns
- Monitor competitor performance

Quality Assurance
- Early warning for rating drops
- Customer feedback analysis
- Market reputation management

Business Intelligence
- Review sentiment analysis
- Performance benchmarking
- Strategic decision support

Technical Notes
- Polling interval: 45-second status checks
- Rating threshold: Alerts triggered for ratings < 4
- Data format: JSON with structured field mapping
- Error handling: Includes error tracking in dataset requests
- Result limiting: Maximum of 5 multiple results per request

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
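As referenced in step 5, here is a hedged command-line sketch of the poll-and-fetch cycle that the workflow implements with its Wait and IF nodes. The endpoints and the "ready" status come from the node descriptions above; SNAPSHOT_ID is the value returned by the trigger request, and jq is assumed to be available for JSON parsing:

```bash
# Poll every 45 seconds until the snapshot is ready, then download the reviews.
SNAPSHOT_ID="your_snapshot_id_here"   # returned by the Start Scraping Request node

while true; do
  STATUS=$(curl -s "https://api.brightdata.com/datasets/v3/progress/$SNAPSHOT_ID" \
    -H "Authorization: Bearer $BRIGHT_DATA_API_KEY" | jq -r '.status')
  echo "progress: $STATUS"
  [ "$STATUS" = "ready" ] && break
  sleep 45
done

curl -s "https://api.brightdata.com/datasets/v3/snapshot/$SNAPSHOT_ID?format=json" \
  -H "Authorization: Bearer $BRIGHT_DATA_API_KEY" -o reviews.json
```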