by Jason Guest
Automatically deploy n8n workflows by simply dropping JSON files into a Google Drive folder—this template watches for new exports, cleans and imports them into your n8n instance, applies a tag, and then archives the processed files.

Who is this template for?
This workflow template is designed for n8n power users and automation specialists who need a simple, reliable way to bulk-deploy or version-control n8n workflows via Google Drive. It's perfect if you:
- Manage multiple n8n instances (staging, production, etc.)
- Want an easy "drop-in" approach to publish new or updated workflows
- Prefer storing/exporting JSON in Drive rather than editing in the UI

Use case
Manually importing .json exports into n8n is slow and error‑prone. With this template you can:
- Keep your workflows in a shared Drive folder (version control friendly)
- Automatically sanitize each file so only supported settings go through
- Tag deployed workflows consistently for easy filtering
- Move processed files to a "Deployed" folder for clear change tracking

How it works
- Watch the "ToDeploy" folder in Google Drive for new .json files
- Download & parse each file into a JSON object
- Clean the payload: strip out everything except the allowed executionOrder (and timezone if you choose)
- POST the cleaned workflow to your n8n instance via /api/v1/workflows
- PUT a predefined tag onto the newly created workflow
- Move the file to your "Deployed" folder when the import succeeds, or capture the workflow name & error if it fails
A minimal sketch of the clean-and-import step is shown after this description.

Setup instructions
1. In Google Drive, create a ToDeploy folder and a Deployed folder
- Update "Google Drive Trigger -ToDeploy folder" to your ToDeploy folder
- Update "Move JSON file to Deployed folder" to your Deployed folder
2. Create an n8n API key:
- Go to Settings > n8n API
- Select Create an API key
- Copy the API key
3. In the "Get Existing Workflow Tags" node, create n8n API authentication:
- Authentication: Predefined Credential Type
- Credential Type: n8n API
- Create new credential: paste in the API key and set the Base URL to https://SUBDOMAIN.YOURDOMAINNAME.com/api/v1/
4. Add n8n API authentication to:
- "Create n8n Workflow" node
- "Set Workflow Tag" node
5. Add your n8n instance URL to the N8N_Instance_URL variable in the "Set n8n URL variable" node.
6. Run the "1. Get Workflow Tags" flow and copy the ID of your chosen tag.
7. In the "Set n8n API URL & Tag ID variables" node:
- Add the workflow tag ID to the N8N_Instance_Tag variable
- Add your n8n instance URL to the N8N_Instance_URL variable
8. Set the workflow to Active

How to adjust it to your needs
- Use different tags: run Get Existing Workflow Tags on start-up to refresh available tags, or hard-code multiple tags in the Set Workflow Tag node.
- Add notifications: connect the error branch to Slack or Email nodes so you get alerted if an import fails.
- Swap Drive for another storage: replace the Google Drive nodes with Dropbox, S3, or GitHub triggers if you prefer a different source for your JSON files.
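For reference, here is a minimal sketch, assuming the parsed export sits on each item's JSON, of what the clean step can look like in an n8n Code node; the allowed-settings list mirrors the description, while everything else (field names, the follow-up request) is illustrative rather than taken from the template.

```javascript
// n8n Code node (run once for all items): hedged sketch of the "clean payload" step.
// Assumes each incoming item holds one parsed workflow export on item.json.
const allowedSettings = ['executionOrder', 'timezone']; // keep only supported settings

return $input.all().map(item => {
  const wf = item.json;
  const settings = Object.fromEntries(
    Object.entries(wf.settings ?? {}).filter(([key]) => allowedSettings.includes(key))
  );
  // Only the fields accepted by POST /api/v1/workflows are forwarded.
  return {
    json: {
      name: wf.name,
      nodes: wf.nodes,
      connections: wf.connections,
      settings,
    },
  };
});
```

An HTTP Request node would then POST the cleaned object to https://YOUR-INSTANCE/api/v1/workflows (n8n's public API expects the key in the X-N8N-API-KEY header), and a second request would PUT the tag onto the returned workflow ID.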
by n8n Team
This workflow sends a message to a Discord channel when a new row is added or a row is updated in a Google Sheet. The message will send all data rows in the Google Sheet.

Prerequisites
- Discord account and Discord credentials.
- Google account and Google credentials.

How it works
Using a Code node, we can use the obtained Google Sheet data to create a custom message that will be sent to Discord (see the sketch below). The message is sent to the Discord channel specified in the Discord node.

Setup
This workflow requires that you set up a Discord webhook and have an existing Google Sheet with data. See how to set up a Discord webhook here.
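A minimal sketch, assuming the Google Sheets node emits one item per row, of how the Code node might assemble the Discord message; the column names used below (Name, Status) are placeholders, not the template's actual columns.

```javascript
// n8n Code node: hedged sketch that builds one Discord message from all sheet rows.
// Column names ("Name", "Status") are illustrative placeholders.
const rows = $input.all().map(item => item.json);

const lines = rows.map(
  (row, i) => `${i + 1}. ${row.Name ?? ''} - ${row.Status ?? ''}`
);

// The Discord node can then send {{ $json.content }} to the configured channel.
return [{ json: { content: `Sheet update:\n${lines.join('\n')}` } }];
```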
by ScrapeOps
Amazon Product Price Tracker

This workflow automatically monitors Amazon product prices, tracks price changes, and sends alerts when significant price fluctuations occur. Built with ScrapeOps' structured data API, it provides a reliable, maintenance-free solution for price tracking without worrying about anti-bot measures or complex selectors.

What This Workflow Does
- Monitors multiple Amazon products simultaneously using their ASINs
- Calculates both absolute and percentage price changes (see the sketch below)
- Sends customizable email alerts when prices cross defined thresholds
- Maintains a historical record of all price data for trend analysis
- Updates a Google Sheet with the latest price information

Prerequisites
- A ScrapeOps API key (register at https://scrapeops.io)
- Google account for Google Sheets integration
- SMTP email configuration for alerts

Setup Instructions
Spreadsheet Setup
- Make a copy of the template spreadsheet: https://docs.google.com/spreadsheets/d/1hRv-TBXrpN6rkIU65WorttNHt-IPWas_An0sF4Of39U
- Add your Amazon product ASINs in the "Products to Monitor" sheet
- Set your desired alert thresholds for price increases/decreases
Workflow Configuration
- Add your ScrapeOps API key to the "Setup" node
- Update the spreadsheet URL in the "Setup" node with YOUR copy
- Configure your email settings for notifications
- Adjust the schedule frequency as needed (default: hourly)

How It Works
The workflow reads product ASINs from your Google Sheet, fetches current pricing data via ScrapeOps' Amazon Product API, calculates price changes, updates your spreadsheet, and sends alerts when price movements exceed your defined thresholds. Unlike traditional web scrapers that break when websites change, this solution uses ScrapeOps' reliable API that handles all the complexity of Amazon data extraction, ensuring consistent results without maintenance.

Additional Notes
- This workflow is ideal for deal hunters, price comparison services, and e-commerce analytics
- The alerting system can be extended to additional channels like Slack or Telegram
- ScrapeOps handles all anti-bot measures, proxy management, and parsing complexities
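A minimal sketch of the price-change calculation, assuming numeric old_price / new_price fields and a 5% alert threshold; none of these names or values are taken from the template's actual Setup node.

```javascript
// n8n Code node: hedged sketch of absolute and percentage price change per product.
// Field names (old_price, new_price) and the 5% threshold are illustrative assumptions.
const THRESHOLD_PERCENT = 5;

return $input.all().map(item => {
  const oldPrice = Number(item.json.old_price);
  const newPrice = Number(item.json.new_price);
  const absoluteChange = newPrice - oldPrice;
  const percentChange = oldPrice ? (absoluteChange / oldPrice) * 100 : 0;

  item.json.absolute_change = Number(absoluteChange.toFixed(2));
  item.json.percent_change = Number(percentChange.toFixed(2));
  // A downstream IF node can route items where the move exceeds the threshold.
  item.json.should_alert = Math.abs(percentChange) >= THRESHOLD_PERCENT;
  return item;
});
```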
by Joseph
(Image Generation → Hosting → Video Generation)

This workflow is designed for creators, automation enthusiasts, and indie hackers who want to generate image-based videos automatically using AI tools — at a low cost.

⚙️ Workflow Overview
This automation performs the following steps:
- Trigger (Schedule or manual)
- Generate an image using Flux (choose between two APIs)
- Upload the image to Kraken.io to get a public URL
- Send the image to Runway ML (choose between two APIs) to generate a video
- Receive the video as a URL — ready for posting, download, or further automation

🛠️ Step-by-Step Setup

🖼️ Flux (Image Generation)
You can use either of the following providers:
Option 1: Flux by BlackForest Labs (Direct API)
- 🔑 Get your API key here: https://docs.bfl.ml/
- Paste your API key in the HTTP Request node named Flux (Blackforest)
- You can customize prompts or styles inside the JSON body
Option 2: Flux via RapidAPI
- 🔑 Subscribe and get your key here: https://rapidapi.com/poorav925/api/ai-text-to-image-generator-flux-free-api/playground/apiendpoint_e38039ee-1912-4ef9-b4d4-270d72fca851
- Enter your RapidAPI key in the X-RapidAPI-Key header
- Optional: tweak prompts, style, or resolution inside the JSON body

🐙 Kraken.io (Hosting the Image Publicly)
Runway ML requires the image to be publicly accessible. We use Kraken.io to host the generated image and return a public URL.
🔑 Get your API credentials: https://kraken.io/account/api-credentials
Setup:
- Copy your API Key and API Secret
- Open the Kraken Upload node in n8n
- Replace placeholders with your credentials
The node uploads your image and gives back a public image URL for Runway to use.

🎬 RunwayML (Video Generation)
You also have two options here:
Option 1: Runway Official API
- 🔑 Get your credentials at: https://dev.runwayml.com/
- Use the public image URL from Kraken in the JSON body
- Paste your Bearer token in the Authorization header
- Customize other settings like video length, style, FPS, etc.
Option 2: Runway via RapidAPI
- 🔑 Subscribe and get your key here: https://rapidapi.com/fortunehoppers/api/runwayml/playground/apiendpoint_93c8554d-8097-40cd-8252-3d4dec9c0e68
- Paste your RapidAPI key in the request header
- Customize prompt and generation options in the body
- Use the Kraken-generated image URL as the input source

📤 What to Do with the Video
Once the video is generated, you'll get a direct video URL. You can:
- Save it to Google Sheets or Notion
- Send it via email
- Trigger a YouTube upload automation
- Or download it manually for editing and reposting

💡 Optional Tips & Notes
- You can schedule this workflow to generate AI videos daily or weekly
- Combine it with a Google Sheet of prompts for bulk automation
- Try using a consistent visual style or theme for better branding
- This workflow is lightweight and affordable — perfect for indie projects or experimental content generation
- Great for shorts, quote visuals, music loops, AI art promos, etc.

🔗 Resources
- Flux (Blackforest) Docs
- Flux on RapidAPI
- RunwayML Official Docs
- Runway on RapidAPI
- Kraken.io API Dashboard

🙋 Need Help?
Feel free to reach out:
🐦 Twitter: @juppfy
📧 Email: joseph@uppfy.com
If you'd like to hire me for custom n8n workflows or product automations, don't hesitate to get in touch.
by Max aka Mosheh
How it works:
The n8n flow grabs the needed IDs, fetches the record's current links, adds your new one, and sends a single HTTP request to NocoDB to update the record's linked entries (see the sketch below).

Set up steps:
Plan for about 10 minutes of setup if you're already running n8n and NocoDB. You'll need to copy/paste table IDs, set up your HTTP node, and test once. No coding, just copy IDs.
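For orientation, a hedged sketch of the kind of HTTP call involved, written against NocoDB's v2 REST API; the endpoint shape, header name, and all IDs are assumptions to verify against your NocoDB version's docs. Depending on the API version, the flow may instead send the merged list of link IDs in a single record update, as described above.

```javascript
// Hedged sketch: link a new record to an existing record's link field via
// NocoDB's v2 REST API. Endpoint shape and header name should be checked against
// your NocoDB version; all IDs below are placeholders.
const baseUrl = 'https://nocodb.example.com';
const tableId = 'tbl_xxxxxxxx';      // table containing the record
const linkFieldId = 'lnk_xxxxxxxx';  // the "links" column
const recordId = 42;                 // record whose links are updated
const newLinkedRecordId = 7;         // record to append to the existing links

const res = await fetch(
  `${baseUrl}/api/v2/tables/${tableId}/links/${linkFieldId}/records/${recordId}`,
  {
    method: 'POST',
    headers: { 'xc-token': 'YOUR_API_TOKEN', 'Content-Type': 'application/json' },
    // This endpoint appends the new link; previously fetched links stay in place.
    body: JSON.stringify([{ Id: newLinkedRecordId }]),
  }
);
console.log(res.status); // 200/201 on success
```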
by scrapeless official
AI-Powered Web Data Pipeline with n8n

How It Works
This n8n workflow builds an AI-powered web data pipeline that automates the entire process of:
- Extraction
- Structuring
- Vectorization
- Storage
It integrates multiple advanced tools to transform messy web pages into clean, searchable vector databases.

Integrated Tools
- Scrapeless: Bypasses JavaScript-heavy websites and anti-bot protections to reliably extract HTML content.
- Claude AI: Uses LLMs to analyze unstructured HTML and generate clean, structured JSON data.
- Ollama Embeddings: Generates local vector embeddings from structured text using the all-minilm model.
- Qdrant Vector DB: Stores semantic vector data for fast and meaningful search capabilities.
- Webhook Notifications: Sends real-time updates when workflows complete or errors occur.
From messy webpages to structured vector data — this pipeline is perfect for building intelligent agents, knowledge bases, or research automation tools.

Setup Steps
1. Install n8n (requires Node.js v18 / v20 / v22): `npm install -g n8n`, then start it with `n8n`. After installation, access the n8n interface at http://localhost:5678
2. Set Up Scrapeless: register at Scrapeless, copy your API token, and paste it into the HTTP Request node labeled "Scrapeless Web Request"
3. Set Up Claude API (Anthropic): sign up at the Anthropic Console, generate your Claude API key, and add it to the following nodes: Claude Extractor, AI Data Checker, Claude AI Agent
4. Install and Run Ollama: macOS `brew install ollama`; Linux `curl -fsSL https://ollama.com/install.sh | sh`; Windows: download the installer from https://ollama.com. Start the server with `ollama serve` and pull the embedding model with `ollama pull all-minilm`
5. Install Qdrant (via Docker): `docker pull qdrant/qdrant`, then `docker run -d --name qdrant-server -p 6333:6333 -p 6334:6334 -v $(pwd)/qdrant_storage:/qdrant/storage qdrant/qdrant`. Test that Qdrant is running: `curl http://localhost:6333/healthz`
6. Configure the n8n Workflow: modify the trigger (manual or scheduled), input your target URLs and collection name in the designated nodes, paste all required API tokens/keys into their corresponding nodes, and ensure your Qdrant and Ollama services are running. A hedged sketch of the embed-and-store step follows below.

Ideal Use Cases
- Custom AI Chatbots
- Private Search Engines
- Research Tools
- Internal Knowledge Bases
- Content Monitoring Pipelines
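Assuming Ollama and Qdrant are running on their default local ports and the target collection already exists, a minimal sketch of the embed-and-store step as it could be written in an n8n Code node; the collection name, point ID, and payload fields are placeholders.

```javascript
// Hedged sketch: embed one structured text chunk with Ollama (all-minilm) and
// upsert it into a Qdrant collection. Assumes default local ports and an
// already-created collection; names and IDs are placeholders.
const text = 'structured text produced by the Claude extraction step';
const collection = 'web_data'; // placeholder collection name

// 1) Get the embedding from Ollama's local REST API.
const embedRes = await fetch('http://localhost:11434/api/embeddings', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ model: 'all-minilm', prompt: text }),
});
const { embedding } = await embedRes.json();

// 2) Upsert the vector plus payload into Qdrant.
await fetch(`http://localhost:6333/collections/${collection}/points`, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    points: [{ id: 1, vector: embedding, payload: { source_url: 'https://example.com', text } }],
  }),
});

return [{ json: { stored: true } }];
```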
by Niklas Hatje
Use case
To guarantee an effective sales process, deals must be distributed between sales reps in the best way possible. Normally, this involves manually assigning new deals as they come in. This workflow automates that for you!

What this workflow does
This workflow runs once a day and checks for unassigned deals in your HubSpot CRM. Once it finds one, it enriches the deal with information about the assigned contact and their company. It then checks the region of the assigned company before looking at the company's employee size. Based on this, it assigns the deal to the right sales rep within your company (a hedged sketch of this routing logic is shown below).

Requirements
- New deals in HubSpot need to be unassigned in the beginning
- New deals have to have an attached contact that has an attached company in HubSpot
- The company needs to have values for region and employee count in HubSpot

Setup
The setup is quite straightforward and will probably only take a few minutes.
- Add your HubSpot credentials
- Customize your criteria for assigning deals in the Assign by Region and the following Assign nodes
- Make sure deals are assigned to the right sales rep in the HubSpot nodes at the end
- Activate the workflow

Customizing this to your needs
- Adjust the trigger interval to your needs. Currently, it defaults to once a day
- Adjust your region settings by adding/updating/removing options in the respective node
- Adjust your employee size settings by adding/updating/removing options in the respective node

Ideas to enhance this flow
- Wrap each region's assignment criteria into different sub-workflows for easier maintainability. This will not consume additional execution counts.
- Add more logic for what happens when a deal does not match any criteria you've set
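A minimal sketch of the region and employee-size routing as it could look in a Code node; the region names, size threshold, and owner IDs are illustrative assumptions, not the template's actual criteria (which live in the Assign nodes).

```javascript
// Hedged sketch: pick a HubSpot owner based on company region and employee count.
// Regions, thresholds, and owner IDs below are illustrative placeholders.
const routing = {
  EMEA: { small: '111', large: '222' },   // owner IDs per segment
  AMER: { small: '333', large: '444' },
};
const SMALL_MAX_EMPLOYEES = 200;

return $input.all().map(item => {
  const region = item.json.company_region;          // e.g. "EMEA"
  const employees = Number(item.json.employee_count);
  const segment = employees <= SMALL_MAX_EMPLOYEES ? 'small' : 'large';

  // Fall back to null so a later branch can handle deals that match no criteria.
  item.json.assigned_owner_id = routing[region]?.[segment] ?? null;
  return item;
});
```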
by Incrementors
Google Play Review Intelligence with Bright Data & Telegram Alerts

Overview
This n8n workflow automates the process of scraping Google Play Store reviews, analyzing app performance, and sending alerts for low-rated applications. It integrates with Bright Data for web scraping, Google Sheets for data storage, and Telegram for notifications.

Workflow Components
1. ✅ Trigger Input Form
- Type: Form Trigger
- Purpose: Initiates the workflow with user input
- Input Fields: URL (Google Play Store app URL), number of reviews to fetch
- Function: Captures user requirements to start the scraping process
2. 🚀 Start Scraping Request
- Type: HTTP Request (POST)
- Purpose: Sends the scraping request to the Bright Data API
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Parameters: Dataset ID gd_m6zagkt024uwvvwuyu, include errors: true, limit multiple results: 5
- Custom Output Fields: url, review_id, reviewer_name, review_date, review_rating, review, app_url, app_title, app_developer, app_images, app_rating, app_number_of_reviews, app_what_new, app_content_rating, app_country, num_of_reviews
3. 🔄 Check Scrape Status
- Type: HTTP Request (GET)
- Purpose: Monitors the progress of the scraping job
- Endpoint: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- Function: Checks if the dataset scraping is complete
4. ⏱️ Wait for Response 45 sec
- Type: Wait Node
- Purpose: Implements the polling mechanism
- Duration: 45 seconds
- Function: Pauses the workflow before checking the status again
5. 🧩 Verify Completion
- Type: IF Condition
- Purpose: Evaluates scraping completion status
- Condition: status === "ready"
- Logic: True: proceeds to fetch data; False: loops back to the status check
6. 📥 Fetch Scraped Data
- Type: HTTP Request (GET)
- Purpose: Retrieves the final scraped data
- Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- Format: JSON
- Function: Downloads completed review and app data
7. 📊 Save to Google Sheet
- Type: Google Sheets Node
- Purpose: Stores scraped data for analysis
- Operation: Append rows
- Target: Specified Google Sheet document
- Data Mapping: URL, Review ID, Reviewer Name, Review Date, Review Rating, Review Text, App Rating, App Number of Reviews, App What's New, App Country
8. ⚠️ Check Low Ratings
- Type: IF Condition
- Purpose: Identifies poor-performing apps
- Condition: review_rating < 4
- Logic: True: triggers an alert notification; False: no action taken
9. 📣 Send Alert to Telegram
- Type: Telegram Node
- Purpose: Sends performance alerts
- Message Format:
⚠️ Low App Performance Alert
📱 App: {app_title}
🧑‍💻 Developer: {app_developer}
⭐ Rating: {app_rating}
📝 Reviews: {app_number_of_reviews}
🔗 View on Play Store

Workflow Flow
Input Form → Start Scraping → Check Status → Wait 45s → Verify Completion (loops back to Check Status until ready) → Fetch Data → Save to Sheet & Check Ratings → Send Telegram Alert
A hedged sketch of the trigger-and-poll calls is shown after this description.

Configuration Requirements
API Keys & Credentials
- Bright Data API Key: Required for web scraping
- Google Sheets OAuth2: For data storage access
- Telegram Bot Token: For alert notifications
Setup Parameters
- Google Sheet ID: Target spreadsheet identifier
- Telegram Chat ID: Destination for alerts
- N8N Instance ID: Workflow instance identifier

Key Features
Data Collection
- Comprehensive app metadata extraction
- Review content and rating analysis
- Developer and country information
- App store performance metrics
Quality Monitoring
- Automated low-rating detection
- Real-time performance alerts
- Continuous data archiving
Integration Capabilities
- Bright Data web scraping service
- Google Sheets data persistence
- Telegram instant notifications
- Polling-based status monitoring

Use Cases
App Performance Monitoring
- Track rating trends over time
- Identify user sentiment patterns
- Monitor competitor performance
Quality Assurance
- Early warning for rating drops
- Customer feedback analysis
- Market reputation management
Business Intelligence
- Review sentiment analysis
- Performance benchmarking
- Strategic decision support

Technical Notes
- Polling Interval: 45-second status checks
- Rating Threshold: Alerts triggered for ratings < 4
- Data Format: JSON with structured field mapping
- Error Handling: Includes error tracking in dataset requests
- Result Limiting: Maximum of 5 multiple results per request

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
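A hedged sketch of the trigger-and-poll pattern the components above implement, using the Bright Data endpoints listed in the description; the request body shape and query parameters are assumptions to check against Bright Data's dataset documentation.

```javascript
// Hedged sketch of the trigger-and-poll pattern described above, using the
// Bright Data endpoints listed in the workflow. The request body shape and
// query parameters are assumptions; check Bright Data's dataset docs.
const API_KEY = 'YOUR_BRIGHT_DATA_API_KEY';
const headers = { Authorization: `Bearer ${API_KEY}`, 'Content-Type': 'application/json' };

// 1) Trigger the dataset run for one Play Store URL (placeholder app URL).
const trigger = await fetch(
  'https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_m6zagkt024uwvvwuyu&include_errors=true',
  { method: 'POST', headers, body: JSON.stringify([{ url: 'https://play.google.com/store/apps/details?id=example.app' }]) }
);
const { snapshot_id } = await trigger.json();

// 2) Poll progress every 45 seconds until the snapshot is ready (the workflow
//    does this with a Wait node plus an IF node instead of a loop in code).
let status = '';
while (status !== 'ready') {
  await new Promise(r => setTimeout(r, 45_000));
  const progress = await fetch(`https://api.brightdata.com/datasets/v3/progress/${snapshot_id}`, { headers });
  status = (await progress.json()).status;
}

// 3) Download the finished snapshot as JSON.
const data = await fetch(`https://api.brightdata.com/datasets/v3/snapshot/${snapshot_id}?format=json`, { headers });
console.log(await data.json());
```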
by InfraNodus
This template can be used to upload the files in your Google Drive to an InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via its GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Extracting documents from a Google Drive folder
- Text extraction
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: if you want to Sync your Google Drive to an InfraNodus graph, check out our other workflow.

How it works
Here's a description of this workflow, step by step:
- Find all the files in a specific Google Drive folder
- For each file found, reiterate the workflow and identify the type of the file (TXT, PDF, Markdown)
- For TXT and Markdown files, extract the text data
- For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
- Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the new graph, the text field with the text data, the text settings, and doNotSave=false to create a new graph (a hedged sketch of this request is shown below)
- Reiterate through another file

How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
- Create an InfraNodus account
- Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes
- Use that API key to set up authorization for the InfraNodus tool in the workflow
If you want to upload the files to an existing graph, you should copy its name from InfraNodus. Otherwise you can specify any name you want.

Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node)

Customizing this workflow
You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it sync with a Google Drive folder when new files appear in it.
Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
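A hedged sketch of the graphAndStatements request, using only the fields the description mentions (graph name, text data, doNotSave) plus placeholder values; the exact base URL and any additional text-settings fields should be taken from the InfraNodus API docs.

```javascript
// Hedged sketch of the call to InfraNodus's graphAndStatements endpoint, using
// only the fields the description mentions plus placeholders. Check
// https://infranodus.com/api-access for the exact base URL and parameter names.
const INFRANODUS_API_KEY = 'YOUR_BEARER_KEY';
const graphName = 'my_drive_documents';   // existing graph name or any new name
const extractedText = 'text extracted from the current Drive file';

const res = await fetch('https://infranodus.com/api/v1/graphAndStatements', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${INFRANODUS_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    name: graphName,
    text: extractedText,
    doNotSave: false, // save the statements into the (new or existing) graph
  }),
});
console.log(res.status); // the loop repeats this request for every file
```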
by InfraNodus
This template can be used to sync the files in your Google Drive to a new or existing InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via its GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Syncing documents from a Google Drive folder / extracting them
- Text extraction from files
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: if you want to upload files from your Google Drive to an InfraNodus graph, check out our other workflow.

How it works
Here's a description of this workflow, step by step:
- Wait for new file(s) to appear in the Google Drive folder
- Reiterate through each file
- Retrieve the new file from Google Drive
- For each file found, reiterate the workflow and identify the type of the file (TXT, PDF, Markdown)
- For TXT and Markdown files, extract the text data
- For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
- Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the new graph, the text field with the text data, the text settings, and doNotSave=false to create a new graph
- Reiterate through another file

How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
- Create an InfraNodus account
- Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes
- Use that API key to set up authorization for the InfraNodus tool in the workflow
If you want to upload the files to an existing graph, you should copy its name from InfraNodus. Otherwise you can specify any name you want.

Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node)

Customizing this workflow
You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it upload the files from a Google Drive folder when new files appear in it.
Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
by Dvir Sharon
🛒 Monitor Google Shopping Prices with Bright Data & Email Alerts

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that monitors product prices daily using Bright Data's Google Shopping dataset and sends smart email alerts when price conditions are met.

📋 Overview
This workflow provides an automated price monitoring solution that tracks product prices from Google Shopping daily and sends intelligent email notifications. Perfect for e-commerce monitoring, competitor analysis, deal hunting, and inventory management.

✨ Key Features
- 🕘 Scheduled Monitoring: Daily automated price checks at 9 AM
- 🛍️ Google Shopping Integration: Uses Bright Data's dataset for accurate pricing
- 📊 Smart Price Comparison: Compares current prices with historical data
- 📧 Intelligent Alerts: Sends emails only when prices meet criteria
- 📈 Data Storage: Updates Google Sheets with the latest pricing data
- 🔄 Batch Processing: Handles multiple products with rate limiting
- ⚡ Fast & Reliable: Built-in error handling
- 🎯 Customizable Filters: Advanced price comparison logic

🎯 What This Workflow Does
- Schedule Trigger: Runs daily at 9 AM
- Data Retrieval: Fetches the product list from Google Sheets
- Price Extraction: Scrapes current prices using Bright Data
- Data Update: Updates Google Sheets with new prices
- Price Comparison: Compares new vs. old prices
- Smart Filtering: Filters products that meet the alert criteria
- Email Notifications: Sends alerts for qualifying changes
- Rate Limiting: Adds a delay between emails

Output Data Points

| Field | Description | Example |
| :--- | :--- | :--- |
| Product URL | Original Google Shopping URL | https://shopping.google.com/product/... |
| Product Name | Product title | iPhone 15 Pro Max 256GB |
| Ratings | Product rating score | 4.5 |
| Reviews | Number of reviews | 1,247 |
| Old Price | Previous price | $1,199.00 |
| New Price | Current scraped price | $1,199.00 |
| Timestamp | When the check occurred | 2025-05-30T09:00:00Z |

🚀 Setup Instructions
Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Shopping dataset access
- Gmail account for notifications
Steps
- Import the workflow JSON into n8n
- Configure Bright Data credentials and dataset access
- Set up Google Sheets with the required columns
- Configure Gmail OAuth2 credentials
- Update sheet IDs and schedule settings
- Test with sample products and activate

📖 Usage Guide
Google Sheet Structure
Your Google Sheet should have the following columns to ensure the workflow functions correctly:
- Product URL (Text): The direct URL to the Google Shopping product page. This is the primary identifier for the product.
- Product Name (Text): The name of the product. This will be automatically populated or updated by the workflow.
- Old Price (Number/Currency): The price of the product from the previous check. This column is crucial for price comparison.
- New Price (Number/Currency): The most recently scraped price of the product.
- Ratings (Number): The star rating of the product.
- Reviews (Number): The total number of reviews for the product.
- Timestamp (Datetime): The date and time when the price check was performed.
Adding Products
Add Google Shopping URLs to your Google Sheet. The workflow will fetch product details and track prices. Historical price data builds over time.
Understanding Price Alerts
The default setting for this workflow is to send an email alert when the new price equals the old price. This might seem counterintuitive, but it is useful for specific scenarios, such as:
- Monitoring stable pricing: if you are tracking a product and want to be notified when its price has remained consistent over time, indicating a potential stable buying opportunity or a benchmark.
- Verifying data consistency: to confirm that the scraping process is working correctly and consistently retrieving the same price when no changes are expected.
You can easily customize the alert logic to trigger on different conditions as described below.
Customizing Alert Logic
- Price drops: new_price < old_price
- Significant drops: new_price < (old_price * 0.9) (e.g., the price dropped by more than 10%)
- Price increases: new_price > old_price
- Any change: new_price != old_price
A hedged sketch of this filter condition is shown after this description.
Reading the Results
- Real-time pricing data
- Historical tracking
- Product metadata
- Timestamps for each check

🔧 Customization Options
- Add More Data: descriptions, availability, seller info, shipping, images
- Modify Email Templates: customize subject and body
- Multiple Recipients: duplicate the email node and change recipients
- Webhook Integration: add real-time triggers or Slack alerts

🚨 Troubleshooting
- Bright Data connection failed: check API credentials and dataset access
- No price data extracted: verify URLs and test with different products
- Google Sheets permission denied: re-authenticate and check sharing
- Emails not sending: re-authorize Gmail OAuth and verify recipients
- Filter not working: check price formats and logic
- Workflow failed: check logs, retry logic, and network status

📊 Use Cases & Examples
- E-commerce Monitoring: track competitor pricing and trends
- Deal Hunting: get alerts for price drops on wishlist items
- Inventory Management: monitor supplier pricing for procurement
- Market Research: analyze pricing trends and generate reports

⚙️ Advanced Configuration
- Batch Processing: increase batch size, add delays, use parallel processing
- Price History: store historical data, calculate averages, forecast trends
- Tool Integration: CRM, Slack, databases, BI tools (Tableau, Power BI)

📈 Performance & Limits
- Single URL: 2–5 seconds
- Concurrent Requests: 3–5 (depends on Bright Data plan)
- Data Accuracy: 95%+
- Success Rate: 90%+
- Daily Capacity: 100–500 products
- Memory: ~100 MB per execution
- API Calls: 1 Bright Data + 2 Google Sheets per product

🤝 Support & Community
- n8n Forum: https://community.n8n.io
- Documentation: https://docs.n8n.io
- Bright Data Support: via your Bright Data dashboard
- GitHub Issues: report bugs and request features

🎯 Ready to Use!
Your workflow provides a solid foundation for automated price monitoring. Customize it to fit your specific needs and use cases for maximum effectiveness in tracking Google Shopping prices with intelligent email notifications.

Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
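A minimal sketch of the alert condition in code form (the template itself uses an IF/Filter node); the default equality check is shown, and any of the comparisons listed above can be swapped in.

```javascript
// Hedged sketch of the alert condition (the template uses an IF/Filter node;
// this shows the same logic in code). Swap the comparison for any of the
// conditions listed above, e.g. a significant drop.
return $input.all().filter(item => {
  const oldPrice = Number(item.json.old_price);
  const newPrice = Number(item.json.new_price);

  // Default template behaviour: alert when the price is unchanged.
  // For "significant drop" use: newPrice < oldPrice * 0.9
  return newPrice === oldPrice;
});
```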
by Yaron Been
Workflow Overview
This sophisticated n8n automation is a powerful lead generation and outreach tool designed to transform YouTube channel research into actionable marketing opportunities. By intelligently connecting multiple services and APIs, this workflow:
- Discovers Targeted Channels: scrapes YouTube channels based on specific keywords, extracts comprehensive channel metadata, and identifies potential business opportunities
- Intelligent Lead Qualification: filters channels with contact emails, validates email authenticity, and ensures high-quality lead generation
- Personalized Outreach: sends customized cold emails, leverages channel-specific personalization, and automates the initial contact process

Key Benefits
- 🕵️ Automated Lead Discovery: Find potential collaborators or clients
- 🧠 Smart Filtering: Eliminate invalid or irrelevant leads
- 📧 Personalized Outreach: Contextual, channel-specific communication
- ⏱️ Time-Saving: Eliminate manual research and email hunting

Workflow Architecture
🔍 Stage 1: Channel Scraping
- Apify Integration: Scrapes YouTube channels
- Keyword-Based Search: Target specific niches
- Metadata Extraction: Collect channel details and emails
🧩 Stage 2: Lead Qualification
- Email Existence Check: Filter channels with contact info
- ZeroBounce Verification: Validate email authenticity
- Quality Control: Ensure only valid leads proceed
(A hedged sketch of this qualification step is shown after this description.)
📬 Stage 3: Personalized Outreach
- Gmail Integration: Send customized cold emails
- Dynamic Personalization: Use channel-specific details
- Automated Communication: Streamline initial contact

Potential Use Cases
- Marketing Agencies: Find potential clients
- Influencer Marketers: Discover collaboration opportunities
- Content Creators: Network and expand professional connections
- Sales Teams: Generate targeted lead lists
- Recruitment Specialists: Identify industry professionals

Setup Requirements
- Apify Account: API token, YouTube Scraper Actor, configured search keywords
- ZeroBounce Account: email verification API, validation credits
- Gmail Account: OAuth2 authentication, configured sending profile
- n8n Installation: cloud or self-hosted instance, import the workflow configuration, configure API credentials

Future Enhancement Suggestions
- 🤖 AI-powered email personalization
- 📊 Advanced lead scoring mechanisms
- 🔄 Automated follow-up sequences
- 📈 Integration with CRM platforms
- 🌐 Multi-platform lead generation

Ethical Considerations
- Respect email communication guidelines
- Comply with anti-spam regulations
- Provide clear opt-out mechanisms
- Maintain professional, value-driven outreach

Connect With Me
Ready to supercharge your lead generation?
📧 Email: Yaron@nofluff.online
🎥 YouTube: @YaronBeen
💼 LinkedIn: Yaron Been
Transform your outreach strategy with intelligent, automated workflows!
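A minimal sketch of Stage 2, assuming the Apify scrape puts the contact address on an email field; the field name is a placeholder, while the validation call follows ZeroBounce's v2 validate endpoint.

```javascript
// Hedged sketch of the lead-qualification stage: keep only channels that have
// a contact email and that ZeroBounce reports as "valid". The incoming field
// name is an assumption; the endpoint is ZeroBounce's v2 validate API.
const ZEROBOUNCE_API_KEY = 'YOUR_ZEROBOUNCE_KEY';
const qualified = [];

for (const item of $input.all()) {
  const email = item.json.email; // channel contact email from the Apify scrape
  if (!email) continue;          // drop channels without a contact address

  const res = await fetch(
    `https://api.zerobounce.net/v2/validate?api_key=${ZEROBOUNCE_API_KEY}&email=${encodeURIComponent(email)}`
  );
  const { status } = await res.json();
  if (status === 'valid') qualified.push(item); // only verified leads proceed
}

return qualified;
```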