by Niklas Hatje
## Use case
To guarantee an effective sales process, deals must be distributed among sales reps in the best way possible. Normally, this involves manually assigning each new deal as it comes in. This workflow automates that for you!

## What this workflow does
This workflow runs once a day and checks for unassigned deals in your HubSpot CRM. Once it finds one, it enriches the deal with information about the attached contact and their company. It then checks the region of the company before looking at the company's employee size. Based on this, it assigns the deal to the right sales rep within your company.

## Requirements
- New deals in HubSpot need to be unassigned at the start
- New deals must have an attached contact that has an attached company in HubSpot
- The company needs to have values for region and employee count in HubSpot

## Setup
The setup is quite straightforward and will probably take only a few minutes:
1. Add your HubSpot credentials
2. Customize your criteria for assigning deals in the Assign by Region node and the following Assign nodes (a sketch of this logic appears at the end of this description)
3. Make sure deals are assigned to the right sales rep in the HubSpot nodes at the end
4. Activate the workflow

## Customizing this to your needs
- Adjust the trigger interval to your needs. Currently, it defaults to once a day.
- Adjust your region settings by adding/updating/removing options in the respective node
- Adjust your employee-size settings by adding/updating/removing options in the respective node

## Ideas to enhance this flow
- Wrap each region's assignment criteria into separate sub-workflows for easier maintainability. This will not consume additional execution counts.
- Add more logic for what happens when a deal does not match any criteria you've set
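If you'd rather keep the mapping in one place than chain several Assign nodes, the criteria can be expressed as a single lookup in a Code node. This is a minimal sketch only: the region names, the employee-size cutoff, and the HubSpot owner IDs are placeholders, so mirror whatever values you actually use in HubSpot.

```javascript
// Hypothetical Code-node sketch of the assignment criteria.
// Region names, the size-band cutoff, and owner IDs are placeholders.
const region = $json.company_region;                    // assumed enrichment field
const employees = Number($json.company_employee_count); // assumed enrichment field

const reps = {
  EMEA: { small: 'owner-id-anna', large: 'owner-id-ben' },
  AMER: { small: 'owner-id-carla', large: 'owner-id-dan' },
};

const band = employees < 200 ? 'small' : 'large';
const ownerId = reps[region]?.[band] ?? 'owner-id-fallback'; // catches unmatched deals

return [{ json: { ...$json, hubspot_owner_id: ownerId } }];
```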
by Akhil Varma Gadiraju
# n8n Workflow: Sync Workflows with GitLab

## How It Works
This workflow ensures that your self-hosted n8n workflows are version-controlled in a GitLab repository. It compares each current workflow from n8n with its stored counterpart in GitLab. If any differences are detected, the GitLab file is updated with the latest version.

### Core Logic
1. **Retrieve Workflows** – Fetch all workflows from the n8n REST API.
2. **Compare with GitLab** – For each workflow, fetch the corresponding file from GitLab and compare the JSON (a sketch of this step appears at the end of this description).
3. **Update if Changed** – If differences exist, commit the updated workflow to GitLab using its API.

## Setup
Before using the workflow, ensure the following:

### Prerequisites
- **n8n**: Self-hosted instance with access to the /rest/workflows API.
- **GitLab**: A repository where workflows will be stored, and a Personal Access Token (PAT) with api and write_repository permissions.
- **n8n Nodes Required**:
  - HTTP Request (to call n8n and GitLab APIs)
  - Code or Function nodes (for diffing and formatting)
  - Looping (SplitInBatches or similar)

### Configuration
Set environment variables or workflow credentials for:
- GITLAB_TOKEN
- GITLAB_REPO
- GITLAB_BRANCH (e.g., main)
- GITLAB_FILE_PATH_PREFIX (e.g., n8n-workflows/)

## How to Use
1. **Import the Workflow** into your n8n instance.
2. **Configure GitLab API Credentials**: Set the GitLab PAT as a header in the HTTP Request node: `Private-Token: {{ $env.GITLAB_TOKEN }}`
3. **Map Workflows to GitLab Paths**: Use the workflow name or ID to create the file path. Example: n8n-workflows/workflow-name.json
4. **Trigger the Workflow**: Can be triggered manually, or scheduled to run at intervals (e.g., daily).
5. **Review Commits in GitLab**: Each updated workflow will be committed with a message like: "Update workflow: Sample Workflow"

## Disclaimer
- This workflow does not handle merge conflicts or manual edits made directly in GitLab. Always ensure proper coordination if multiple sources are modifying workflows.
- Only structural changes are tracked. Non-functional metadata (like timestamps or IDs) may trigger false positives unless filtered.
- Use at your own risk. Test in a safe environment before applying to production workflows.
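The comparison step is where the false positives mentioned in the disclaimer creep in, so it helps to strip volatile metadata before diffing. A minimal sketch of that Code node, assuming the n8n workflow JSON and the decoded GitLab file arrive as fields on the incoming item (the field names here are illustrative):

```javascript
// Sketch of the diff step. `n8nWorkflow` and `gitlabWorkflow` are assumed
// field names; adjust them to match how your HTTP Request nodes output data.
const stripVolatile = (wf) => {
  // Drop metadata that changes without the workflow itself changing.
  const { updatedAt, versionId, ...rest } = wf;
  return rest;
};

const current = stripVolatile($json.n8nWorkflow);
const stored = stripVolatile($json.gitlabWorkflow);

// Note: string comparison is key-order sensitive; use a deep-equal helper
// if your two sources serialize keys in different orders.
const changed = JSON.stringify(current) !== JSON.stringify(stored);

return [{ json: { changed, workflowId: current.id, name: current.name } }];
```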
by Joseph
(Image Generation → Hosting → Video Generation)

This workflow is designed for creators, automation enthusiasts, and indie hackers who want to generate image-based videos automatically using AI tools, at a low cost.

## ⚙️ Workflow Overview
This automation performs the following steps:
1. Trigger (schedule or manual)
2. Generate an image using Flux (choose between two APIs)
3. Upload the image to Kraken.io to get a public URL
4. Send the image to Runway ML (choose between two APIs) to generate a video
5. Receive the video as a URL, ready for posting, download, or further automation

## 🛠️ Step-by-Step Setup

### 🖼️ Flux (Image Generation)
You can use either of the following providers:

**Option 1: Flux by BlackForest Labs (Direct API)**
- 🔑 Get your API key here: https://docs.bfl.ml/
- Paste your API key in the HTTP Request node named Flux (Blackforest)
- You can customize prompts or styles inside the JSON body

**Option 2: Flux via RapidAPI**
- 🔑 Subscribe and get your key here: https://rapidapi.com/poorav925/api/ai-text-to-image-generator-flux-free-api/playground/apiendpoint_e38039ee-1912-4ef9-b4d4-270d72fca851
- Enter your RapidAPI key in the X-RapidAPI-Key header
- Optional: tweak prompts, style, or resolution inside the JSON body

### 🐙 Kraken.io (Hosting the Image Publicly)
Runway ML requires the image to be publicly accessible. We use Kraken.io to host the generated image and return a public URL.

🔑 Get your API credentials: https://kraken.io/account/api-credentials

Setup:
1. Copy your API Key and API Secret
2. Open the Kraken Upload node in n8n
3. Replace the placeholders with your credentials

The node uploads your image and gives back a public image URL for Runway to use.

### 🎬 RunwayML (Video Generation)
You also have two options here (see the request sketch at the end of this description):

**Option 1: Runway Official API**
- 🔑 Get your credentials at: https://dev.runwayml.com/
- Use the public image URL from Kraken in the JSON body
- Paste your Bearer token in the Authorization header
- Customize other settings like video length, style, FPS, etc.

**Option 2: Runway via RapidAPI**
- 🔑 Subscribe and get your key here: https://rapidapi.com/fortunehoppers/api/runwayml/playground/apiendpoint_93c8554d-8097-40cd-8252-3d4dec9c0e68
- Paste your RapidAPI key in the request header
- Customize prompt and generation options in the body
- Use the Kraken-generated image URL as the input source

## 📤 What to Do with the Video
Once the video is generated, you'll get a direct video URL. You can:
- Save it to Google Sheets or Notion
- Send it via email
- Trigger a YouTube upload automation
- Or download it manually for editing and reposting

## 💡 Optional Tips & Notes
- You can schedule this workflow to generate AI videos daily or weekly
- Combine it with a Google Sheet of prompts for bulk automation
- Try using a consistent visual style or theme for better branding
- This workflow is lightweight and affordable, perfect for indie projects or experimental content generation
- Great for shorts, quote visuals, music loops, AI art promos, etc.

## 🔗 Resources
- Flux (Blackforest) Docs
- Flux on RapidAPI
- RunwayML Official Docs
- Runway on RapidAPI
- Kraken.io API Dashboard

## 🙋 Need Help?
Feel free to reach out:
- 🐦 Twitter: @juppfy
- 📧 Email: joseph@uppfy.com

If you'd like to hire me for custom n8n workflows or product automations, don't hesitate to get in touch.
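As a reference for the Runway step, here is a rough sketch of how a Code node could assemble the video-generation request from Kraken's response. Kraken.io returns the hosted file's URL in a `kraked_url` field; the Runway body field names shown here are illustrative, so check the official docs (https://dev.runwayml.com/) for the current parameter names.

```javascript
// Hypothetical sketch: build the Runway request body from Kraken's output.
// Body field names are illustrative; verify them against the Runway docs.
const imageUrl = $json.kraked_url; // public URL returned by the Kraken upload

const body = {
  promptImage: imageUrl,
  promptText: 'slow cinematic zoom, soft ambient light',
  duration: 5, // seconds
};

// The HTTP Request node then sends this JSON with an
// `Authorization: Bearer <RUNWAY_API_KEY>` header.
return [{ json: { body } }];
```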
by Max aka Mosheh
## How it works
The n8n flow grabs the needed IDs, fetches the current links, adds your new one, and sends a single HTTP request to NocoDB to update the record's linked entries (see the sketch below).

## Set up steps
Plan for 10 minutes of setup if you're already running n8n and NocoDB. You'll need to copy/paste table IDs, set up your HTTP node, and test once. No coding, just copy IDs.
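For reference, the final request could look like the sketch below, assuming NocoDB's v2 "links" endpoint (verify the exact path against your instance's API docs). All IDs, the base URL, and the token are placeholders you copy from NocoDB.

```javascript
// Hypothetical sketch of the single HTTP call that updates linked entries.
const NC_URL = 'https://nocodb.example.com'; // placeholder base URL
const NC_TOKEN = 'YOUR_XC_TOKEN';            // placeholder API token
const TABLE_ID = 'tblXXXXXXXXXXXX';          // table that owns the record
const LINK_FIELD_ID = 'cXXXXXXXXXXX';        // the Links column's field ID
const RECORD_ID = 7;                         // record whose links you are updating

// Linking is additive: POST only the IDs of the rows you want to add,
// so the record's existing linked entries are preserved.
const res = await fetch(
  `${NC_URL}/api/v2/tables/${TABLE_ID}/links/${LINK_FIELD_ID}/records/${RECORD_ID}`,
  {
    method: 'POST',
    headers: { 'xc-token': NC_TOKEN, 'Content-Type': 'application/json' },
    body: JSON.stringify([{ Id: 42 }]), // the row to link
  },
);

return [{ json: { ok: res.ok } }];
```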
by PollupAI
Never forget to send a satisfaction survey again! This workflow helps you automatically send CSAT surveys when a Freshdesk ticket is marked "Resolved", and logs every response in Google Sheets for easy analysis, reporting, and escalation workflows.

## 💡 Built for CS and ops teams who care about real feedback
This template is perfect for:
- **Customer Support Teams** who want timely, consistent survey delivery after every resolved ticket.
- **Ops Leads & Admins** tired of managing spreadsheets and survey tools manually.
- **Businesses using Freshdesk** looking for a no-code feedback loop.
- **Automation fans** who want to track, trigger, and take action, automatically.

## 🧩 What problem does it solve?
Manual survey processes are slow, inconsistent, and hard to scale. This automation ensures:
- **Fast survey delivery** when experiences are still fresh.
- **No duplicate emails** thanks to a built-in tracking system.
- **Centralized feedback** in a Google Sheet, so there's no more digging through platforms.
- **Data you can act on**, like triggering Slack alerts for poor scores.

## ⚙️ How it works

### 📨 Part 1: Auto-send the survey when a ticket is resolved
1. **Trigger**: Workflow runs on a schedule (or manually via "Test").
2. **Pull ticket status** from Freshdesk.
3. **Compare ticket status** to the last known status in Google Sheets.
4. **Detect resolution**: If status = "Resolved" (ID 4), move forward.
5. **Update the Google Sheet** to track that the survey was sent.
6. **Fetch the customer's email** from Freshdesk.
7. **Create & send the survey email**, personalized with ticket info and your brand.
8. **Convert Markdown → HTML** for a well-formatted email.

### 📥 Part 2: Collect responses and store in Sheets
1. **Form Trigger**: Customer clicks the survey link and fills in the form.
2. **Capture responses** (e.g., rating + comments).
3. **Log feedback** in a second Google Sheet for analysis.

You can extend this by adding escalation steps (e.g., flagging 1–2 star ratings to managers).

## 🚀 Setup Instructions

### 🔐 Connect your tools
- **Freshdesk**: Add your API credentials to the get tickets and get client nodes.
- **Google Sheets**: Authenticate in the get existing tickets, update status, and save survey nodes.
- **Email (SMTP)**: Add your SMTP details in the "Send Email" node, or swap in Gmail, SendGrid, etc.

### 🛠 Set your data
In the Set your data node, enter:
- Your name, email, company, and position
- Your survey form link (see below)

### 🔗 Get the form link
1. Activate the workflow (toggle it ON)
2. Go to the "Survey" (Form Trigger) node
3. Copy the Production URL
4. Paste it into the survey link field in the Set your data node

### 🧾 Prepare your Google Sheets
**Sheet 1: Freshdesk Tickets (status tracking)**, used by get existing tickets and update status:
1. Create a new empty Google Sheet.
2. Add the Spreadsheet ID + Sheet Name into the nodes.

**Sheet 2: Feedback freshdesk (survey responses)**, used by save survey to google sheet:
1. Create a new sheet or tab. It will auto-create columns based on your survey form field labels.
2. Add the Spreadsheet ID + Sheet Name/GID to the save node.

## 🔧 Customize the workflow
- **📝 Survey Questions**: Modify them in the Survey (Form Trigger) node. Adjust the save survey to google sheet node as needed (or use auto-map).
- **💬 Email Content**: Edit the subject and message in the Create the email text (Set) node.
- **🏷 Freshdesk Status ID**: If your "Resolved" status ID isn't 4, update the second condition in the If ticket resolved node (see the sketch below).
- **📉 Escalate poor feedback**: Add logic after the save survey to google sheet node. If the rating is low:
  - Notify Slack
  - Create a new internal ticket
  - Email a team lead
- **🔁 Schedule Trigger**: Adjust the Schedule Trigger node to your desired interval (e.g., hourly).
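For the resolution check, the logic in the If ticket resolved node boils down to the comparison sketched here as an equivalent Code node. Freshdesk's default numeric statuses are 2 (Open), 3 (Pending), 4 (Resolved), and 5 (Closed); `lastKnownStatus` is an assumed field name coming from the tracking-sheet lookup.

```javascript
const RESOLVED = 4; // change this if your "Resolved" status ID differs

const current = $json.status;           // from the Freshdesk "get tickets" node
const previous = $json.lastKnownStatus; // assumed field from the Google Sheet

// Send a survey only on the transition into Resolved, never twice.
const shouldSendSurvey = current === RESOLVED && previous !== RESOLVED;

return [{ json: { ...$json, shouldSendSurvey } }];
```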
## 🔄 Use a Webhook Instead (Optional)
If Freshdesk supports ticket webhook events, swap the schedule trigger for a Webhook Trigger node to send surveys instantly on ticket resolution.

## 🤖 Why Pollup AI is building this
At Pollup AI, we help CS and support teams stop drowning in tools and manual tasks. This template is part of our growing AI agent library: plug-and-play automations that connect your tools, clean your data, and free up your time, without writing a line of code.

Try this workflow and let Pollup AI handle the boring parts, so your team can focus on what customers are really saying.

Learn more at Pollup AI
by scrapeless official
# AI-Powered Web Data Pipeline with n8n

## How It Works
This n8n workflow builds an AI-powered web data pipeline that automates the entire process of:
- **Extraction**
- **Structuring**
- **Vectorization**
- **Storage**

It integrates multiple advanced tools to transform messy web pages into clean, searchable vector databases.

### Integrated Tools
- **Scrapeless**: Bypasses JavaScript-heavy websites and anti-bot protections to reliably extract HTML content.
- **Claude AI**: Uses LLMs to analyze unstructured HTML and generate clean, structured JSON data.
- **Ollama Embeddings**: Generates local vector embeddings from structured text using the all-minilm model.
- **Qdrant Vector DB**: Stores semantic vector data for fast and meaningful search capabilities.
- **Webhook Notifications**: Sends real-time updates when workflows complete or errors occur.

From messy webpages to structured vector data: this pipeline is perfect for building intelligent agents, knowledge bases, or research automation tools.

## Setup Steps

### 1. Install n8n
> Requires Node.js v18 / v20 / v22

```bash
npm install -g n8n
n8n
```

After installation, access the n8n interface at http://localhost:5678

### 2. Set Up Scrapeless
1. Register at Scrapeless
2. Copy your API token
3. Paste the token into the HTTP Request node labeled "Scrapeless Web Request"

### 3. Set Up Claude API (Anthropic)
1. Sign up at the Anthropic Console
2. Generate your Claude API key
3. Add the API key to the following nodes: Claude Extractor, AI Data Checker, Claude AI Agent

### 4. Install and Run Ollama

macOS:
```bash
brew install ollama
```

Linux:
```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Windows: download the installer from https://ollama.com

Start the Ollama server and pull the embedding model:
```bash
ollama serve
ollama pull all-minilm
```

### 5. Install Qdrant (via Docker)
```bash
docker pull qdrant/qdrant
docker run -d \
  --name qdrant-server \
  -p 6333:6333 -p 6334:6334 \
  -v $(pwd)/qdrant_storage:/qdrant/storage \
  qdrant/qdrant
```

Test if Qdrant is running:
```bash
curl http://localhost:6333/healthz
```

### 6. Configure the n8n Workflow
1. Modify the Trigger (manual or scheduled)
2. Input your Target URLs and Collection Name in the designated nodes
3. Paste all required API Tokens / Keys into their corresponding nodes
4. Ensure your Qdrant and Ollama services are running

## Ideal Use Cases
- Custom AI Chatbots
- Private Search Engines
- Research Tools
- Internal Knowledge Bases
- Content Monitoring Pipelines
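Before the first run, you may also want to create the Qdrant collection so its vector size matches the embedding model: all-minilm produces 384-dimensional vectors. A minimal sketch, where the collection name `web_data` is a placeholder for whatever you configure in the workflow:

```javascript
// Hypothetical pre-step: create a Qdrant collection sized for all-minilm.
const QDRANT_URL = 'http://localhost:6333';

const res = await fetch(`${QDRANT_URL}/collections/web_data`, {
  method: 'PUT',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    vectors: { size: 384, distance: 'Cosine' }, // must match the embedding size
  }),
});

console.log(res.status); // 200 on success
```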
by Airtop
# Automating LinkedIn Company Page Discovery

## Use Case
Finding the official LinkedIn page of a company is crucial for tasks like outreach, research, or enrichment. This automation streamlines the process by intelligently searching the company's website and LinkedIn to locate the correct profile.

## What This Automation Does
This automation identifies a company's LinkedIn page using the following input parameters:
- **Company domain**: The official website domain of the company (e.g., company.com).
- **Airtop Profile (connected to LinkedIn)**: The name of your Airtop Profile authenticated on LinkedIn.

## How It Works
1. Launches an Airtop session using the provided authenticated profile.
2. Attempts to extract a LinkedIn link directly from the company's website.
3. If not found, performs a LinkedIn search for the company.
4. If still unsuccessful, falls back to a Google search.
5. Validates the most likely result to confirm it's a LinkedIn company page (see the sketch below).
6. Outputs the verified LinkedIn URL.

## Setup Requirements
- Airtop API Key (free to generate).
- An Airtop Profile logged in to LinkedIn (requires a one-time login).

## Next Steps
- **Combine with People Enrichment**: Use this with LinkedIn profile discovery for full contact + company data workflows.
- **CRM Integration**: Automatically enrich company records with LinkedIn links.
- **Build Custom Outreach Workflows**: Connect company pages to SDR tooling for deeper research and engagement.

Read more about how to find a LinkedIn Company page
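The validation step essentially needs to distinguish a company page from personal profiles, posts, or school pages. A small sketch of that check as an n8n Code node, where the `candidateUrl` field name is an assumption for illustration:

```javascript
// Confirm a candidate URL really is a LinkedIn *company* page.
const candidateUrl = $json.candidateUrl; // assumed field from the search step

const isCompanyPage =
  /^https?:\/\/(www\.)?linkedin\.com\/company\/[A-Za-z0-9._-]+\/?$/.test(candidateUrl);

return [{ json: { candidateUrl, isCompanyPage } }];
```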
by InfraNodus
This template can be used to upload the files in your Google Drive to an InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via the GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Extracting documents from a Google Drive folder
- Text extraction
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: If you want to **sync** your Google Drive to an InfraNodus graph, check out our other workflow.

## How it works
Here's a description of this workflow step by step:
1. Find all the files in a specific Google Drive folder
2. For each file found, iterate through the workflow and identify the type of the file (TXT, PDF, Markdown)
3. For TXT and Markdown files, extract the text data directly
4. For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
5. Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the new graph, the text field containing the text data, the text settings, and doNotSave=false to create a new graph (see the sketch below)
6. Move on to the next file.

## How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Use that API key to set up authorization for the InfraNodus tool in the workflow.
4. If you want to upload the files to an existing graph, you should copy its name from InfraNodus. Otherwise, you can specify any name you want.

## Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node).

## Customizing this workflow
You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it sync with a Google Drive folder when new files appear in it.

Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
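For orientation, the call to the graphAndStatements endpoint could be sketched as below. The exact URL path is an assumption based on the endpoint name given above, so verify it against the docs at https://infranodus.com/api-access; the graph name and text field are the values described in step 5.

```javascript
// Hypothetical sketch of the InfraNodus HTTP call. Verify the exact URL in
// the API docs; INFRANODUS_API_KEY and field names are placeholders.
const INFRANODUS_API_KEY = 'YOUR_API_KEY';

const res = await fetch(
  'https://infranodus.com/api/v1/graphAndStatements?doNotSave=false',
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${INFRANODUS_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      name: 'my_drive_docs',     // target graph name (new or existing)
      text: $json.extractedText, // assumed field holding the file's text
    }),
  },
);

return [{ json: { status: res.status } }];
```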
by InfraNodus
This template can be used to sync the files in your Google Drive to a new or existing InfraNodus knowledge graph. The InfraNodus graph will then reveal the main topics and ideas in your collection of documents and show the content gaps in them. You can also use the built-in AI to converse with the documents.

You can also access the InfraNodus graphs via the GraphRAG API to re-use them in your other n8n workflows for high-quality content retrieval and knowledge base optimization.

The template showcases the use of multiple n8n nodes and processes:
- Syncing documents from a Google Drive folder / extracting them
- Text extraction from files
- Optional: high-quality PDF conversion using ConvertAPI
- InfraNodus knowledge graph generation

Note: If you want to **upload** files from your Google Drive to an InfraNodus graph, check out our other workflow.

## How it works
Here's a description of this workflow step by step:
1. Wait for new file(s) to appear in the Google Drive folder
2. Iterate through each file
3. Retrieve the new file from Google Drive
4. For each file, identify its type (TXT, PDF, Markdown); see the routing sketch below
5. For TXT and Markdown files, extract the text data directly
6. For PDF files, use a special PDF-to-text converter to extract the text data (optional: use ConvertAPI for better-quality PDF conversion)
7. Forward everything to the InfraNodus graphAndStatements API endpoint with the name of the graph, the text field containing the text data, the text settings, and doNotSave=false to create a new graph
8. Move on to the next file.

## How to use
You need an InfraNodus GraphRAG API account and key to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Use that API key to set up authorization for the InfraNodus tool in the workflow.
4. If you want to upload the files to an existing graph, you should copy its name from InfraNodus. Otherwise, you can specify any name you want.

## Requirements
- An InfraNodus account and API key
- A Google Drive account and authorization (you will need to set it up via Google Cloud using the n8n instructions provided in the Google Drive node).

## Customizing this workflow
You can use Dropbox instead of Google Drive. You can also modify this workflow slightly to make it upload the existing files from a Google Drive folder rather than waiting for new files to appear.

Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20267019838108-Upload-Sync-Your-Google-Drive-Folder-with-InfraNodus-using-n8n
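The file-type branching in step 4 can be sketched as a small Code node that routes each file by its extension. The field names are assumptions for illustration, and the actual template may use a Switch node instead:

```javascript
// Route each Google Drive file by extension.
const name = ($json.name || '').toLowerCase(); // assumed file-name field

let route;
if (name.endsWith('.pdf')) {
  route = 'pdf';  // send to the PDF-to-text converter (or ConvertAPI)
} else if (name.endsWith('.txt') || name.endsWith('.md')) {
  route = 'text'; // plain text extraction is enough
} else {
  route = 'skip'; // unsupported file type
}

return [{ json: { ...$json, route } }];
```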
by Dvir Sharon
# 🛒 Monitor Google Shopping Prices with Bright Data & Email Alerts

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that monitors product prices daily using Bright Data's Google Shopping dataset and sends smart email alerts when price conditions are met.

## 📋 Overview
This workflow provides an automated price-monitoring solution that tracks product prices from Google Shopping daily and sends intelligent email notifications. Perfect for e-commerce monitoring, competitor analysis, deal hunting, and inventory management.

## ✨ Key Features
- 🕘 **Scheduled Monitoring**: Daily automated price checks at 9 AM
- 🛍️ **Google Shopping Integration**: Uses Bright Data's dataset for accurate pricing
- 📊 **Smart Price Comparison**: Compares current prices with historical data
- 📧 **Intelligent Alerts**: Sends emails only when prices meet criteria
- 📈 **Data Storage**: Updates Google Sheets with the latest pricing data
- 🔄 **Batch Processing**: Handles multiple products with rate limiting
- ⚡ **Fast & Reliable**: Built-in error handling
- 🎯 **Customizable Filters**: Advanced price-comparison logic

## 🎯 What This Workflow Does
1. **Schedule Trigger**: Runs daily at 9 AM
2. **Data Retrieval**: Fetches the product list from Google Sheets
3. **Price Extraction**: Scrapes current prices using Bright Data
4. **Data Update**: Updates Google Sheets with new prices
5. **Price Comparison**: Compares new vs. old prices
6. **Smart Filtering**: Filters products that meet alert criteria
7. **Email Notifications**: Sends alerts for qualifying changes
8. **Rate Limiting**: Adds a delay between emails

### Output Data Points

| Field | Description | Example |
| :------------ | :------------------------- | :------------------------------- |
| Product URL | Original Google Shopping URL | https://shopping.google.com/product/... |
| Product Name | Product title | iPhone 15 Pro Max 256GB |
| Ratings | Product rating score | 4.5 |
| Reviews | Number of reviews | 1,247 |
| Old Price | Previous price | $1,199.00 |
| New Price | Current scraped price | $1,199.00 |
| Timestamp | When the check occurred | 2025-05-30T09:00:00Z |

## 🚀 Setup Instructions

### Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Shopping dataset access
- Gmail account for notifications

### Steps
1. Import the workflow JSON into n8n
2. Configure Bright Data credentials and dataset access
3. Set up Google Sheets with the required columns
4. Configure Gmail OAuth2 credentials
5. Update sheet IDs and schedule settings
6. Test with sample products and activate

## 📖 Usage Guide

### Google Sheet Structure
Your Google Sheet should have the following columns to ensure the workflow functions correctly:
- **Product URL** (Text): The direct URL to the Google Shopping product page. This is the primary identifier for the product.
- **Product Name** (Text): The name of the product. This will be automatically populated or updated by the workflow.
- **Old Price** (Number/Currency): The price of the product from the previous check. This column is crucial for price comparison.
- **New Price** (Number/Currency): The most recently scraped price of the product.
- **Ratings** (Number): The star rating of the product.
- **Reviews** (Number): The total number of reviews for the product.
- **Timestamp** (Datetime): The date and time when the price check was performed.

### Adding Products
Add Google Shopping URLs to your Google Sheet. The workflow will fetch product details and track prices. Historical price data builds over time.

### Understanding Price Alerts
The default setting for this workflow is to send an email alert when the new price equals the old price.
This might seem counterintuitive, but it's useful for specific scenarios, such as:
- **Monitoring stable pricing**: If you are tracking a product and want to be notified when its price has remained consistent over time, indicating a potential stable buying opportunity or a benchmark.
- **Verifying data consistency**: To confirm that the scraping process is working correctly and consistently retrieving the same price when no changes are expected.

You can easily customize the alert logic to trigger on different conditions, as described below (a concrete sketch follows at the end of this description).

### Customizing Alert Logic
- **Price drops**: new_price < old_price
- **Significant drops**: new_price < (old_price * 0.9) (e.g., price dropped by more than 10%)
- **Price increases**: new_price > old_price
- **Any change**: new_price != old_price

### Reading the Results
- Real-time pricing data
- Historical tracking
- Product metadata
- Timestamps for each check

## 🔧 Customization Options
- **Add More Data**: Descriptions, availability, seller info, shipping, images
- **Modify Email Templates**: Customize subject and body
- **Multiple Recipients**: Duplicate the email node and change recipients
- **Webhook Integration**: Add real-time triggers or Slack alerts

## 🚨 Troubleshooting
- **Bright Data connection failed**: Check API credentials and dataset access
- **No price data extracted**: Verify URLs and test with different products
- **Google Sheets permission denied**: Re-authenticate and check sharing
- **Emails not sending**: Re-auth Gmail OAuth and verify recipients
- **Filter not working**: Check price formats and logic
- **Workflow failed**: Check logs, retry logic, and network status

## 📊 Use Cases & Examples
- **E-commerce Monitoring**: Track competitor pricing and trends
- **Deal Hunting**: Get alerts for price drops on wishlist items
- **Inventory Management**: Monitor supplier pricing for procurement
- **Market Research**: Analyze pricing trends and generate reports

## ⚙️ Advanced Configuration
- **Batch Processing**: Increase batch size, add delays, use parallel processing
- **Price History**: Store historical data, calculate averages, forecast trends
- **Tool Integration**: CRM, Slack, databases, BI tools (Tableau, Power BI)

## 📈 Performance & Limits
- **Single URL**: 2–5 seconds
- **Concurrent Requests**: 3–5 (depends on Bright Data plan)
- **Data Accuracy**: 95%+
- **Success Rate**: 90%+
- **Daily Capacity**: 100–500 products
- **Memory**: ~100 MB per execution
- **API Calls**: 1 Bright Data + 2 Google Sheets calls per product

## 🤝 Support & Community
- **n8n Forum**: <https://community.n8n.io>
- **Documentation**: <https://docs.n8n.io>
- **Bright Data Support**: Via your Bright Data dashboard
- **GitHub Issues**: Report bugs and request features

## 🎯 Ready to Use!
Your workflow provides a solid foundation for automated price monitoring. Customize it to fit your specific needs and use cases for maximum effectiveness in tracking Google Shopping prices with intelligent email notifications.

Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
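As a concrete reference for the alert-logic section above, the comparison could look like the Code-node sketch below. It parses the sheet's currency-formatted columns (e.g., "$1,199.00") into numbers first; swap the `alert` expression for any of the variants listed above.

```javascript
// Sketch of the price-comparison filter over the sheet columns named above.
const toNumber = (v) => parseFloat(String(v).replace(/[^0-9.]/g, ''));

const oldPrice = toNumber($json['Old Price']);
const newPrice = toNumber($json['New Price']);

// Default template behaviour: alert when the price is unchanged.
// Price drop:        newPrice < oldPrice
// Drop of over 10%:  newPrice < oldPrice * 0.9
// Any change:        newPrice !== oldPrice
const alert = newPrice === oldPrice;

return [{ json: { ...$json, alert } }];
```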
by Yaron Been
## Workflow Overview
This sophisticated n8n automation is a powerful lead generation and outreach tool designed to transform YouTube channel research into actionable marketing opportunities. By intelligently connecting multiple services and APIs, this workflow:

1. **Discovers Targeted Channels**:
   - Scrapes YouTube channels based on specific keywords
   - Extracts comprehensive channel metadata
   - Identifies potential business opportunities
2. **Intelligent Lead Qualification**:
   - Filters channels with contact emails
   - Validates email authenticity
   - Ensures high-quality lead generation
3. **Personalized Outreach**:
   - Sends customized cold emails
   - Leverages channel-specific personalization
   - Automates the initial contact process

## Key Benefits
- 🕵️ **Automated Lead Discovery**: Find potential collaborators or clients
- 🧠 **Smart Filtering**: Eliminate invalid or irrelevant leads
- 📧 **Personalized Outreach**: Contextual, channel-specific communication
- ⏱️ **Time-Saving**: Eliminate manual research and email hunting

## Workflow Architecture

### 🔍 Stage 1: Channel Scraping
- **Apify Integration**: Scrapes YouTube channels
- **Keyword-Based Search**: Target specific niches
- **Metadata Extraction**: Collect channel details and emails

### 🧩 Stage 2: Lead Qualification
- **Email Existence Check**: Filter channels with contact info
- **ZeroBounce Verification**: Validate email authenticity (see the sketch at the end of this description)
- **Quality Control**: Ensure only valid leads proceed

### 📬 Stage 3: Personalized Outreach
- **Gmail Integration**: Send customized cold emails
- **Dynamic Personalization**: Use channel-specific details
- **Automated Communication**: Streamline initial contact

## Potential Use Cases
- **Marketing Agencies**: Find potential clients
- **Influencer Marketers**: Discover collaboration opportunities
- **Content Creators**: Network and expand professional connections
- **Sales Teams**: Generate targeted lead lists
- **Recruitment Specialists**: Identify industry professionals

## Setup Requirements
1. **Apify Account**
   - API token
   - YouTube Scraper Actor
   - Configured search keywords
2. **ZeroBounce Account**
   - Email verification API
   - Validation credits
3. **Gmail Account**
   - OAuth2 authentication
   - Configured sending profile
4. **n8n Installation**
   - Cloud or self-hosted instance
   - Import the workflow configuration
   - Configure API credentials

## Future Enhancement Suggestions
- 🤖 AI-powered email personalization
- 📊 Advanced lead-scoring mechanisms
- 🔄 Automated follow-up sequences
- 📈 Integration with CRM platforms
- 🌐 Multi-platform lead generation

## Ethical Considerations
- Respect email communication guidelines
- Comply with anti-spam regulations
- Provide clear opt-out mechanisms
- Maintain professional, value-driven outreach

## Connect With Me
Ready to supercharge your lead generation?
- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your outreach strategy with intelligent, automated workflows!
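For the verification stage, ZeroBounce exposes a validate endpoint that returns a status for each address. A minimal sketch, with the API key as a placeholder and an assumed `email` field on the incoming item; only leads whose status is "valid" should proceed to outreach:

```javascript
// Sketch of the ZeroBounce email-verification step.
const ZB_KEY = 'YOUR_ZEROBOUNCE_API_KEY'; // placeholder credential
const email = $json.email;                // assumed field from the scraper

const res = await fetch(
  `https://api.zerobounce.net/v2/validate?api_key=${ZB_KEY}&email=${encodeURIComponent(email)}`,
);
const data = await res.json();

return [{
  json: { ...$json, emailStatus: data.status, isValid: data.status === 'valid' },
}];
```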
by Naveen Choudhary
## Who is this for?
Marketing agencies, sales teams, lead generation specialists, and business development professionals who need to build comprehensive business databases with contact information for outreach campaigns across any industry.

## What problem is this workflow solving?
Finding businesses and their contact details manually is time-consuming and inefficient. This workflow automates the entire process of discovering businesses through Google Maps and extracting their digital contact information from websites, saving hours of manual research.

## What this workflow does
This automated workflow runs every 30 minutes to:
1. **Scrape business data** from Google Maps using Apify's Google Places crawler
2. **Save basic business information** (name, address, phone, website) to Google Sheets
3. **Filter businesses** that have websites
4. **Scrape each business's website content** using Firecrawl
5. **Extract contact information** including emails, LinkedIn, Facebook, Instagram, and Twitter profiles (see the sketch at the end of this description)
6. **Store all extracted data** in organized Google Sheets for easy access and follow-up

## Setup

**Required Services:**
- Google Sheets account with OAuth2 setup
- Apify account with API access for Google Places scraping
- Firecrawl account with API access for website scraping

**Pre-setup:**
1. Copy this Google Sheet
2. Configure your Apify and Firecrawl API credentials in n8n
3. Set up the Google Sheets OAuth2 connection
4. Update the Google Sheet ID in all Google Sheets nodes

**Quick Start:** The workflow includes detailed sticky notes explaining each phase. Simply configure your API credentials and Google Sheet, then activate the workflow.

## How to customize this workflow to your needs
- **Change search criteria**: Modify the Apify scraping parameters to target different business types (restaurants, gyms, salons, etc.) or locations
- **Adjust schedule**: Change the trigger interval from 30 minutes to your preferred frequency
- **Add more contact fields**: Extend the extraction code to find additional contact information like WhatsApp or Telegram
- **Filter criteria**: Modify the filter conditions to target businesses with specific characteristics
- **Batch size**: Adjust the batch processing to handle more or fewer websites simultaneously

Perfect for lead generation, competitor research, and building targeted marketing lists across any industry or business type.
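The contact-extraction step boils down to pattern matching over the scraped page text. A simplified sketch of that Code node is below; the `markdown` input field and the patterns are illustrative, and you can extend them for WhatsApp, Telegram, and so on:

```javascript
// Extract emails and social links from Firecrawl's page content.
const content = $json.markdown || ''; // assumed field with scraped page text

const emails = [
  ...new Set(content.match(/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g) || []),
];

// Return the first URL matching a given host pattern, or an empty string.
const firstMatch = (pattern) =>
  (content.match(new RegExp(`https?://(?:www\\.)?${pattern}[\\w\\-./?=]+`, 'g')) || [])[0] || '';

return [{
  json: {
    ...$json,
    emails: emails.join(', '),
    linkedin: firstMatch('linkedin\\.com/'),
    facebook: firstMatch('facebook\\.com/'),
    instagram: firstMatch('instagram\\.com/'),
    twitter: firstMatch('(?:twitter|x)\\.com/'),
  },
}];
```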