by Ferenc Erb
Use Case

Extend Bitrix24 tasks with custom widgets that display relevant task information and enable seamless interaction through a custom tab interface.

What This Workflow Does

- Processes incoming webhook requests from Bitrix24 task interfaces
- Handles authentication and secure token validation
- Manages application installation and placement registration
- Displays task data in a custom formatted view
- Stores and retrieves configuration settings persistently
- Provides user-friendly HTML interfaces for task information

Setup Instructions

1. Configure Bitrix24 webhook endpoints for the task widget
2. Set up authentication credentials in your Bitrix24 account
3. Install the application and register the task view tab placement (see the sketch after this list)
4. Customize the task data display format as needed
5. Deploy and test the application functionality within Bitrix24 tasks
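For illustration, here is a minimal TypeScript sketch of what the placement-registration step (step 3) can look like against Bitrix24's REST API. The portal domain, handler URL, and token handling are assumptions, not values from this template; consult the Bitrix24 placement documentation for your exact setup.

```typescript
// Hypothetical sketch: binding a custom tab to the task view after install.
// Assumes an OAuth access token obtained during app installation; the
// placement code and handler URL below are illustrative values.
async function registerTaskTab(portal: string, accessToken: string): Promise<void> {
  const response = await fetch(`https://${portal}/rest/placement.bind`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      PLACEMENT: "TASK_VIEW_TAB", // tab rendered inside the task card
      HANDLER: "https://your-n8n.example.com/webhook/task-widget", // your webhook endpoint (placeholder)
      TITLE: "Task Widget",
      auth: accessToken,
    }),
  });
  if (!response.ok) throw new Error(`placement.bind failed: ${response.status}`);
}
```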
by Tharwat Mohamed
AI Resume Screener (n8n Workflow Template)

An AI-powered resume screening system that automatically evaluates applicants from a simple web form and gives you clear, job-specific scoring - no manual filtering needed.

What the workflow does

- Accepts CV uploads via a web form (PDF)
- Extracts key info using AI: education, skills, job history, city, birthdate, phone (see the field sketch after this section)
- Dynamically matches the candidate to job role criteria stored in Google Sheets
- Generates an HR-style evaluation and a numeric score (1-10)
- Saves the result in a Google Sheet and uploads the original CV to Google Drive

Why you'll love it

| Feature | Benefit |
|---------|---------|
| AI scoring | Instantly ranks candidate fit without reading every CV |
| Google Sheet-driven | Easily update job profiles - no code changes |
| Fast setup | Connect your accounts and you're live in ~15 mins |
| Scalable | Works for any department, team, or organization |
| Developer-friendly | Extend with Slack alerts, translations, or automations |

Requirements

- OpenAI or Google Gemini API key
- Google Sheet with 2 columns: Role, Profile Wanted
- Google Drive account
- n8n account (self-hosted or cloud)

Setup in 5 Steps

1. Import the workflow into n8n
2. Connect Google Sheets, Drive, and OpenAI or Gemini
3. Add your job roles and descriptions in Google Sheets
4. Publish the form and test with a sample CV
5. Watch candidate profiles and scores populate automatically

Want help setting it up?

Includes free setup guidance by the creator, available by email or WhatsApp after purchase. I'm happy to assist you in customizing or deploying this workflow for your team.

Email: tharwat.elsayed2000@gmail.com
WhatsApp: +20106 180 3236
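As a rough illustration of the extraction step, here is a TypeScript shape for the structured candidate data the AI could be asked to return. All field names are assumptions for illustration; align them with the columns of your results sheet.

```typescript
// A minimal sketch of the structured output the AI extraction step might
// target. Field names are illustrative, not taken from the template itself.
interface CandidateProfile {
  fullName: string;
  education: string[];  // degrees and institutions
  skills: string[];
  jobHistory: string[]; // previous roles, most recent first
  city: string;
  birthdate: string;    // ISO date, e.g. "1995-04-12"
  phone: string;
  score: number;        // HR-style fit score, 1-10
  evaluation: string;   // short free-text justification
}

// Simple guard before writing a row to the results sheet.
function isScoreValid(profile: CandidateProfile): boolean {
  return Number.isFinite(profile.score) && profile.score >= 1 && profile.score <= 10;
}
```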
by Oneclick AI Squad
An intelligent food menu update notification system that automatically detects changes in your restaurant's special menu and sends personalized notifications to customers via multiple channels: WhatsApp, Email, and SMS. This workflow ensures your customers are always informed about new dishes, price changes, and menu availability in real time.

What's the Goal?

- Automatically monitor special menu updates from Google Sheets
- Detect menu changes and generate alert messages using AI
- Send multi-channel notifications (WhatsApp, Email, SMS) based on customer preferences
- Maintain comprehensive notification logs for tracking and analytics
- Provide seamless customer communication for menu updates
- Enable restaurant owners to keep customers engaged with the latest offerings

By the end, you'll have a fully automated menu notification system that keeps your customers informed and engaged with your latest culinary offerings.

Why Does It Matter?

Manual menu update communication is time-consuming and often missed by customers. Here's why this workflow is essential for restaurants:

- **Real-Time Updates**: Customers receive instant notifications about menu changes
- **Multi-Channel Reach**: WhatsApp, Email, and SMS ensure maximum customer reach
- **Personalized Experience**: Customers receive notifications via their preferred channels
- **Increased Sales**: Immediate awareness of new items drives orders
- **Customer Retention**: Regular updates keep customers engaged and coming back
- **Operational Efficiency**: Eliminates manual notification tasks for staff
- **Data-Driven Insights**: Comprehensive logging for marketing analytics

Think of it as your restaurant's digital menu announcer that never misses an update.

How It Works

Step 1: Menu Monitoring
- **Node**: Daily Menu Update Scheduler
- **Function**: Triggers the workflow on a scheduled basis
- **Frequency**: Configurable (hourly, daily, or real-time)

Step 2: Data Retrieval
- **Node**: Fetch Special Menu Data
- **Function**: Pulls current menu data from Google Sheets (Sheet 1)
- **Data**: Retrieves item details, prices, descriptions, and availability

Step 3: Change Detection
- **Node**: Detect Menu Changes
- **Function**: Compares current data with the previous state
- **Logic**: Identifies new items, price changes, or availability updates (a change-detection sketch follows this section)

Step 4: AI Content Generation
- **Node**: Generate Menu Alert Message
- **Function**: Creates engaging notification content using AI
- **Output**: Formatted message with new items, descriptions, and prices

Step 5: Customer Data Processing
- **Node**: Fetch Customer Contact List
- **Function**: Retrieves customer preferences from Google Sheets (Sheet 2)
- **Filter**: Segments customers by notification preferences

Step 6: Multi-Channel Delivery

The workflow splits into three parallel notification channels:

WhatsApp Branch
- **Node**: Filter WhatsApp Users - identifies customers with WhatsApp notifications enabled
- **Node**: Send WhatsApp Notification - delivers menu updates via WhatsApp
- **Node**: Log WhatsApp Status - records delivery status in Sheet 3

Email Branch
- **Node**: Filter Email Users - identifies customers with email notifications enabled
- **Node**: Send Menu Email - delivers formatted email notifications
- **Node**: Log Email Status - records delivery status in Sheet 3

SMS Branch
- **Node**: Filter SMS Users - identifies customers with SMS notifications enabled
- **Node**: Send Twilio SMS Alert - delivers text message notifications via Twilio
- **Node**: Log SMS Status - records delivery status in Sheet 3
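A minimal sketch of the change-detection logic in Step 3, assuming each menu row carries the Item ID, price, and availability flag described in the Sheet 1 structure below. The record shape and function name are illustrative, not taken from the workflow's actual code.

```typescript
// Minimal diff between the previous and current menu snapshots.
// Field names mirror the Sheet 1 columns but are assumptions here.
interface MenuItem {
  itemId: string;
  itemName: string;
  price: string;
  available: "Yes" | "No";
}

function detectMenuChanges(previous: MenuItem[], current: MenuItem[]) {
  const prevById = new Map(previous.map((item) => [item.itemId, item]));
  const newItems: MenuItem[] = [];
  const priceChanges: { item: MenuItem; oldPrice: string }[] = [];
  const availabilityChanges: MenuItem[] = [];

  for (const item of current) {
    const before = prevById.get(item.itemId);
    if (!before) {
      newItems.push(item); // item not seen in the last run
    } else if (before.price !== item.price) {
      priceChanges.push({ item, oldPrice: before.price });
    } else if (before.available !== item.available) {
      availabilityChanges.push(item); // sold out or back in stock
    }
  }
  return { newItems, priceChanges, availabilityChanges };
}
```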
Step 7: Comprehensive Logging

All notification activities are logged in Sheet 3 for tracking and analytics.

Google Sheets Structure

Sheet 1: Special Menu

| Column | Description | Example |
|--------|-------------|---------|
| Item ID | Unique identifier for menu item | "ITEM001" |
| Item Name | Name of the dish | "Truffle Risotto" |
| Price | Item price | "$28.99" |
| Description | Detailed item description | "Creamy arborio rice with black truffle, parmesan, and wild mushrooms" |
| Nutritions | Nutritional information | "Calories: 450, Protein: 15g" |
| Category | Menu category | "Main Course" |
| Available | Availability status | "Yes" / "No" |

Sheet 2: Customer Database

| Column | Description | Example |
|--------|-------------|---------|
| Customer Name | Customer's full name | "ABC" |
| Email | Customer's email address | "abc@gmail.com" |
| Phone Number | Customer's phone number | "91999999999" |
| WhatsApp Number | Customer's WhatsApp number | "91999999999" |
| Email Notifications | Email preference | "Yes" / "No" |
| SMS Notifications | SMS preference | "Yes" / "No" |
| WhatsApp Notifications | WhatsApp preference | "Yes" / "No" |

Sheet 3: Notification Logs

| Column | Description | Example |
|--------|-------------|---------|
| Timestamp | Notification send time | "2025-07-09T12:51:09.587Z" |
| Customer Name | Recipient name | "ABC" |
| Notification Type | Channel used | "Email" / "SMS" / "WhatsApp" |
| Status | Delivery status | "Sent" / "Failed" / "Pending" |
| Message | Content sent | "SPECIAL MENU UPDATE..." |

How to Use the Workflow

Prerequisites

- Google Sheets Setup: Create three sheets with the required structure
- n8n Account: Access to the n8n workflow platform
- WhatsApp Business API: WhatsApp Business account with API access
- Email Service: Gmail or SMTP service for email notifications
- Twilio Account: Twilio account for SMS functionality
- AI Model Access: OpenAI or a similar AI service for content generation

Importing the Workflow in n8n

Step 1: Obtain the Workflow JSON
- Export the workflow from your n8n instance or obtain the JSON file
- Ensure you have the complete workflow configuration

Step 2: Access the n8n Workflow Editor
- Log in to your n8n instance (Cloud or self-hosted)
- Navigate to the Workflows section
- Click "Add Workflow" to create a new workflow

Step 3: Import the Workflow

Option A: Import from Clipboard
- Click the three dots (...) in the top-right corner
- Select "Import from Clipboard"
- Paste the JSON code into the text box
- Click "Import" to load the workflow

Option B: Import from File
- Click the three dots (...) in the top-right corner
- Select "Import from File"
- Choose the .json file from your computer
- Click "Open" to import the workflow

Configuration Setup

Google Sheets Integration
- Authentication: Connect your Google account in n8n
- Sheet 1 Configuration: Set the spreadsheet ID and range for menu data
- Sheet 2 Configuration: Set the spreadsheet ID and range for customer data
- Sheet 3 Configuration: Set the spreadsheet ID and range for notification logs

WhatsApp Integration
- WhatsApp Business API: Set up WhatsApp Business API credentials
- Webhook Configuration: Configure webhook URLs for message delivery
- Message Templates: Create approved message templates for menu updates

Email Integration
- Gmail/SMTP Setup: Configure email service credentials
- Email Templates: Design HTML email templates for menu notifications
- Sender Configuration: Set the sender name and email address

Twilio SMS Integration
- Twilio Account: Set up the Twilio Account SID and Auth Token
- Phone Number: Configure a Twilio phone number for SMS sending
- Message Templates: Create SMS message templates (see the sketch below)
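For reference, this is roughly the REST call the SMS branch performs behind the scenes. In n8n the Twilio node handles this for you; the sketch below goes directly against Twilio's Messages endpoint, with credentials and numbers as placeholders.

```typescript
// A minimal sketch of the SMS delivery behind "Send Twilio SMS Alert",
// using Twilio's Messages REST endpoint directly (Node 18+ global fetch).
async function sendSmsAlert(to: string, body: string): Promise<void> {
  const accountSid = process.env.TWILIO_ACCOUNT_SID!;
  const authToken = process.env.TWILIO_AUTH_TOKEN!;
  const from = process.env.TWILIO_FROM_NUMBER!; // your Twilio phone number

  const response = await fetch(
    `https://api.twilio.com/2010-04-01/Accounts/${accountSid}/Messages.json`,
    {
      method: "POST",
      headers: {
        // Twilio uses HTTP Basic auth with AccountSid:AuthToken.
        Authorization:
          "Basic " + Buffer.from(`${accountSid}:${authToken}`).toString("base64"),
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({ To: to, From: from, Body: body }),
    },
  );
  if (!response.ok) throw new Error(`Twilio error: ${response.status}`);
}
```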
AI Content Generation
- API Configuration: Set up OpenAI or preferred AI service credentials
- Prompt Customization: Configure prompts for menu update content
- Content Parameters: Set message tone, length, and style

Workflow Execution

Automatic Execution
- Scheduled Triggers: Set up cron expressions for regular checks
- Webhook Triggers: Configure real-time triggers for immediate updates
- Manual Triggers: Enable manual execution for testing

Monitoring and Maintenance
- Execution Logs: Monitor workflow execution through the n8n interface
- Error Handling: Set up error notifications and retry mechanisms
- Performance Monitoring: Track execution times and success rates

Sample Notification Message

SPECIAL MENU UPDATE

NEW ITEMS:
- Truffle Risotto - $28.99
  Creamy arborio rice with black truffle, parmesan, and wild mushrooms
- Chocolate Lava Cake - $18.99
  Warm chocolate cake with molten center, vanilla ice cream

Total Menu Items: 2
Updated: 7/9/2025, 12:10:50 PM

Visit our restaurant or call to place your order!

Best Practices

Data Management
- Regularly validate customer contact information
- Keep menu data updated and accurate
- Maintain clean customer preference settings

Notification Strategy
- Send notifications during optimal hours (lunch/dinner time)
- Limit frequency to avoid customer fatigue
- Personalize messages based on customer preferences

Content Quality
- Use engaging language and emojis appropriately
- Include clear pricing and descriptions
- Add a call to action for immediate orders

Performance Optimization
- Batch process notifications to avoid rate limits
- Implement retry logic for failed deliveries
- Monitor API quotas and usage limits

Troubleshooting

Common Issues
- Authentication Errors: Verify API credentials and permissions
- Rate Limiting: Implement delays between notifications
- Message Delivery: Check phone number formats and email addresses
- Sheet Access: Ensure proper sharing permissions

Error Handling
- Set up notification alerts for workflow failures
- Implement fallback mechanisms for service outages
- Maintain backup notification methods

Analytics and Reporting

Key Metrics
- Delivery Rates: Track successful notifications by channel
- Customer Engagement: Monitor response rates and feedback
- Menu Performance: Analyze which items generate the most interest
- Channel Effectiveness: Compare performance across WhatsApp, Email, and SMS

Reporting Features
- Automated daily/weekly reports
- Customer preference analytics
- Notification performance dashboards
- Revenue correlation with menu updates

Security and Compliance

Data Protection
- Secure storage of customer contact information
- Compliance with GDPR and local privacy laws
- Regular security audits of API access

Rate Limiting
- Respect platform rate limits (WhatsApp, Twilio, Email)
- Implement queuing systems for high-volume notifications
- Monitor and adjust sending frequencies

Conclusion

The Food Menu Update Notifier transforms restaurant communication from reactive to proactive, ensuring customers are always informed about your latest offerings. By leveraging multiple communication channels and AI-generated content, this workflow creates a seamless bridge between your kitchen innovations and customer awareness.

This system not only improves customer engagement but also drives immediate sales through timely notifications about new menu items, special offers, and seasonal dishes. The comprehensive logging and analytics capabilities provide valuable insights for menu optimization and marketing strategy refinement.
by Artem Boiko
Revit to HTML Quantity Takeoff Generator

Automates extraction of wall quantities from Revit models and creates a professional, interactive HTML report.

Key Features

- Automated wall quantity analysis
- Calculates volumes by wall type ("Type Name")
- Generates an interactive HTML QTO report
- Includes summary statistics: total elements, total and average volumes
- Provides a detailed breakdown by element type (a grouping sketch follows below)

How it works

1. Upload a Revit file as input
2. The workflow extracts wall quantities and types
3. It creates and saves a ready-to-share HTML dashboard with QTO data

- No API keys required
- Runs offline
- Output is a professional, ready-to-use HTML report
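A minimal sketch of the per-type aggregation behind the QTO summary, assuming the extracted wall records carry a type name and a volume. The record shape and units are assumptions for illustration, not the template's actual data model.

```typescript
// Aggregate wall volumes by Revit "Type Name" and compute summary stats.
interface WallRecord {
  typeName: string; // Revit "Type Name" parameter
  volume: number;   // assumed cubic meters
}

function summarizeByType(walls: WallRecord[]) {
  const totals = new Map<string, { count: number; volume: number }>();
  for (const wall of walls) {
    const entry = totals.get(wall.typeName) ?? { count: 0, volume: 0 };
    entry.count += 1;
    entry.volume += wall.volume;
    totals.set(wall.typeName, entry);
  }
  const totalVolume = walls.reduce((sum, w) => sum + w.volume, 0);
  return {
    totalElements: walls.length,
    totalVolume,
    averageVolume: walls.length ? totalVolume / walls.length : 0,
    byType: totals, // rendered as the per-type breakdown table in the report
  };
}
```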
by Intuz
This n8n template delivers a complete AI-powered solution for automated LinkedIn posts, including unique content, custom images, and optimized hashtags.

Use cases are many: generate and schedule tailored LinkedIn content for different scenarios. By feeding the AI specific prompts, you can tailor posts to particular topics and visuals, maintaining a consistent and engaging online presence.

How it works

Maintaining a consistent and engaging presence on LinkedIn can be time-consuming, requiring constant ideation, content creation, and manual posting. This workflow takes that burden off your shoulders, delivering a fully automated solution for generating and publishing high-quality LinkedIn content.

- Scheduled Content Engine: Each day (or on your chosen schedule), the workflow kicks into gear, ensuring a fresh stream of content.
- Smart Topic & Content Generation: Using Google Gemini, it crafts unique content topics and expands them into full, engaging posts, ensuring your message is always fresh and relevant.
- Dynamic Image Creation: To make your posts stand out, the workflow leverages an AI image generator (like DALL-E) to produce a custom, eye-catching visual that complements your generated text.
- SEO-Optimized Hashtag Generation: Google Gemini then analyzes the newly created post and automatically generates relevant, trending, SEO-friendly hashtags, boosting your content's reach and discoverability.
- Seamless LinkedIn Publishing: Finally, the text, image, and hashtags are merged and automatically published to your LinkedIn profile (a merge sketch follows the Requirements list below), establishing your presence with minimal effort.

How to Use: Quick Start Guide

This guide will get your AI LinkedIn Content Automation workflow up and running in n8n.

1. Import Workflow Template: Download the template's JSON file and import it into your n8n instance via "File" > "Import from JSON."
2. Configure Credentials:
   - Google Gemini: Set up and apply your API key credentials to all "Google Gemini Chat Model" nodes.
   - AI Image Generation (e.g., OpenAI): Create and apply API key credentials for your chosen image generation service to the "Generate an Image" node.
   - LinkedIn: Set up and apply OAuth credentials to the "Create a post" node for your LinkedIn account.
3. Customize Schedule & AI Prompts:
   - Schedule Trigger: Double-click "Schedule Trigger 1" to set how often your workflow runs (e.g., daily, weekly).
   - AI Prompts: Review and edit the prompts within the "Content Topic Generator," "Content Creator," and "Hashtag Generator / SEO" nodes to guide the AI toward your desired content style and topics.
4. Test & Activate:
   - Test Run: Click "Execute Workflow" to perform a test run and verify all steps work as expected.
   - Activate: Once satisfied, toggle the workflow's "Active" switch to enable automated posting on your defined schedule.

Requirements

To use this workflow template, you will need:

- n8n Instance: A running n8n instance (cloud or self-hosted) to import and execute the workflow.
- Google Gemini Account: For content topic generation, content creation, and hashtag generation (requires a Google Gemini API key from Google AI Studio).
- AI Image Generation Service Account: For creating images (e.g., an OpenAI DALL-E API key or a similar service the "Generate an Image" node uses).
- LinkedIn Account: For publishing the generated posts (requires LinkedIn OAuth credentials for the n8n connection).
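A minimal sketch of the final merge step, assuming the upstream nodes produce a post body and a hashtag list. The function and field names are illustrative, not taken from the template's node code.

```typescript
// Merge the generated post text with normalized hashtags before publishing.
function buildLinkedInPost(postBody: string, hashtags: string[]): string {
  // Normalize tags: strip whitespace, ensure a single leading '#', dedupe.
  const tags = [...new Set(
    hashtags.map((t) => "#" + t.trim().replace(/^#+/, "").replace(/\s+/g, "")),
  )];
  return `${postBody.trim()}\n\n${tags.join(" ")}`;
}

// Example: buildLinkedInPost("Why automation matters...", ["n8n", "#AI"])
// => "Why automation matters...\n\n#n8n #AI"
```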
Connect with us

Website: https://www.intuz.com/cloud/stack/n8n
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
by Bright Data
Yelp Business Finder: Scraping Local Businesses by Keyword, Category & Location Using Bright Data and Google Sheets

Description: Automate local business data collection from Yelp using AI-powered input validation, Bright Data scraping, and automatic Google Sheets integration. Perfect for market research, lead generation, and competitive analysis.

How It Works

1. Form Submission: Users submit a simple form with country, location, and business category parameters.
2. AI Validation: Google Gemini AI validates and cleans input data, ensuring proper formatting and Yelp category alignment.
3. Data Scraping: Bright Data's Yelp dataset API scrapes business information based on the cleaned parameters (see the sketch after this section).
4. Status Monitoring: The workflow monitors scraping progress and waits for data completion.
5. Data Export: Final business data is automatically appended to your Google Sheets for easy analysis.

Setup Steps

Estimated Setup Time: 10-15 minutes

Prerequisites

- Active n8n instance (cloud or self-hosted)
- Google account with Sheets access
- Bright Data account with the Yelp scraping dataset
- Google Gemini API access

Configuration Steps

1. Import Workflow:
   - Copy the provided JSON workflow
   - In n8n: go to Workflows > + Add workflow > Import from JSON
   - Paste the JSON and click Import
2. Configure Google Sheets:
   - Create a new Google Sheet or use an existing one
   - Set up OAuth2 credentials in n8n
   - Update the Google Sheets node with your document ID
   - Configure column mappings for business data
3. Set up Bright Data:
   - Add your Bright Data API credentials to n8n
   - Replace BRIGHT_DATA_API_KEY with your actual API key
   - Verify your Yelp dataset ID in the HTTP request nodes
   - Test the connection
4. Configure Google Gemini:
   - Add your Google Gemini API credentials
   - Test the AI Agent connection
   - Verify the model configuration
5. Test & Activate:
   - Activate the workflow using the toggle switch
   - Test with sample data: country="US", location="New York", category="restaurants"
   - Verify data appears correctly in your Google Sheet

Data Output

| Field | Description |
|-------|-------------|
| Business Name | Official business name from Yelp |
| Overall Rating | Average customer rating (1-5 stars) |
| Reviews Count | Total number of customer reviews |
| Categories | Business categories and tags |
| Website URL | Official business website |
| Phone Number | Contact phone number |
| Address | Full business address |
| Yelp URL | Direct link to the Yelp listing |

Use Cases

- Market Research: Analyze local business landscapes and competition
- Lead Generation: Build prospect lists for B2B outreach
- Location Analysis: Research business density by area and category
- Competitive Intelligence: Monitor competitor ratings and customer feedback

Important Notes

- Ensure you comply with Yelp's terms of service and rate limits
- Bright Data usage may incur costs based on your plan
- AI validation helps improve data quality and reduce errors
- Monitor your Google Sheets for data accuracy

Troubleshooting

Common Issues:

- API Rate Limits: Implement delays between requests if needed
- Invalid Categories: The AI agent helps standardize category names
- Empty Results: Verify location spelling and category alignment
- Authentication Errors: Check all API credentials and permissions

Ready to start scraping Yelp business data efficiently!
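A rough sketch of the scraping trigger call in step 3, assuming Bright Data's dataset trigger endpoint. The URL shape, dataset ID, and input fields are assumptions; mirror what the template's HTTP Request node actually sends.

```typescript
// Trigger a Bright Data dataset scrape for the cleaned form parameters.
// Endpoint shape and input fields are illustrative assumptions.
async function triggerYelpScrape(apiKey: string, datasetId: string) {
  const response = await fetch(
    `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${datasetId}`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      // One input row per search; fields follow the form parameters.
      body: JSON.stringify([
        { country: "US", location: "New York", category: "restaurants" },
      ]),
    },
  );
  if (!response.ok) throw new Error(`Trigger failed: ${response.status}`);
  return response.json(); // typically returns a snapshot ID to poll
}
```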
by Dvir Sharon
LinkedIn Job Finder Automation using Bright Data API & Google Sheets

A comprehensive n8n automation that searches LinkedIn job postings using Bright Data's API and automatically organizes results in Google Sheets for efficient job hunting and recruitment workflows.

Overview

This workflow provides an automated LinkedIn job search solution that collects job postings based on your search criteria and organizes them in Google Sheets. Perfect for job seekers, recruiters, HR professionals, and talent acquisition teams.

Key Features

- Smart Job Search: Form-based input for city, job title, country, and job type
- LinkedIn Integration: Uses Bright Data's LinkedIn dataset for accurate job posting data
- Automated Organization: Populates Google Sheets with structured job data
- Real-time Processing: Processes job search requests in real time
- Data Storage: Stores job details including company info, locations, and apply links
- Batch Processing: Handles multiple job postings efficiently
- Fast & Reliable: Built-in error handling for scraping
- Customizable Filters: Advanced job filtering based on criteria

What This Workflow Does

Input
- Job Search Criteria: City, job title, country, and optional job type
- Search Parameters: Configurable filters and limits
- Output Preferences: Google Sheets destination

Processing Steps
1. Form Submission
2. Data Request to Bright Data API
3. Status Monitoring
4. Data Extraction
5. Data Filtering
6. Sheet Update
7. Error Handling

Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title from posting | Senior Software Engineer |
| Company Name | Employer company name | Tech Solutions Inc. |
| Job Detail | Job summary/description | Remote position requiring 5+ years... |
| Location | Job location | San Francisco, CA |
| Company URL | Company profile link | View Profile |
| Apply Link | Direct application link | Apply Now |

Setup Instructions

Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with LinkedIn dataset access

Steps
1. Import the Workflow: Use JSON import in n8n
2. Configure Bright Data: Add API credentials and the dataset ID
3. Configure Google Sheets: Create a sheet, set credentials, map columns
4. Update Workflow Settings: Replace placeholders with your actual data
5. Test & Activate: Submit a test form and verify data in Google Sheets

Usage Guide

Submitting Job Searches

Go to your webhook URL and fill in the form with:
- City: e.g., New York
- Job Title: e.g., Software Engineer
- Country: e.g., US
- Job Type: Optional (Full-Time, Remote, etc.)

Understanding Results
- Comprehensive job data
- Company info and profile links
- Direct application links
- Location and job descriptions

Customizing Search Parameters

Edit the Create Snapshot ID node to change:
- Time range (e.g., "Past month")
- Result limits
- Company filters

Customization Options

- More Data Points: Add salary, seniority, applicants, etc.
- Custom Form Fields: Add filters for salary, experience, industry
- Multiple Sheets: Route results by job type or location

Troubleshooting

- Bright Data connection failed: Check API credentials and dataset access
- No job data extracted: Verify search parameters and API limits
- Google Sheets permission denied: Re-authenticate and check sharing
- Form not working: Check the webhook URL and field mappings
- Filter issues: Review logic and data types
- Execution failed: Check logs, retry logic, and network status

Use Cases & Examples

- Job Seeker Dashboard: Automate job search and track applications
- Recruitment Pipeline: Source candidates and monitor hiring trends
- Market Research: Analyze job trends and salary benchmarks
- HR Analytics: Support workforce planning and competitive insights

Advanced Configuration

- Batch Processing: Queue multiple searches with delays
- Search History: Track and analyze past searches
- Tool Integration: Connect to CRM, Slack, databases, BI tools

Performance & Limits

- Processing Time: 30-60 seconds per search
- Concurrent Requests: 2-3 (depends on your Bright Data plan)
- Data Accuracy: 95%+
- Success Rate: 90%+
- Daily Capacity: 50-200 searches
- Memory: ~50 MB per execution
- API Calls: 3-4 Bright Data + 1 Google Sheets per search

Support & Community

- n8n Community: community.n8n.io
- Documentation: docs.n8n.io
- Bright Data Support: Via your Bright Data dashboard
- GitHub Issues: Report bugs and request features

Ready to Use!

Your workflow is ready for automated LinkedIn job searching. Customize it to your recruiting or job search needs.

Webhook URL: https://your-n8n-instance.com/webhook/linkedin-job-finder

What Gets Extracted:
- Job Title
- Company Information
- Location Data
- Job Details
- Application Links
- Processing Timestamps

Use Cases:
- Job Search Automation
- Recruitment Intelligence
- Market Research
- HR Analytics
by berke
Who's it for

This workflow is perfect for sales teams, customer service departments, and businesses that frequently handle spare parts inquiries via email. It's especially valuable for companies managing multiple products with complex pricing structures who want to automate their quotation process while maintaining professional, multilingual communication.

What it does

This workflow:
- Monitors your Gmail inbox for incoming spare parts requests
- Automatically generates professional HTML price quotes in the sender's language
- Sends personalized replies
- Uses AI to detect the email language (supports Turkish, English, German, and more)
- Extracts project or part codes
- Fetches pricing data from Google Sheets
- Calculates totals accurately
- Formats everything into a clean, professional quote that matches your brand

How it works

1. A Schedule Trigger runs at a regular interval (every few minutes) to check for new emails
2. The Gmail node fetches the latest unread email
3. Keyword detection filters for spare parts-related terms in multiple languages (see the sketch after this section)
4. The AI Agent processes the request by:
   - Detecting the email's language
   - Extracting project/part codes
   - Querying three Google Sheets: CRM, Bill of Materials, Pricing
   - Calculating line totals and the grand total
   - Generating a professional HTML quote in the sender's language
5. A Gmail reply sends the quote and marks the original email as read

Requirements

- n8n self-hosted or cloud instance
- Gmail account with OAuth2 authentication
- Google Sheets with the proper structure (3 sheets for CRM, BoM, and Pricing data)
- Google Gemini API key for AI processing
- Basic understanding of Google Cloud Console for OAuth setup

How to set up

1. Import the workflow into your n8n instance
2. Create three Google Sheets with the following column structure:
   - CRM Sheet: Email, ProjectCode, CustomerName
   - Bill of Materials: ProjectCode, PartCode, PartDescription, Quantity
   - Pricing Sheet: PartCode, UnitPriceEUR, PartDescription
3. Configure credentials:
   - Set up Gmail OAuth2 in Google Cloud Console
   - Configure Google Sheets OAuth2 (can use the same project)
   - Get your Google Gemini API key from Google AI Studio
4. Update the workflow:
   - Replace placeholder Sheet IDs in the CRM, BoM, and Pricing nodes
   - Adjust the company name in the AI Agent's system message
   - Modify keyword detection if needed
5. Test with a sample email before activating

How to customize the workflow

- Add more languages: Update the keyword detection node with additional terms
- Modify the quote template: Edit the HTML in the AI Agent's message to match your branding
- Change data sources: Replace Google Sheets with PostgreSQL or MySQL nodes
- Add approval steps: Insert a manual approval node for quotes above a certain value
- Include attachments: Add PDF or product spec file nodes
- Enhance notifications: Add Slack or Teams notifications after a quote is sent
- Implement follow-ups: Create a separate workflow for reminder emails

This template provides a solid foundation for automating your quotation process while staying flexible to fit your specific business needs. Feel free to contact me for further implementation guidelines: LinkedIn: Berke
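A minimal sketch of the multilingual keyword filter in step 3, assuming the email subject and body are available as strings. The term list is illustrative; extend it per the "Add more languages" customization above.

```typescript
// Keyword filter for spare parts-related emails across supported languages.
// The term list is an illustrative starting point, not the template's list.
const SPARE_PART_TERMS = [
  "spare part", "quotation", "quote", // English
  "yedek parça", "fiyat teklifi",     // Turkish
  "ersatzteil", "angebot",            // German
];

function isSparePartsRequest(subject: string, body: string): boolean {
  const text = `${subject}\n${body}`.toLowerCase();
  return SPARE_PART_TERMS.some((term) => text.includes(term));
}
```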
by Ranjan Dailata
Who this is for

The Async Structured Bulk Data Extract with Bright Data Web Scraper workflow is designed for data engineers, market researchers, competitive intelligence teams, and automation developers who need to programmatically collect and structure high-volume data from the web using Bright Data's dataset and snapshot capabilities.

This workflow is built for:
- Data Engineers: Building large-scale ETL pipelines from web sources
- Market Researchers: Collecting bulk data for analysis across competitors or products
- Growth Hackers & Analysts: Mining structured datasets for insights
- Automation Developers: Needing reliable snapshot-triggered scrapers
- Product Managers: Overseeing data-backed decision-making using live web information

What problem is this workflow solving?

Web scraping at scale often requires asynchronous operations, including waiting for data preparation and snapshots to complete. Manual handling of this process can lead to timeouts, errors, or inconsistencies in results. This workflow automates the entire process of submitting a scraping request, waiting for the snapshot, retrieving the data, and notifying downstream systems, all in a structured, repeatable fashion.

It solves:
- Asynchronous snapshot completion handling
- Reliable retrieval of large datasets using Bright Data
- Automated delivery of scraped results via webhook
- Disk persistence for traceability or historical analysis

What this workflow does

1. Set Bright Data Dataset ID & Request URL: Takes in the Dataset ID and the Bright Data API endpoint used to trigger the scrape job
2. HTTP Request: Sends an authenticated request to the Bright Data API to start a scraping snapshot job
3. Wait Until Snapshot is Ready: Implements a loop or wait mechanism that checks snapshot status (e.g., polling every 30 seconds) until it reaches the ready state (see the sketch after this section)
4. Download Snapshot: Downloads the structured dataset snapshot once ready
5. Persist Response to Disk: Saves the dataset to disk for archival, review, or local processing
6. Webhook Notification: Sends the final result, or a summary of it, to an external webhook

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by the Web Unlocker token.
4. Update the Set Dataset Id and Request URL nodes for setting the brand content URL.
5. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs

- Polling Strategy: Adjust the polling interval (e.g., every 15-60 seconds) based on snapshot complexity
- Input Flexibility: Accept the dataset ID and request URL dynamically from a webhook trigger or input form
- Webhook Output: Send notifications to internal APIs (for use in dashboards) or Zapier/Make (for multi-step automation)
- Persistence: Save output to remote FTP or SFTP storage, Amazon S3, Google Cloud Storage, etc.
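A rough sketch of the polling-and-download loop in steps 3 and 4, assuming Bright Data's snapshot progress and download endpoints. The URL shapes and the "ready"/"failed" status values are assumptions; match them to the actual HTTP Request nodes in the workflow.

```typescript
// Poll the snapshot until ready, then download the structured result.
// Endpoint shapes and status strings are illustrative assumptions.
async function waitForSnapshot(apiKey: string, snapshotId: string, intervalMs = 30_000) {
  const headers = { Authorization: `Bearer ${apiKey}` };

  // Poll the progress endpoint until the snapshot reports ready.
  while (true) {
    const progress = await fetch(
      `https://api.brightdata.com/datasets/v3/progress/${snapshotId}`,
      { headers },
    ).then((r) => r.json());
    if (progress.status === "ready") break;
    if (progress.status === "failed") throw new Error("Snapshot failed");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }

  // Download the finished snapshot as structured JSON.
  return fetch(
    `https://api.brightdata.com/datasets/v3/snapshot/${snapshotId}?format=json`,
    { headers },
  ).then((r) => r.json());
}
```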
by Stefan
Track n8n Node Definitions from GitHub and Export to Google Sheets

Overview

This workflow automatically retrieves and processes metadata from the official n8n GitHub repository, filters all available .node.json files, parses their structure, and appends structured information to a Google Sheet. Perfect for developers, community managers, and technical writers who need to maintain up-to-date information about n8n's evolving node ecosystem.

Setup Instructions

Prerequisites

Before setting up this workflow, ensure you have:
- A GitHub account with API access
- A Google account with Google Sheets access
- An active n8n instance (cloud or self-hosted)

Step 1: GitHub API Configuration
1. Navigate to GitHub Settings > Developer Settings > Personal Access Tokens
2. Generate a new token with public_repo permissions
3. Copy the generated token and store it securely
4. In n8n, create a new "GitHub API" credential
5. Paste your token in the credential configuration and save

Step 2: Google Sheets Setup
1. Create a new Google Sheets document
2. Set up the following column headers in the first row:
   - node (Column A) - Node identifier/name
   - nodeVersion (Column B) - Version of the node
   - codexVersion (Column C) - Codex version number
   - categories (Column D) - Node categories
   - credentialDocumentation (Column E) - Credential documentation URL
   - primaryDocumentation (Column F) - Primary documentation URL
3. Note down the Google Sheets document ID from the URL
4. Configure Google Sheets OAuth2 credentials in n8n

Step 3: Workflow Configuration
1. Import the workflow into your n8n instance
2. Update the following placeholder values:
   - Replace YOUR_GOOGLE_SHEETS_DOCUMENT_ID with your actual document ID
   - Replace YOUR_WEBHOOK_ID if using webhook functionality
3. Configure the GitHub API credentials in the HTTP Request nodes
4. Set up Google Sheets credentials in the Google Sheets nodes
5. Share your Google Sheets document with the email address associated with your Google OAuth2 credentials
6. Grant "Editor" permissions to allow the workflow to write data

Google Sheets Template Details

The workflow creates a structured dataset with these columns:
- node: Node identifier (e.g., n8n-nodes-base.slack)
- nodeVersion: Version of the node (e.g., 1.0.0)
- codexVersion: Codex version number (e.g., 1.0.0)
- categories: Node categories (e.g., Communication, Productivity)
- credentialDocumentation: URL to credential documentation
- primaryDocumentation: URL to primary node documentation

Customization Options

Modifying Data Extraction

You can customize the "Format Data" node to extract additional fields:
- Add new assignments in the Set node
- Modify the column mapping in the Google Sheets node
- Update your spreadsheet headers accordingly

Changing Update Frequency

To run this workflow on a schedule:
- Replace the Manual Trigger with a Cron node
- Set your desired schedule (e.g., daily, weekly)
- Configure appropriate timing to avoid API rate limits

Adding Filters

Customize the "Filter Node Files" code node to:
- Filter specific node types
- Include/exclude certain categories
- Process only recently updated nodes

Features

- Fetches all node definitions from the n8n-io/n8n repository
- Filters for .node.json files only (see the sketch after the use-case list below)
- Downloads and parses metadata automatically
- Extracts key fields like node names, versions, categories, and documentation URLs
- Appends structured data to Google Sheets with batch processing
- Includes error handling and retry mechanisms
- Clears existing data before appending new information for fresh results

Use Cases

This workflow is ideal for:
- Tracking changes in official n8n node definitions over time
- Auditing node categories and documentation links for completeness
- Building custom dashboards from node metadata
- Community management and documentation maintenance
- Integration planning and compatibility analysis
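A minimal sketch of the .node.json filtering step, assuming the workflow lists the repository with GitHub's git/trees API (recursive listing of paths). The input shape mirrors GitHub's documented tree entries; the function name is illustrative.

```typescript
// Keep only ".node.json" files from a recursive GitHub tree listing.
interface TreeEntry {
  path: string;          // e.g. "packages/nodes-base/nodes/Slack/Slack.node.json"
  type: "blob" | "tree"; // files are "blob", directories are "tree"
}

function filterNodeFiles(entries: TreeEntry[]): TreeEntry[] {
  return entries.filter(
    (entry) => entry.type === "blob" && entry.path.endsWith(".node.json"),
  );
}
```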
by InfraNodus
Optimize Your Top Performing Website Content with Google Analytics, Firecrawl, and InfraNodus

This template helps you:
- extract the top performing pages from your website using Google Analytics
- scrape the content of the pages using the Firecrawl API (HTTP node provided)
- build a knowledge graph for all these pages with the topics and gaps identified using InfraNodus
- understand the main concepts and topical clusters in your top-performing content, so you can create more of it, while also identifying the content gaps - structural holes between the topics that you can use to generate new content ideas
- have access to a knowledge graph visualization of your top performing content to explore it using the interactive network interface

How it works

This template uses InfraNodus to visualize and analyze your top performing content. It will extract the top pages from the Google Analytics data for the website you choose and scrape their text content using the high-quality Firecrawl API. Then it will ingest every page into an InfraNodus graph you specify. The graph can be used to explore the content visually. The insights from the graph, such as the main topics and the gaps between them, will be shown to you at the end of the workflow.

You can use these insights to:
- understand what kind of content you should focus on creating to get the highest number of views and to establish topical authority in your area, which is good for SEO and LLM optimization, by focusing on the topics identified in the top content
- discover the content gaps: topics that are not connected yet, which you could link with new content ideas and publish. This caters to your audience's interests but connects your existing ideas in a new way, so you deliver content that is relevant but also novel.

Here's a description step by step:
1. Trigger the workflow
2. Extract a list of top (25, 50) pages from your Google Analytics account (you'll need to connect it via the Google Cloud API)
3. Fix the extracted data and add a correct URL prefix to each page (if your Analytics has relative paths only)
4. Loop through each page extracted
5. Extract the text content of every page using the high-quality Firecrawl API
6. Ingest the text content into the InfraNodus graph that you specify (a hypothetical sketch of this call follows below)
7. Once all the pages are ingested into the InfraNodus graph, access the AI insights endpoint in InfraNodus and get information about the main topics and gaps
8. Display this information to the user

How to use

You need an InfraNodus API account and key to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.

Requirements

- An InfraNodus account and API key
- Optional: A Google Analytics account for your property (alternatively, you can modify this workflow to provide a list of the most popular pages)
- Optional: Google Cloud API access (to access the data from your Google Analytics account; follow the n8n instructions)
- Optional: A Firecrawl API key for better quality web page scraping (otherwise, use the standard HTTP to Text node from n8n)

Customizing this workflow

You can customize this workflow by using a list of the URL pages you want to analyze from a Google Sheet.
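A hypothetical sketch of the per-page ingestion call in step 6. The endpoint path, query parameters, and payload fields below are illustrative assumptions only; copy the exact request from the template's InfraNodus HTTP nodes and the API docs at https://infranodus.com/api-access.

```typescript
// Hypothetical sketch: send one scraped page into the named InfraNodus graph.
// Endpoint and payload shape are assumptions - verify against the API docs.
async function ingestPage(apiKey: string, graphName: string, pageText: string) {
  const response = await fetch(
    `https://infranodus.com/api/v1/graphAndStatements?name=${encodeURIComponent(graphName)}`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,      // key from the API access page
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ text: pageText }), // one scraped page per call
    },
  );
  if (!response.ok) throw new Error(`InfraNodus ingestion failed: ${response.status}`);
}
```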
Alternatively, you can use the Google SERP node to extract top search results for a query and get the main topics for them.

For support and feedback, please contact us at https://support.noduslabs.com

To learn more about InfraNodus: https://infranodus.com
by Cyril Nicko Gaspar
AI Agent Template with Bright Data MCP Tool Integration

This template obtains all the available tools from Bright Data MCP, processes them through a chatbot, then runs any tool based on the user's query.

Problem It Solves

MCP addresses the complexity of traditional automation, where users need specific knowledge of APIs or interfaces to trigger backend processes. By allowing interaction through natural language, automatically classifying and routing queries, and managing context and memory effectively, MCP simplifies complex data operations, customer support, and workflow orchestration scenarios where inputs and responses change dynamically.

Pre-requisites

Before deploying this template, ensure you have:
- An active n8n instance (self-hosted or cloud)
- A valid OpenAI API key (or a key for another AI model)
- Access to the Bright Data MCP API with credentials
- Basic familiarity with n8n workflows and nodes

Setup Instructions

1. Install the MCP Community Node in n8n:
   - In your n8n self-hosted instance, go to Settings > Community Nodes.
   - Search for and install n8n-nodes-mcp.
2. Configure Credentials:
   - Add your OpenAI API key (or another AI model's key) to the relevant nodes. If you want a different AI model, replace all associated OpenAI nodes in the workflow.
   - Set up Bright Data MCP client credentials in the installed community node (STDIO).
   - Obtain your API key in Bright Data and put it in the Environment field in the credentials window. It should be written as API_Key=<your api key from Bright Data>.

Workflow Functionality (Summary)

- A user message triggers the workflow.
- An AI classifier (OpenAI) interprets the intent and maps it to a tool from Bright Data MCP.
- If no match is found, the user is notified.
- If more information is needed, the AI requests it.
- Memory preserves context for follow-up actions.
- The tool is executed, and results are returned contextually to the user.

> Optional memory buffer and chat memory manager nodes keep conversations context-aware across multiple messages.

Use Cases

- Data Scraping Automation: Trigger scraping tasks via chat.
- Lead Generation Bots: Use MCP tools to fetch, enrich, or validate data.
- Customer Support Agents: Automatically classify and respond to queries with tool-backed answers.
- Internal Workflow Agents: Let team members trigger backend jobs (e.g., reports, lookups) by chatting naturally.

Customization

- Tool Matching Logic: Modify the AI classifier prompt and schema to suit different APIs or services.
- Memory Size and Retention: Adjust the memory buffer size and filtering to fit your app's complexity.
- Tool Execution: Extend the "Execute the tool" sub-workflow to handle additional actions, fallback strategies, or logging.
- Frontend Integration: Connect this with various platforms (e.g., WhatsApp, Slack, web chatbots) using the webhook.

Summary

This template delivers a powerful no-code/low-code agent that turns chat into automation, combining AI intelligence with real-world tool execution. With minimal setup, you can build contextual, dynamic assistants that drive backend operations using natural language.