by Jonathan
This workflow checks a mailbox for new emails and, if the subject contains "Expenses" or "Receipt", sends the attachment to Mindee for processing, then updates a Google Sheet with the extracted values. To use this workflow, you will need to set the Email Read node to use your mailbox's credentials and configure the Mindee and Google Sheets nodes to use your credentials.
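A minimal sketch of the subject filter described above, assuming a simple substring match (the workflow's IF node may be configured differently, e.g. case-insensitively):

```python
def should_process(subject: str) -> bool:
    """Mirror the IF node: only emails whose subject mentions
    expenses or receipts are forwarded to Mindee."""
    keywords = ("Expenses", "Receipt")
    return any(keyword in subject for keyword in keywords)
```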
by David w/ SimpleGrow
This n8n workflow tracks user engagement in a specific WhatsApp group by capturing incoming messages via a Whapi webhook. It first filters messages to ensure they come from the correct group, then identifies the message type—text, emoji reaction, voice, or image. The workflow searches for the user in an Airtable database using their WhatsApp ID and increments their message count by one. It updates the Airtable record with the new count and the date of the last interaction. This automated process helps measure user activity and supports engagement initiatives like weekly raffles or rewards. The system is flexible and can be expanded to include more message types or additional actions. Overall, it provides a seamless way to encourage and track user participation in your WhatsApp community.
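The counting logic can be sketched as below. The event field names (`chat_id`, `from`, `timestamp`) are illustrative stand-ins for Whapi's actual webhook payload, and the `find_user`/`update_user` callables stand in for the Airtable search and update nodes:

```python
def track_engagement(event: dict, group_id: str, find_user, update_user):
    """Filter to the target group, then bump the sender's message count."""
    if event.get("chat_id") != group_id:
        return None                      # message is from another chat: ignore
    user = find_user(event["from"])      # Airtable lookup by WhatsApp ID
    user["message_count"] += 1
    user["last_interaction"] = event["timestamp"]
    update_user(user)                    # write the record back to Airtable
    return user
```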
by Agent Circle
This n8n template demonstrates how to use the tool to crawl comments from a YouTube video and collect all the results in a linked Google Sheet.

Use cases are many: whether you're a YouTube creator trying to understand your audience, a marketer running sentiment analysis, a data analyst compiling engagement metrics, or part of a growth team tracking YouTube or social media campaign performance, this workflow helps you extract real, actionable insights from YouTube video comments at scale.

How It Works

- The workflow starts when you manually click Test Workflow or Execute Workflow in n8n.
- It reads the list of YouTube video URLs from the Video URLs tab in the connected "YouTube - Get Video Comments" Google Sheet. Only the URLs marked with the Ready status will be processed.
- The tool loops through each video and sends an HTTP request to the YouTube API to fetch comment data, then checks whether the request succeeded before continuing.
- If comments are found, they are split and processed. Each comment is inserted into the Results tab of the connected Google Sheet.
- Once a URL has been processed, its status in the Video URLs tab is updated to Finished.

How To Use

- Download the workflow package.
- Import the workflow package into your n8n interface.
- Duplicate the "YouTube - Get Video Comments" Google Sheet template into your Google Sheets account.
- Set up Google Cloud Console credentials in the following nodes in n8n, ensuring access and suitable rights to the Google Sheets and YouTube services. For Google Sheets access, make sure each node is connected to the correct tab of your Google Sheet template:
  - Google Sheets - Get Video URLs → the Video URLs tab
  - Google Sheets - Insert/Update Comment → the Results tab
  - Google Sheets - Update Status → the Video URLs tab
- For YouTube access, set up a GET method in the HTTP Request - Get Comments node.
- Open the template in your Google Sheets account. In the Video URLs tab, fill in the video URLs you want to crawl in Column B and set the status in Column A to Ready for each row.
- Return to the n8n interface and click Execute Workflow.
- Check the results in the Results tab of the template - the collected comments will appear there.

Requirements

A basic setup in Google Cloud Console (OAuth or API key method enabled) with access to the YouTube and Google Sheets APIs.

How To Customize

By default, the workflow is triggered manually in n8n. However, you can automate the process by adding a Google Sheets trigger that monitors new entries in your connected "YouTube - Get Video Comments" template and starts the workflow automatically.

Need Help?

Join our community on different platforms for support, inspiration and tips from others.
Website: https://www.agentcircle.ai/
Etsy: https://www.etsy.com/shop/AgentCircle
Gumroad: http://agentcircle.gumroad.com/
Discord: https://discord.gg/d8SkCzKwnP
Facebook Page: https://www.facebook.com/agentcircle/
Facebook Group: https://www.facebook.com/groups/aiagentcircle/
X: https://x.com/agent_circle
YouTube: https://www.youtube.com/@agentcircle
LinkedIn: https://www.linkedin.com/company/agentcircle
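The GET request the HTTP Request node sends can be sketched as follows. This is a hedged illustration built on the public YouTube Data API v3 `commentThreads` endpoint; the template's exact parameters may differ:

```python
from urllib.parse import parse_qs, urlencode, urlparse

API_BASE = "https://www.googleapis.com/youtube/v3/commentThreads"

def extract_video_id(url: str) -> str:
    """Pull the video ID out of a standard watch URL or a youtu.be short link."""
    parsed = urlparse(url)
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/")
    return parse_qs(parsed.query)["v"][0]

def build_comments_request(video_url: str, api_key: str, max_results: int = 100) -> str:
    """Build the GET URL that fetches the top-level comment threads."""
    params = {
        "part": "snippet",
        "videoId": extract_video_id(video_url),
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{API_BASE}?{urlencode(params)}"
```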
by Jesse White
Automate High-Quality Voice with Google Text-to-Speech & n8n

Effortlessly convert any text into realistic, high-quality audio with this n8n workflow. Leveraging Google's advanced Text-to-Speech (TTS) AI, this template provides a complete, end-to-end solution for generating, storing, and tracking voiceovers automatically. Whether you're a content creator, marketer, or developer, this workflow saves you countless hours by transforming your text-based scripts into ready-to-use audio files. The entire process is initiated from a simple form, making it accessible to users of all technical levels.

Features & Benefits

🗣️ Studio-Quality Voices: Leverage Google's cutting-edge AI to produce natural and expressive speech in a wide variety of voices and languages.
🚀 Fully Automated Pipeline: From text submission to final file storage, every step is handled automatically. Simply input your script and let the workflow do the rest.
☁️ Seamless Cloud Integration: Automatically uploads generated audio files to Google Drive for easy access and sharing.
📊 Organized Asset Management: Logs every generated audio file in an Airtable base, complete with the original script, a direct link to the file, and its duration.
⚙️ Simple & Customizable: The workflow is ready to use out of the box but can be easily customized. Change the trigger, add notification steps, or integrate it with other services in your stack.

Perfect For a Variety of Use Cases

🎬 Content Creators: Generate consistent voiceovers for YouTube videos, podcasts, and social media content without needing a microphone.
📈 Marketers: Create professional-sounding audio for advertisements, product demos, and corporate presentations quickly and efficiently.
🎓 Educators: Develop accessible e-learning materials, audiobooks, and language lessons with clear, high-quality narration.
💻 Developers: Integrate dynamic voice generation into applications, build interactive voice response (IVR) systems, or provide audio feedback for user actions.

How The Workflow Operates

- Initiate with a Form: The process begins when you submit a script, a desired voice, and a language through a simple n8n Form Trigger.
- Synthesize Speech: The workflow sends the text to Google's Text-to-Speech API, which generates the audio and returns it as a base64-encoded file.
- Process and Upload: The data is converted into a binary audio file and uploaded directly to a specified folder in your Google Drive.
- Enrich Metadata: The workflow then retrieves the audio file's duration using the fal.ai ffmpeg API, adding valuable metadata.
- Log Everything: Finally, it creates a new record in your Airtable base, storing the asset name, description (your script), content type, file URLs from Google Drive, and the audio duration for perfect organization.

What You'll Need

To use this workflow, you will need active accounts for the following services:
- Google Cloud OAuth2 client credentials, with the Text-to-Speech API enabled
- Google Drive, for audio file storage
- Airtable, for logging and asset management
- fal.ai, for the ffmpeg API used to get the audio duration
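The synthesis step can be sketched against Google's public Text-to-Speech REST API (`text:synthesize`), which returns the audio as base64 in an `audioContent` field; the voice name used here is only an example:

```python
import base64
import json

TTS_ENDPOINT = "https://texttospeech.googleapis.com/v1/text:synthesize"

def build_tts_payload(text: str, voice_name: str, language_code: str) -> str:
    """Request body for Google's Text-to-Speech REST API."""
    return json.dumps({
        "input": {"text": text},
        "voice": {"languageCode": language_code, "name": voice_name},
        "audioConfig": {"audioEncoding": "MP3"},
    })

def decode_audio(response_json: str) -> bytes:
    """The API returns base64; the workflow converts it to a binary file."""
    return base64.b64decode(json.loads(response_json)["audioContent"])
```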
by Agent Circle
This n8n template demonstrates how to use our tool to collect a list of videos from any YouTube channel - including video URLs, titles, descriptions, thumbnail links, publish dates, and more - all saved cleanly into a connected Google Sheet.

Use cases are many: whether you're a YouTube content strategist tracking competitors, a marketing team building dashboards from video metadata, an automation pro connecting YouTube to downstream workflows, or a researcher or analyst collecting structured video data at scale, this tool gives you what you need.

How It Works

- The workflow begins when you click Execute Workflow or Test Workflow manually in n8n.
- It reads the list of full channel URLs, custom channel URLs, or channel IDs from the Channel URLs tab in the connected Google Sheet. Only the channels with the Ready status will be processed.
- A Switch node detects whether the input is a full/custom channel URL or a raw channel ID and routes it accordingly:
  - If the input is already a channel ID, the tool prepares the data structure before sending it to the YouTube API call in the next step.
  - If the input is a full or custom channel URL, the workflow extracts the username, sends an HTTP request to the YouTube API to retrieve the corresponding channel ID, and prepares the data structure before continuing.
- Once a valid channel ID is set, the tool sends a request to the YouTube API endpoint to retrieve a list of public videos. By default, the number of videos extracted per channel is limited to 10.
- The API response is checked for success:
  - If successful, the video data is split into individual entries, cleaned, and added to the Videos tab in the connected Google Sheet. The original row's status in the Channel URLs tab is marked as Finished.
  - If an error occurs, the row's status in the Channel URLs tab is marked as Error for later review.

How To Use

- Download the workflow package.
- Import the workflow package into your n8n interface.
- Duplicate the "YouTube - Get Channel Videos" Google Sheet template into your Google Sheets account.
- Set up Google Cloud Console credentials in the following nodes in n8n, ensuring access and suitable rights to the Google Sheets and YouTube services. For Google Sheets access, make sure each node is connected to the correct tab of your Google Sheet:
  - Google Sheets - Get Channel URLs → the Channel URLs tab
  - Google Sheets - Update Data → the Videos tab
  - Google Sheets - Update Data - Success → the Channel URLs tab
  - Google Sheets - Update Data - Error → the Channel URLs tab
- For YouTube access, set up a GET method to connect to the YouTube API in the HTTP Request - Get Channel ID and HTTP Request - Get Channel Videos nodes.
- In your connected Google Sheet, enter the full channel URLs, custom channel URLs, or channel IDs that you want to crawl, and set the rows' status to Ready.
- Run the workflow by clicking Execute Workflow or Test Workflow in n8n.
- View the results in your connected Google Sheet: successful fetches update the original row's status to Finished, and the video information appears in the Videos tab. If any URL or ID fails, the row's status is marked as Error.

Requirements

A basic setup in Google Cloud Console (OAuth or API key method enabled) with access to the YouTube and Google Sheets APIs.

How To Customize

By default, the workflow is triggered manually in n8n. However, you can automate the process by adding a Google Sheets trigger that monitors new entries automatically. If you want to fetch more video metadata, such as durations or view counts, you can expand the HTTP Request and post-processing nodes to include those fields. The workflow collects up to 10 videos per channel by default; if you'd like to fetch more, simply enter your desired limit in Column C of the Channel URLs tab in the connected Google Sheet, and the tool will use that value when calling the YouTube API.

Need Help?

Join our community on different platforms for support, inspiration and tips from others.
Website: https://www.agentcircle.ai/
Etsy: https://www.etsy.com/shop/AgentCircle
Gumroad: http://agentcircle.gumroad.com/
Discord: https://discord.gg/d8SkCzKwnP
Facebook Page: https://www.facebook.com/agentcircle/
Facebook Group: https://www.facebook.com/groups/aiagentcircle/
X: https://x.com/agent_circle
YouTube: https://www.youtube.com/@agentcircle
LinkedIn: https://www.linkedin.com/company/agentcircle
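The routing and the two API calls can be sketched as below. This is a hedged illustration: the `startswith("UC")` heuristic and the use of the `channels`/`search` endpoints with `forHandle` are one plausible way to implement what the Switch node and HTTP Request nodes describe, not the template's exact logic:

```python
from urllib.parse import urlencode

def classify_channel_input(value: str) -> str:
    """Mimic the Switch node: decide whether the row holds a raw ID or a URL."""
    if value.startswith("UC") and "/" not in value:
        return "channel_id"
    return "url"

def build_channel_lookup(handle: str, api_key: str) -> str:
    """Resolve a @handle (extracted from a custom URL) to a channel ID."""
    params = {"part": "id", "forHandle": handle, "key": api_key}
    return "https://www.googleapis.com/youtube/v3/channels?" + urlencode(params)

def build_videos_request(channel_id: str, api_key: str, limit: int = 10) -> str:
    """List a channel's most recent public videos (default limit 10)."""
    params = {"part": "snippet", "channelId": channel_id, "order": "date",
              "type": "video", "maxResults": limit, "key": api_key}
    return "https://www.googleapis.com/youtube/v3/search?" + urlencode(params)
```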
by Baptiste Fort
Who is it for?

This workflow is perfect for marketers, sales teams, agencies, and local businesses who want to save time by automating lead generation from Google Maps. It's ideal for real estate agencies, restaurants, service providers, and any local niche that needs a clean database of fresh contacts, including emails, websites, and phone numbers.

✅ Prerequisites

Before starting, make sure you have:

- Apify account → to scrape Google Maps data
- OpenAI API key → for GPT-4 email extraction
- Airtable account & base → for structured lead storage
- Gmail account with OAuth → to send personalized outreach emails

Your Airtable base should have these columns:

| Title | Street | Website | Phone Number | Email | URL |
|-------|--------|---------|--------------|-------|-----|
| Paris Real Estate Agency | 10 Rue de Rivoli, Paris | https://agency.fr | +33 1 23 45 67 | contact@agency.fr | maps.google.com/... |

🏡 Example Use Case

To keep things clear, we'll use real estate agencies in Paris as an example. But you can replace this with restaurants, plumbers, lawyers, or even hamster trainers (you never know).

🔄 How the workflow works

- Scrape Google Maps leads with Apify
- Clean & structure the data (name, phone, website)
- Visit each website & extract emails with GPT-4
- Save all leads into Airtable
- Automatically send a personalized email via Gmail

This works for any industry, keyword, or location.

Step 1 – Scraping Google Maps with Apify

Start simply:

- Open your n8n workflow and choose the trigger "Execute Workflow" (manual trigger).
- Add an HTTP Request node (POST method).
- Head over to the Apify Google Maps Extractor and fill in the fields according to your needs:
  - Keyword: e.g., "real estate agency" (or restaurant, plumber...)
  - Location: "Paris, France"
  - Number of results: 50 (or more)
  - Optional: filters (with/without a website, by categories...)
- Click Run to test the scraper.
- Click API → select the API endpoints tab.
- Choose "Run Actor synchronously and get dataset items".
- Copy the URL, go back to n8n, and paste it into your HTTP Request node (URL field). Then enable Body Content Type → JSON and Specify Body Using JSON.
- Go back to Apify, click the JSON tab, copy everything, and paste it into the JSON field of your HTTP Request node.

If you now run your workflow, you'll get a nicely structured table filled with Google Maps data. Pretty magical already, but we're just getting started!

Step 2 – Cleaning Things Up (Edit Fields)

Raw data is cool, but messy. Add an Edit Fields node next, using Manual Mapping mode. Here's what to keep (copy-paste friendly):

- Title → {{ $json.title }}
- Address → {{ $json.address }}
- Website → {{ $json.website }}
- Phone → {{ $json.phone }}
- URL → {{ $json.url }}

Now you have a clean, readable table ready to use.

Step 3 – Handling Each Contact Individually (Loop Over Items)

Next, we process each contact one by one. Add a Loop Over Items node and set Batch Size to 20 or more, depending on your needs. This node is simple but crucial to avoid traffic jams in the automation.

Step 4 – Isolating Websites (Edit Fields again)

Add another Edit Fields node (Manual Mapping). This time, keep just:

- Website → {{ $json.website }}

We've isolated the websites for the next step: scraping them one by one.

Step 5 – Scraping Each Website (HTTP Request)

Now we send our little robot to visit each website automatically. Add another HTTP Request node:

- Method: GET
- URL: {{ $json.website }} (from the previous node)

This returns the raw HTML content of each site. Yes, it's ugly, but we won't handle it manually; we'll leave the next step to the AI!

Step 6 – Extracting Emails with ChatGPT

We now use OpenAI (Message a Model) to politely ask GPT to extract only relevant emails.
Configure as follows:

- Model: GPT-4.1-mini or higher
- Operation: Message a Model
- Simplify Output: ON

Prompt to copy-paste:

Look at this website content and extract only the email I can use to contact this business. In your output, provide only the email and nothing else. Ideally, this email should belong to the business owner, so if you have 2 or more options, pick the most authoritative one. If you don't find any email, output 'Null'. Example output: name@examplewebsite.com

{{ $json.data }}

ChatGPT will kindly return the perfect email address (or 'Null' if none is found).

Step 7 – Neatly Store Everything in Airtable

Almost done! Add an Airtable → Create Record node and fill your Airtable fields like this:

| Airtable Field | Content | n8n Variable |
|----------------|---------|--------------|
| Title | Business name | {{ $('Edit Fields').item.json.Title }} |
| Street | Full address | {{ $('Edit Fields').item.json.Address }} |
| Website | Website URL | {{ $('Edit Fields').item.json.Website }} |
| Phone Number | Phone number | {{ $('Edit Fields').item.json.Phone }} |
| Email | Email retrieved by the AI agent | {{ $json.message.content }} |
| URL | Google Maps link | {{ $('Edit Fields').item.json.URL }} |

Now you have a tidy Airtable database filled with fresh leads, ready for action.

Step 8 – Automated Email via Gmail (The Final Touch)

To finalize the workflow, add a Gmail → Send Email node after your Airtable node. Configure it using the data pulled directly from your Airtable base in the previous step:

- Recipient (To): the email stored in Airtable ({{ $json.fields.Email }}).
- Subject: use the company name stored in Airtable ({{ $json.fields.Title }}) to personalize the subject line.
- Body: you can include several fields directly from Airtable, such as:
  - Company name: {{ $json.fields.Title }}
  - Website URL: {{ $json.fields.Website }}
  - Phone number: {{ $json.fields["Phone Number"] }}
  - Link to the Google Maps listing: {{ $json.fields.URL }}

All of this data is available in Airtable because it was automatically inserted in the previous step (Step 7). This ensures that each email sent is fully personalized and based on clear, reliable, structured information.
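The JSON body pasted into the HTTP Request node in Step 1 can be sketched like this. The field names (`searchStringsArray`, `locationQuery`, `maxCrawledPlacesPerSearch`) follow Apify's Google Maps Extractor input schema as an assumption; always verify them against the JSON tab in the Apify console:

```python
import json

def build_apify_payload(keyword: str, location: str, max_results: int = 50) -> str:
    """JSON body for the Apify actor run (Step 1 of the workflow)."""
    payload = {
        "searchStringsArray": [keyword],       # what to search for
        "locationQuery": location,             # where to search
        "maxCrawledPlacesPerSearch": max_results,
    }
    return json.dumps(payload)
```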
by Jonathan
This workflow takes a timer entry from Clockify and submits it to a matching ticket in Syncro. It saves the time entry ID from Clockify and the time entry ID from Syncro into a Google Sheet. Then it checks whether a match already exists from a previous update and updates the same time entry if the description or time is changed in Clockify. There is a Set node with the names and Syncro IDs of technicians. If you have multiple technicians with the same name, this won't work for you. Likewise, if the name in Clockify doesn't exactly match what you put in the Set node, it won't work. You also need to set up a webhook in Clockify, set to trigger on "Time entry updated (anyone)" and pointed at your workflow. Configured this way, you can start and stop time entries at will, and nothing will happen until you change the description. > This workflow is part of an MSP collection; the original can be found here: https://github.com/bionemesis/n8nsyncro
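The matching logic can be sketched like this; the `mapping` dict stands in for the Google Sheet that stores Clockify-to-Syncro ID pairs, and `create`/`update` stand in for the Syncro API calls:

```python
def sync_time_entry(entry: dict, mapping: dict, create, update) -> str:
    """Upsert a Clockify time entry into Syncro, remembering the pairing."""
    syncro_id = mapping.get(entry["id"])
    if syncro_id is None:
        mapping[entry["id"]] = create(entry)   # first sight: create in Syncro
    else:
        update(syncro_id, entry)               # already paired: update in place
    return mapping[entry["id"]]
```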
by n8n Team
Who this template is for

This template is for everyone who works with XML data a lot and wants to convert it to JSON instead.

Use case

Many products still use XML files as their main data format. Unfortunately, not all software supports XML, as many tools have switched to more modern formats such as JSON. This workflow handles the conversion of XML data to JSON format via a webhook call, with error handling and Slack notifications integrated into the process.

How this workflow works

- Triggering the workflow: the workflow initiates upon receiving an HTTP POST request at the webhook endpoint specified in the POST node. The endpoint can be accessed externally by sending a POST request to its URL.
- Data routing and processing: upon receiving the POST request, the Switch node routes the workflow's path based on the content type of the incoming data or any encountered errors. The Extract From File and Edit Fields (Set) nodes manage XML input processing, adapting their actions to the data's content type.
- XML to JSON conversion: the XML data extracted from the input is passed through the XML node, which converts it into JSON format.
- Response handling: if the XML-to-JSON conversion succeeds, a success response is returned with a status of "OK" and the converted JSON data. If any errors occur during conversion, an error response is returned with a status of "error" and an error message.
- Error handling: in case of an error during processing, the workflow sends a notification to a Slack channel designated for error reporting.

Set up steps

- Set up your own webhook URL in the Webhook node. While building or testing a workflow, use a test webhook URL; when your workflow is ready, switch to the production webhook URL.
- Set credentials for Slack.
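The conversion the XML node performs can be sketched with the standard library alone. This is a minimal illustration, not the exact mapping n8n's XML node produces (which also handles attributes and other edge cases):

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an Element into plain dicts, lists, and strings."""
    children = list(elem)
    if not children:
        return elem.text
    out = {}
    for child in children:
        value = element_to_dict(child)
        if child.tag in out:                   # repeated tags become a list
            if not isinstance(out[child.tag], list):
                out[child.tag] = [out[child.tag]]
            out[child.tag].append(value)
        else:
            out[child.tag] = value
    return out

def xml_to_json(xml_string: str) -> str:
    root = ET.fromstring(xml_string)
    return json.dumps({root.tag: element_to_dict(root)})
```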
by Olek
This error handling workflow emails detailed notifications on workflow execution and trigger errors. It extends the "Send email via Gmail on workflow error" template by covering trigger-level errors.

Features

- Get notifications on both main-workflow trigger errors and execution-time errors.
- The subject line includes the failed workflow's ID, name, error source (execution or trigger), and error message.
- The body contains links to both the failed and error handling workflows, as well as execution or trigger-level error details.
- The body also contains a machine-readable, enriched JSON object from the Error Trigger describing the error.
- Use this error handling workflow for as many workflows as you need.

Configuration

- Copy this workflow to your workspace and, optionally, move it into the project that contains your main workflow.
- In this error handling workflow's settings, set "This workflow can be called by" as appropriate.
- In the Config node, define your app URL, the notification recipient's email, and the sender name (useful for building filters in your inbox).
- In the Gmail node, create and select credentials.
- In your main workflow's settings, pick this error handling workflow in the Error Workflow field.

Related resources

n8n Error Trigger documentation.

Author

Reach out to Olek on community.n8n.io or the n8n creators hub.
by Evoort Solutions
🧲 AI-Powered Lead Magnet Idea Generation from Topic List

This n8n workflow automatically generates lead magnet ideas based on topics and website URLs stored in a Google Sheet. It uses the Lead Magnet Idea Generator AI API to produce relevant, value-driven ideas that marketers can turn into checklists, guides, templates, and more.

🔧 What This Workflow Does

- Monitors a Google Sheet for new or updated rows using a Drive trigger.
- Reads all rows and identifies entries where the Topic column is not empty and the Content (idea) column is empty.
- Sends a request to the Lead Magnet Idea Generator AI API (input: topic + website URL; output: an AI-generated lead magnet idea).
- Writes the idea back to the same Google Sheet with a timestamp.
- Repeats the process automatically every minute.

🌐 API Used

- Name: Lead Magnet Idea Generator AI API
- Endpoint: https://lead-magnet-idea-generator-ai.p.rapidapi.com/index.php
- Method: POST
- Headers: x-rapidapi-host: lead-magnet-idea-generator-ai.p.rapidapi.com; x-rapidapi-key: YOUR_RAPIDAPI_KEY
- Body params: topic, website

✅ Benefits

| Feature | Value |
|---------|-------|
| 🔄 Automated Flow | No manual entry needed; runs every minute |
| 🧠 AI-Based Content Ideation | Smart suggestions tailored to your topic and brand |
| 📝 Google Sheets Integration | Easy to manage, edit, and view input/output in one place |
| 🕒 Timestamp Tracking | Know exactly when each idea was generated |
| 🚫 No Duplicate Processing | Only rows missing ideas are sent to the API |
| 💼 Scalable for Teams | Plug-and-play for any team managing multiple content ideas |

❌ Challenges This Solves

| Old Challenge | New Workflow Solution |
|---------------|-----------------------|
| Manual brainstorming of lead magnet ideas | Fully automated idea generation via API |
| Missing or inconsistent content in sheets | Only incomplete rows are updated with valid ideas |
| Lack of traceability | Timestamp logs show when each idea was generated |
| Wasting time on repetitive tasks | Workflow handles idea generation while you focus on execution |

📌 Requirements

- A valid RapidAPI key
- Google Sheets & Google Drive credentials set up in n8n
- A Google Sheet structured with the following columns:

| Column Name | Purpose | Required |
|-------------|---------|----------|
| Topic | Main subject for which the idea is generated | ✅ Yes |
| Website Url | Optional URL to provide brand context for the API | ❌ No |
| Content | Will be filled with the AI-generated lead magnet idea | ✅ Yes |
| Generated Date | Timestamp when the idea was created | ✅ Yes |

🧩 Technologies Used

- n8n – automation platform
- Google Sheets – for storing topics and generated ideas
- Google Drive Trigger – to initiate the workflow
- Lead Magnet Idea Generator AI API – for content generation
- HTTP Request node – to communicate with the API
- If / Wait / Split In Batches nodes – for conditional logic and throttling

🧠 Example Use Cases

- Content marketing teams planning lead magnets for blog posts
- Agencies creating assets for multiple clients
- Email list-building strategists generating downloadable content ideas
- Business owners who want quick suggestions without manual brainstorming

Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and grow your content pipeline effortlessly!
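The POST request described in the "API Used" section can be sketched as follows; the form-encoded content type is an assumption, so check the API's RapidAPI playground for the exact body format:

```python
from urllib.parse import urlencode

def build_idea_request(topic: str, website: str, rapidapi_key: str):
    """Return (url, headers, body) for the POST the HTTP Request node sends."""
    headers = {
        "x-rapidapi-host": "lead-magnet-idea-generator-ai.p.rapidapi.com",
        "x-rapidapi-key": rapidapi_key,
        "Content-Type": "application/x-www-form-urlencoded",  # assumed encoding
    }
    body = urlencode({"topic": topic, "website": website})
    return ("https://lead-magnet-idea-generator-ai.p.rapidapi.com/index.php",
            headers, body)
```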
by Jonathan | NEX
Drowning in security alerts? Spending hours translating technical logs from Wazuh, your SIEM, or other tools into business-friendly reports for leadership? This n8n workflow is your automated security analyst, designed to save you time and bridge the communication gap between technical teams and non-technical executives. Using a powerful two-stage AI process via the NixGuard Security RAG connector, this workflow transforms raw security event data into a concise, actionable daily briefing.

How It Works:

- Stage 1 - Intelligent Filtering & Data Structuring: on a daily schedule, the workflow first calls the AI to sift through all recent security events. It identifies significant alerts and structures them into a clean, machine-readable JSON array, cutting through the noise.
- Stage 2 - Executive Summarization: if critical alerts are found, the workflow feeds this structured JSON into a second AI prompt. It tasks the AI to act as a senior security analyst, generating a high-level summary that focuses on business impact, key threat patterns, and a single clear recommendation, all in plain English.
- Automated Delivery: the final Markdown report is automatically converted to HTML and emailed as a professional daily security briefing to your stakeholders.

Key Features & Benefits:

- Slash Reporting Time: automate the manual, time-consuming process of daily security analysis and reporting.
- Bridge the Technical Gap: deliver clear, non-technical summaries that executives can understand and act upon instantly.
- Reduce Alert Fatigue: let AI filter out the low-level noise and escalate only what truly matters.
- Two-Stage AI Processing: leverage a sophisticated AI chain for more accurate and relevant results than a single prompt.
- Highly Customizable: easily adapt the prompts, schedule, and data sources (any system compatible with the NixGuard RAG connector) to fit your exact needs.

Who is this for?
- Security analysts, engineers, and managers who need to automate daily reporting.
- SecOps and DevOps teams looking to integrate security intelligence into their automated workflows.
- IT directors and VPs who need to provide consistent security posture updates to leadership.
- Anyone responsible for communicating cybersecurity risk to non-technical stakeholders.

Stop copying and pasting logs. Download this workflow to automate your security reporting and deliver real business value today! Don't have the main workflow yet? Get it HERE!

🔗 Learn more about NixGuard: thenex.world
🔗 Get started with a free security subscription: thenex.world/security/subscribe

Tags / Keywords: AI, Security, Automation, Cybersecurity, Wazuh, SIEM, Reporting, Executive Summary, Daily Briefing, Alert Fatigue, SecOps, Generative AI, LLM, NixGuard, Email, JSON
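The two-stage chain described above can be sketched as follows; the `filter_model` and `summary_model` functions are stand-ins for the NixGuard RAG calls, and the prompts are illustrative, not the workflow's actual prompts:

```python
import json

def daily_briefing(events, filter_model, summary_model):
    """Stage 1: ask one model to keep only significant alerts as JSON.
    Stage 2: ask a second model for an executive summary of that JSON."""
    structured = filter_model(
        "Return the significant alerts as a JSON array: " + json.dumps(events)
    )
    alerts = json.loads(structured)
    if not alerts:                       # nothing critical found: skip the email
        return None
    return summary_model(
        "As a senior security analyst, summarize for executives: " + structured
    )
```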
by phil
This workflow is your solution for reliable image retrieval from any web source, including heavily protected ones. It operates with a smart, cost-effective strategy: it first attempts to fetch the image using a Classic Image Getter node (a standard, free HTTP request). In approximately 80% of cases, this method is sufficient. For the remaining cases, where you encounter IP blocking, CAPTCHAs, rate limiting, or other advanced anti-bot measures, the workflow seamlessly switches to the robust BrightData Web Unblocker service as a fallback to retrieve the blocked images. This template is useful for anyone needing consistent and complete access to web images, ensuring you get the data you need without unnecessary overhead.

Why Use This Image Scraper Workflow?

- Maximum Success Rate: retrieves images even from the most challenging or protected websites.
- Cost-Optimized Strategy: prioritizes free, standard HTTP requests, only incurring costs when advanced unblocking is truly necessary.
- Automated Resilience: intelligently handles failed direct attempts by automatically engaging the BrightData failover via the Unlock Image node.
- Versatile Image Scraping: perfect for market research, content aggregation, or data enrichment that demands reliable image access.

How It Works

- When clicking 'Execute workflow': the workflow is initiated manually, allowing for easy testing and integration into larger processes.
- image: a Set node defines the target image URL. This can easily be configured to accept dynamic URLs from preceding nodes.
- Classic Image Getter: this HTTP Request node performs a direct image download. It's the primary, free, and efficient method for readily accessible images.
- Unlock Image (BrightData Web Unblocker): configured as an error handler and failover, this HTTP Request node activates only if the Classic Image Getter encounters an error. It then routes the image URL through BrightData's Web Unblocker, which is designed to bypass advanced protective measures and successfully retrieve the image data.

🔑 Prerequisites

To enable the BrightData Web Unblocker functionality, you will need a BrightData account and a correctly configured Web Unblocker zone.

Setting Up Your BrightData Web Unblocker:

- BrightData account: ensure you have an active account with BrightData. If you don't, you can sign up on their website.
- Create a Web Unblocker zone:
  - Log in to your BrightData dashboard.
  - Navigate to the "Proxy & Scraping Infrastructure" section, then "Zones".
  - Click "Add new zone", select "Web Unblocker" as the product type, and give your zone a clear name (e.g., n8n-image-unlocker).
  - Confirm the creation of the zone.
- Retrieve the API key: once your Web Unblocker zone is active, go to its settings and locate your API key (often referred to as the "password" for proxy access) in the "Access Parameters" or "Credentials" section.
- Configure in n8n: in the Unlock Image HTTP Request node within this workflow, update the Authorization header, replacing "Bearer yourkey" with "Bearer YOUR_BRIGHTDATA_API_KEY". Important: for production workflows, it's highly recommended to store your BrightData API key in n8n credentials rather than hardcoding it in the node; this template uses a placeholder for demonstration purposes. Crucially, ensure that the zone parameter in the Unlock Image node matches the exact zone ID you created in your BrightData account: replace the placeholder web_unlocker with your actual BrightData zone ID.

Phil | Inforeole

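The fallback pattern can be sketched as below. The payload fields (`zone`, `url`, `format`) follow BrightData's unlocker request API as an assumption; check your BrightData dashboard for the exact request shape. The fetcher callables stand in for the two HTTP Request nodes:

```python
import json

def build_unlocker_payload(url: str, zone: str) -> str:
    """JSON body for the Unlock Image node; "raw" asks for the body unchanged."""
    return json.dumps({"zone": zone, "url": url, "format": "raw"})

def fetch_with_fallback(url: str, classic_get, unlocker_get) -> bytes:
    """Mirror the workflow's error path: classic GET first, unlocker on failure."""
    try:
        return classic_get(url)          # Classic Image Getter (free)
    except Exception:
        return unlocker_get(url)         # Unlock Image (BrightData, paid)
```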