by Airtop
**About The Airtop Automation**

Are you tired of being shocked by unexpectedly high energy bills? With this automation using Airtop and n8n, you can take control of your daily energy costs and ensure you're always informed.

**How to monitor your daily energy consumption**

This guide walks you through setting up an automation that retrieves your PG&E (Pacific Gas and Electric) energy usage data, calculates costs, and emails you the details—all without manual effort.

**What You'll Need**

To get started, make sure you have the following:
- A free Airtop API Key
- PG&E account credentials (with minor adaptations, this will also work with other providers)
- An email address to receive the energy cost updates

Estimated setup time: 5 minutes

**Understanding the Process**

This automation works by:
1. Logging into your PG&E account using your credentials
2. Navigating to your energy usage data
3. Extracting relevant details about energy consumption and costs
4. Emailing the daily summary directly to your inbox

The automation is straightforward and ensures you have real-time insights into your energy usage, empowering you to adjust your habits and save money.

**Setting Up Your Automation**

We've created a step-by-step guide to help you set up this workflow. Here's how:
1. Insert your credentials:
   - In the tools section, add your PG&E login details as variables
   - In Airtop, add your Airtop API Key
   - Configure your email address to receive the updates
2. Run the automation: start the scenario, and watch as the automation retrieves your energy data and sends you a detailed email summary.

**Customization Options**

While the default setup works seamlessly, you can tweak it to suit your needs:
- **Data storage**: Store energy usage data in a database for long-term tracking and analysis
- **Visualization**: Plot graphs of your energy usage trends over time for better insights
- **Notifications**: Change the automation to only send alerts on high usage instead of a daily email

**Real-World Applications**

This automation isn't just about monitoring energy usage; it's about taking control. Here are some practical applications:
- **Daily energy management**: Receive updates every morning and adjust your energy consumption based on costs
- **Smart home integration**: Use the data to automate appliances during off-peak hours
- **Budgeting**: Track energy expenses over weeks or months to plan your budget more effectively

Happy automating!
by Tom
This workflow shows a no-code approach to creating Salesforce accounts and contacts based on data coming from Excel 365 (the online version of Microsoft Excel). For a version working with regular Excel files, check out this workflow instead.

**To run the workflow:**
1. Make sure you have both Excel 365 and Salesforce authenticated with n8n.
2. Have a Microsoft Excel workbook with contacts and their account names ready.
3. Select the workbook and sheet in the Microsoft Excel node of the workflow, then configure the range to read data from.
4. Hit the Execute Workflow button at the bottom of the n8n canvas.

**Here is how it works:**
The workflow first searches for existing Salesforce accounts by name. It then branches out depending on whether the account already exists in Salesforce or not. If an account does not exist yet, it will be created. The data is then normalised before both branches converge again (a sketch of this step follows below). Finally, the contacts are created or updated as needed in Salesforce.
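The normalisation step is the only part that needs care: rows from the "account exists" branch and the "account created" branch carry the account ID under different fields. Below is a minimal Code-node sketch of what that step could look like; the column names (`Account Name`, `First Name`, etc.) and ID fields are assumptions, so match them to your sheet and to what your Salesforce nodes actually return.

```javascript
// Hypothetical Code-node sketch of the normalisation step. The column
// names and ID fields below are assumptions; match them to your Excel
// sheet and to the output of your Salesforce search/create nodes.
return $input.all().map(item => ({
  json: {
    accountName: (item.json['Account Name'] || '').trim(),
    // the search branch may return Id while the create branch returns id
    accountId: item.json.Id ?? item.json.id,
    firstName: (item.json['First Name'] || '').trim(),
    lastName: (item.json['Last Name'] || '').trim(),
    email: (item.json.Email || '').trim().toLowerCase(),
  },
}));
```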
by Yaron Been
**LinkedIn Hiring Signal Scraper — Jobs & Prospecting Using Bright Data**

**Purpose:** Discover recent job posts from LinkedIn using Bright Data's Dataset API, clean the results, and log them into Google Sheets — for both job hunting and identifying high-intent B2B leads based on hiring activity.

**Use Cases:**
- **Job Seekers** – Spot relevant openings filtered by role, city, and country.
- **Sales & Prospecting** – Use job posts as buying signals. If a company is hiring for a role you support (e.g. marketers, developers, ops), it's the perfect time to reach out and offer your services.

**Tools Needed:**
- **n8n Nodes:** Form Trigger, HTTP Request, Wait, If, Code, Google Sheets, Sticky Notes (for embedded guidance)
- **External Services:** Bright Data (Dataset API), Google Sheets

**API Keys & Authentication Required:**
- **Bright Data API Key** → Add in the HTTP Request headers: `Authorization: Bearer YOUR_BRIGHTDATA_API_KEY`
- **Google Sheets OAuth2** → Connect your account in n8n to allow read/write access to the spreadsheet.

**General Guidelines:**
- Use descriptive names for all nodes.
- Include retry logic in polling to avoid infinite loops.
- Flatten nested fields (like job_poster and base_salary).
- Strip out HTML tags from job descriptions for clean output.

**Things to Be Aware Of:**
- Bright Data snapshots take ~1–3 minutes — use a Wait node and polling.
- Form filters affect output significantly: 🔍 we recommend filtering by "Last 7 days" or "Past 24 hours" for fresher data.
- Avoid hardcoding values in the form — leave optional filters empty if unsure.

**Post-Processing & Outreach:**
After data lands in Google Sheets, you can use it to:
- Personalize cold emails based on job titles, locations, and hiring signals.
- Send thoughtful LinkedIn messages (e.g., "Saw you're hiring a CMO...").
- Prioritize outreach to companies actively growing in your niche.

**Additional Notes:**
- 📄 Copy the Google Sheet Template: click here to make your copy, then rename it for each campaign or client.
- Form fields include: Job Location (city or region), Keyword (e.g., CMO, Backend Developer), Country (2-letter code, e.g., US, UK).

This workflow gives you a competitive edge —
📌 For candidates: Be first to apply.
📌 For sellers: Be first to pitch.
All based on live hiring signals from LinkedIn.

**STEP-BY-STEP WALKTHROUGH**

**Step 1: Set up your Google Sheet**
1. Open this template.
2. Go to File → Make a copy.
3. You'll use this copy as the destination for the scraped job posts.

**Step 2: Fill out the Input Form in n8n**
The form allows you to define what kind of job posts you want to scrape. Fields:
- **Job Location** → e.g. New York, Berlin, Remote
- **Keyword** → e.g. CMO, AI Architect, Ecommerce Manager
- **Country Code (2-letter)** → e.g. US, UK, IL

💡 Pro Tip: For best results, set the filter inside the workflow to `time_range = "Past 24 hours"` or `"Last 7 days"`. This keeps results relevant and fresh.

**Step 3: Trigger Bright Data Snapshot**
The workflow sends a request to Bright Data with your input. Example API call body:

```json
[
  {
    "location": "New York",
    "keyword": "Marketing Manager",
    "country": "US",
    "time_range": "Past 24 hours",
    "job_type": "Part-time",
    "experience_level": "",
    "remote": "",
    "company": ""
  }
]
```

Bright Data will start preparing the dataset in the background.

**Step 4: Wait for the Snapshot to Complete**
The workflow includes a Wait node and polling loop that checks every few minutes until the data is ready. You don't need to do anything here — it's all automated.

**Step 5: Clean Up the Results**
Once Bright Data responds with the full job post list:
- ✔️ Nested fields like job_poster and base_salary are flattened
- ✔️ HTML in job descriptions is removed
- ✔️ Final data is formatted for export
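As a rough sketch, the cleanup could look like this in the workflow's Code node. The field names (job_poster, base_salary, job_description) are assumptions based on this template's description; check them against the actual Bright Data response before relying on it.

```javascript
// Hypothetical Code-node sketch of Step 5. Field names such as
// job_poster, base_salary and job_description are assumptions; adjust
// them to the fields Bright Data actually returns for your dataset.
return $input.all().map(item => {
  const job = item.json;
  return {
    json: {
      job_title: job.job_title,
      company_name: job.company_name,
      location: job.location,
      // flatten nested objects into top-level columns
      poster_name: job.job_poster?.name,
      salary_min: job.base_salary?.min_amount,
      salary_max: job.base_salary?.max_amount,
      apply_link: job.apply_link,
      // strip HTML tags and collapse whitespace for a sheet-friendly description
      job_description_plain: (job.job_description || '')
        .replace(/<[^>]*>/g, ' ')
        .replace(/\s+/g, ' ')
        .trim(),
    },
  };
});
```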
**Step 6: Export to Google Sheets**
The final cleaned list is added to your Google Sheet (first tab). Each row is one job post, with columns like: job_title, company_name, location, salary_min, apply_link, job_description_plain.

**Step 7: Use the Data for Outreach or Research**

Example for job seekers. You search for:
- Location: Berlin
- Keyword: Product Designer
- Country: DE
- Time range: Past 7 days

Now you've got a live list of roles — with salary, recruiter info, and apply links. → Use it to apply faster than others.

Example for prospecting (Sales / SDR). You search for:
- Location: London
- Keyword: Growth Marketing
- Country: UK

And find companies hiring growth marketers. → That's your signal to offer help with media buying, SEO, CRO, or your relevant service. Use the data to:
- Write personalized cold emails ("Saw you're hiring a Growth Marketer…")
- Start warm LinkedIn outreach
- Build lead lists of companies actively expanding in your niche

**API Credentials Required:**
- **Bright Data API Key**: used in HTTP headers: `Authorization: Bearer YOUR_BRIGHTDATA_API_KEY`
- **Google Sheets OAuth2**: allows n8n to read/write to your spreadsheet

**Adjustments & Customization Tips:**
- Modify the HTTP Request body to add more filters (e.g. job_type, remote, company)
- Increase or reduce the polling wait time depending on Bright Data speed
- Add scoring logic to prioritize listings based on title or location

**Final Notes:**
- 📄 Google Sheet Template: make your copy here
- ⚙️ Bright Data Dataset API: visit BrightData.com
- 📬 Personalization works best when you act quickly. Use the freshest data to reach out with context, not generic pitches.

This workflow turns LinkedIn job posts into sales insights and job leads. All in one click. Fully automated. Ready for your next move.
by Preston Zeller
**How It Works**

This workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these steps:

1. When a new lead is received through your CRM webhook, the workflow captures their address information.
2. It then makes an API call to BatchData to retrieve comprehensive property details.
3. A scoring algorithm evaluates the lead based on property characteristics (a sketch of such a scoring function appears at the end of this section):
   - Property value (higher values earn more points)
   - Square footage (larger properties score higher)
   - Property age (newer constructions score higher)
   - Investment status (non-owner occupied properties earn bonus points)
   - Lot size (larger lots receive additional score)
4. Leads are automatically classified into categories (high-value, qualified, potential, or unqualified).
5. The workflow updates your CRM with enriched property data and qualification scores.
6. High-value leads trigger immediate follow-up tasks for your team.
7. Notifications are sent to your preferred channel (Slack in this example).

The entire process happens within seconds of receiving a new lead, ensuring your sales team can prioritize the most valuable opportunities immediately.

**Who It's For**

This workflow is perfect for:
- Real estate agents and brokers looking to prioritize high-value property leads
- Mortgage lenders who need to qualify borrowers based on property assets
- Home service providers (renovators, contractors, solar installers) targeting specific property types
- Property investors seeking specific investment opportunities
- Real estate marketers who want to segment audiences by property value
- Home insurance agents qualifying leads based on property characteristics

Any business that bases lead qualification on property details will benefit from this automated qualification system.

**About BatchData**

BatchData is a comprehensive property data provider that offers detailed information about residential and commercial properties across the United States. Their API provides:
- Property valuation and estimates
- Ownership information
- Property characteristics (size, age, bedrooms, bathrooms)
- Tax assessment data
- Transaction history
- Occupancy status (owner-occupied vs. investment)
- Lot details and dimensions

By integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling more intelligent lead scoring and routing based on actual property characteristics rather than just contact information. This workflow demonstrates how to leverage BatchData's property API to transform your lead qualification process from manual research into an automated, data-driven system that ensures high-value leads receive immediate attention.
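For illustration, a Code node along these lines could implement the scoring rules described above. The field names and point thresholds are assumptions, not BatchData's schema or the template's exact weights; treat it as a starting point to tune for your market.

```javascript
// Hypothetical scoring sketch for an n8n Code node. Thresholds, weights
// and the payload shape (json.property) are illustrative assumptions.
const lead = $input.first().json;
const p = lead.property; // enriched BatchData payload (assumed shape)

let score = 0;
if (p.valuation > 750000) score += 40;      // high-value property
else if (p.valuation > 400000) score += 25;
if (p.squareFeet > 2500) score += 15;       // larger homes score higher
if (new Date().getFullYear() - p.yearBuilt < 15) score += 15; // newer build
if (p.ownerOccupied === false) score += 20; // likely investor-owned
if (p.lotSizeAcres > 0.5) score += 10;      // larger lot bonus

const category =
  score >= 70 ? 'high-value' :
  score >= 50 ? 'qualified'  :
  score >= 30 ? 'potential'  : 'unqualified';

return [{ json: { ...lead, score, category } }];
```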
by PretenderX
This template automates sending a DingTalk message on new Azure DevOps Pull Request Created events. It uses a MySQL database to store mappings between Azure users and DingTalk users, so the right users get notified.

**Set up instructions**
1. Define your own path value for the ReceiveTfsPullRequestCreatedMessage Webhook node, then copy the webhook URL and create an Azure DevOps Service Hook that calls the webhook on the Pull Request Created event.
2. To configure the LoadDingTalkAccountMap node, create a MySQL table as below:

| Name | Type | Length | Key |
|------|------|--------|-----|
| TfsAccount | varchar | 255 | |
| UserName | varchar | 255 | |
| DingTalkMobile | varchar | 255 | |

3. You can customize the DingTalk message content by editing the BuildDingTalkWebHookData node (a sketch of its output follows below).
4. Define the URL of the SendDingTalkMessageViaWebHook HTTP Request node as your DingTalk group chat robot webhook URL.
5. Send a test or production message from Azure DevOps to verify the setup.
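As a reference for customizing the message, here is a sketch of the kind of payload the BuildDingTalkWebHookData node might assemble. The msgtype/text/at structure follows DingTalk's custom-robot message format; the input field names (title, url, UserName, DingTalkMobile) are assumptions based on the PR event and the mapping table above.

```javascript
// Hypothetical Code-node sketch: build a DingTalk robot message for a
// new pull request. Input field names are assumptions; align them with
// your Azure DevOps payload and the MySQL mapping table columns.
const pr = $input.first().json;
return [{
  json: {
    msgtype: 'text',
    text: {
      content: `New Pull Request: ${pr.title}\nAuthor: ${pr.UserName}\n${pr.url}`,
    },
    at: {
      atMobiles: [pr.DingTalkMobile], // reviewer's mobile from the mapping table
      isAtAll: false,
    },
  },
}];
```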
by dataplusminus+-
🎯 **Project Purpose**

This project automates the process of collecting and managing new leads submitted through a web form. It eliminates the need for manual data entry and ensures that each lead is:
- Properly recorded and time-stamped in a structured format
- Automatically communicated to the sales or support team
- Ready for follow-up, with a reminder system in place

It's a lightweight but effective solution suitable for freelancers, small teams, and growing businesses that want to streamline their lead intake process.

🛠️ **Tools & Technologies Used**
- **Google Forms / Web Form** – Frontend for capturing leads
- **Google Sheets** – Central database for storing lead information
- **n8n** – Automation platform that connects and coordinates all services
- **Gmail** – Handles email notifications for new leads
- **Slack** (optional) – Provides instant team notifications
- **Date & Time nodes** – Tracks and manages lead response timing
- **Conditional (IF) nodes** – Filters out duplicate and incomplete entries

🔄 **Workflow Overview**

✨ **Key Features**
- ✅ No-code integration using n8n
- ✅ Instant alerts via Gmail and/or Slack
- ✅ Google Sheets as an easily accessible backend
- ✅ Modular design — easy to expand with CRM tools (like HubSpot)
- ✅ Clean JSON structure and logic, beginner-friendly

📈 **Possible Improvements**
- Add email validation via external API (e.g., NeverBounce, Hunter)
- Integrate with a CRM for deeper automation
- Add lead scoring based on answers
- Include automatic follow-up emails after X days
- Schedule weekly summary reports via email

🧑🏻‍💻 **Creator Information**
Developed by: Adem Tasin
🌐 Website: Dataplusminus+-
📧 Email: dataplusminuss@gmail.com
💼 LinkedIn: Adem Tasin
by Akhil Varma Gadiraju
📬 **Gmail to Google Drive Email Export Workflow (n8n)**

🧩 **Overview**

This n8n workflow automates the process of:
1. Retrieving all emails from a specific sender using Gmail.
2. Extracting essential fields like subject, message, and date.
3. Formatting the email date to the desired time zone (e.g., IST).
4. Exporting the parsed data as a CSV file.
5. Uploading the file to a specified folder in Google Drive.

🛠 **Nodes Breakdown**

**1. Start Workflow (Manual Trigger)**
- Type: Manual Trigger
- Purpose: Initiates the workflow manually.

**2. Gmail Node (Get All Emails)**
- Type: Gmail
- Operation: getAll
- Filters: sender: akhilgadiraju@gmail.com
- Returns: all emails from the specified sender.
- Credentials: Gmail OAuth2 - Akhil

**3. Parse Data (Set Node)**
- Purpose: Extracts key fields from the email JSON.
- Mapped fields: id (email ID), subject (email subject), message (email text), time (email date)

**4. Convert Time Field (Code Node)**
- Purpose: Converts the email time (ISO 8601) to a human-readable format.
- Output format: local time using the Asia/Kolkata timezone, formatted as "Month Day, Year, Hour:Minute AM/PM".
- Customizable: change the timezone as needed via timeZone: 'Asia/Kolkata' (see the sketch at the end of this section).

**5. Convert to File**
- Type: Convert to File node
- Purpose: Converts JSON data to a downloadable .csv file.
- Output file: CSV containing id, subject, message, and time.

**6. Google Drive**
- Type: Google Drive
- Purpose: Uploads the generated CSV file to Google Drive.
- Drive: My Drive; Folder: Root
- File name: current timestamp + _n8n_export.csv

**7. End Workflow (NoOp)**
- Purpose: Final node to explicitly end the workflow.

✅ **Use Cases**
- **Personal Email Archiving**: Back up or export emails from a specific sender (e.g., invoices, reports).
- **Audit Logs**: Save conversations for compliance.
- **Team Reports**: Aggregate project emails into a central file store.

🔧 **Customization Guide**

| Customization | How to Do It |
|---|---|
| Change sender email | Update the sender field in the Gmail node. |
| Filter by date/subject | Add filters in the Gmail node settings. |
| Change time zone | Edit timeZone in the Code node. |
| Add more email fields | Modify the Set node to include more fields. |
| Change file format | Use a different format in the Convert to File node. |
| Rename output file | Adjust the name in the Google Drive node. |
| Change upload folder | Set a different folderId in the Google Drive node. |

🚀 **Deployment Tips**
- **Schedule the workflow**: Replace the Manual Trigger with a Cron node.
- **Avoid duplicates**: Store email IDs and skip duplicates using conditional logic.
- **Security**: Use environment variables for sensitive credentials.

🧪 **Testing Steps**
1. Manually trigger the workflow.
2. Verify email data is parsed and formatted.
3. Confirm the CSV is generated correctly.
4. Ensure the file is uploaded to Google Drive.

🧰 **Requirements**
- Connected Gmail and Google Drive OAuth2 credentials.
- n8n instance (self-hosted or cloud).
- Required nodes available in the n8n environment.

> 💡 Need more features? You can add:
> - Error handling
> - Slack/Email notifications
> - Conditional filters
> - Google Sheets integration instead of Drive
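For reference, the Convert Time Field step could be implemented with a few lines like the following. This is a sketch assuming the Set node outputs the date under a `time` key; adjust the locale and timeZone options to taste.

```javascript
// Sketch of the Convert Time Field Code node. Asia/Kolkata matches the
// template default; the `time` field name is assumed from the Set node.
return $input.all().map(item => ({
  json: {
    ...item.json,
    // ISO 8601 → "Month Day, Year, Hour:Minute AM/PM" in local time
    time: new Date(item.json.time).toLocaleString('en-US', {
      timeZone: 'Asia/Kolkata',
      year: 'numeric', month: 'long', day: 'numeric',
      hour: 'numeric', minute: '2-digit', hour12: true,
    }),
  },
}));
```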
by Harshil Agrawal
This workflow appends, looks up, updates, and reads data from a Google Sheets spreadsheet.

- **Set node**: Generates the data that we want to add to Google Sheets. Depending on your use case, the data might come from a different source; for example, you might be fetching it from a webhook call. Add the node that fetches the data you want to add to the Google Sheet, then use the Set node to shape that data.
- **Google Sheets node**: Adds the data from the Set node in a new row to the Google Sheet. You will have to enter the Spreadsheet ID and the Range to specify which sheet you want to add the data to.
- **Google Sheets1 node**: Looks for a specific value in the Google Sheet and returns all the rows that contain the value. In this example, we are looking for the value Berlin. If you want to look for a different value, enter that value in the Lookup Value field, and specify the column in the Lookup Column field.
- **Set1 node**: Increases the rent by $100 for the houses in Berlin and passes this new data to the next nodes in the workflow (a Code-node equivalent is sketched below).
- **Google Sheets2 node**: Updates the rent for the houses in Berlin with the new rent set in the previous node. We are mapping the rows with their ID. Depending on your use case, you might want to map the values with a different column; to set this, enter the column name in the Key field.
- **Google Sheets3 node**: Returns the information from the Google Sheet. You can specify the columns that should get returned in the Range field. Currently, the node fetches the data for columns A to D. To fetch the data only for columns A to C, set the range to A:C.

This workflow can be broken down into different workflows, each with its own use case. For example, one workflow could append new data to a Google Sheet, and another could look up a certain value and return it. You can learn to build this workflow on the documentation page of the Google Sheets node.
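If you'd rather do the rent adjustment in a Code node than a Set node, a minimal equivalent looks like this (assuming the lookup returns rows with a numeric rent column):

```javascript
// Sketch of the Set1 step as a Code node. Assumes the lookup returned
// rows with a numeric `rent` field and an `id` column used later for
// matching in the update step.
return $input.all().map(item => ({
  json: {
    ...item.json,
    rent: Number(item.json.rent) + 100, // raise Berlin rents by $100
  },
}));
```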
by Daniel Ng
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

**Restore n8n Credentials from Google Drive Backup**

This template enables you to restore your n8n credentials from a backup file in Google Drive. It's an essential companion to a credential backup workflow, ensuring you can recover your setup in case of data loss, instance migration, or disaster recovery. The workflow intelligently checks for existing credentials to prevent accidental overwrites of credentials with the same name that are already present.

This workflow is manually triggered. We recommend you use this restore workflow in conjunction with a backup solution like our "Auto Backup Credentials to Google Drive" template.

For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

**Who is this for?**

This workflow is for n8n administrators and users who have backed up their n8n credentials to Google Drive (e.g., using a companion backup template) and need to restore them to the same or a different n8n instance. It's crucial for those managing self-hosted instances.

**What problem is this workflow solving? / Use case**

If an n8n instance becomes corrupted, needs to be migrated, or if credentials are accidentally deleted, manually re-creating all credentials can be extremely time-consuming and error-prone. This workflow automates the restoration process from a known backup, saving significant time and ensuring accuracy. It's particularly useful for:
- Disaster recovery.
- Migrating n8n instances.
- Quickly setting up a new n8n instance with existing credentials.

**What this workflow does**

The workflow is manually triggered and performs the following operations:

1. **Fetch current credentials**: An "On Click Trigger" starts the process. It executes the command `npx n8n export:credentials --all --decrypted` via the "Execute Command Get All Credentials" node to get a list of all credentials currently in your n8n instance. This list is then processed by the "JSON Formatting Data" and "Aggregate Credentials" nodes to extract just the names of existing credentials for comparison.
2. **Download backup file from Google Drive**: The "Google Drive Get Credentials File" node searches your Google Drive for the n8n_backup_credentials.json file. The "Google Drive Download File" node then downloads the found file.
3. **Process backup data**: The "Convert Files To JSON" node (an Extract From File node) converts the downloaded file content, expected to be JSON, into a usable JSON object. "Split Out" nodes then process this data to handle individual credential entries from the backup file.
4. **Loop and restore credentials**: The "Loop Over Items" node (a SplitInBatches node) iterates through each credential from the backup file.
   - **Duplicate check**: For each credential, an IF node ("Check For Skipped Credentials") checks two conditions using an OR combinator: whether the credential name from the backup (`$('Loop Over Items').item.json.name`) is empty, and whether a credential with the same name already exists in the current n8n instance (by checking against the list from the "Aggregate Credentials" node). A sketch of this check appears at the end of this section.
   - **Conditional restore**: If the credential name is NOT empty AND it does NOT already exist (i.e., the conditions in the IF node are false), the workflow proceeds to the "Restore N8n Credentials" node (an n8n API node). This node uses the name, type, and data for each new credential from the backup file to create it in the n8n instance. Credentials with empty names or those already present are skipped, as they take the true path of the IF node, which loops back.
   - A "Wait" node introduces a 1-second delay after each restoration attempt, to prevent API rate limiting before looping to the next item.

**Step-by-step setup**
1. **n8n instance environment (for the current credentials check)**: The n8n instance must have access to npx and n8n-cli for the "Execute Command Get All Credentials" node to function.
2. **Google Drive credentials**: Configure the "Google Drive Get Credentials File" and "Google Drive Download File" nodes with your Google OAuth2 credentials.
3. **n8n API credentials**: Configure the "Restore N8n Credentials" node with your n8n API credentials. This API key needs permissions to manage credentials.
4. **Backup file name**: The workflow is configured to search for a file named n8n_backup_credentials.json in the "Google Drive Get Credentials File" node. If your backup file has a different name or you want to specify a path, update the "Query String" parameter in this node.

**How to customize this workflow to your needs**
- **Backup file location/query**: Modify the "Google Drive Get Credentials File" node parameters if your backup file is in a specific folder, has a different naming convention, or if you want more specific query logic.
- **Overwrite logic**: The current workflow skips existing credentials by name. If you need to update/overwrite existing credentials, you would need to modify the logic in the "Check For Skipped Credentials" (IF) node and potentially use an "update" operation in the "n8n" API node if available for credentials (note: updates often require the credential ID, which might not be in the backup file).
- **Notifications**: Add notification steps (e.g., Email, Slack) to report on the success or failure of the restoration process, and to list which credentials were restored or skipped.
- **Selective restore**: To restore only specific credentials, you could add a filter step after "Split Out1" or modify the IF condition in "Check For Skipped Credentials" to check for particular credential names or types from the backup file.
- **Error handling**: Implement more robust error handling for API errors (e.g., from the n8n API node or Google Drive nodes), file-not-found issues, or problems during command execution.

**Important note on credential security**
- **Decrypted backup file**: This workflow assumes the n8n_backup_credentials.json file contains decrypted credential data, typically created by a companion backup workflow.
- **Execution environment**: The "Execute Command Get All Credentials" node requires npx n8n-cli access on the server running n8n.
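Conceptually, the duplicate check boils down to a couple of lines. The sketch below writes it out as a Code node for clarity; the `names` field on the Aggregate Credentials output is an assumption, so align it with how that node actually aggregates in your instance.

```javascript
// Sketch of the "Check For Skipped Credentials" logic as a Code node.
// The `names` field on the Aggregate Credentials output is assumed.
const existingNames = $('Aggregate Credentials').first().json.names ?? [];
const cred = $input.first().json;

// skip when the backup entry has no name, or the name already exists
const skip = !cred.name || existingNames.includes(cred.name);
return [{ json: { ...cred, skip } }];
```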
by Mauricio Perera
**Overview**

This workflow exposes an HTTP endpoint (webhook) that accepts a JSON definition of an n8n workflow, validates it, and—if everything is correct—dynamically creates that workflow in the n8n instance via its internal API. If any validation fails or the API call encounters an error, an explanatory message with details is returned.

**Workflow Diagram**

```
Webhook
  │
  ▼
Validate JSON ── fails validation ──► Validation Error
  │
  └─ passes ─► Validation Successful?
                 │
                 ├─ true ─► Create Workflow ──► API Successful?
                 │                                ├─ true ──► Success Response
                 │                                └─ false ─► API Error
                 └─ false ─► Validation Error
```

**Step-by-Step Details**

**1. Webhook**
- Type: Webhook (POST)
- Path: /webhook/create-workflow
- Purpose: Expose a URL to receive a JSON definition of a workflow.
- Expected input: JSON containing the main workflow fields (name, nodes, connections, settings).

**2. Validate JSON**
- Type: Code node (JavaScript)
- Validations performed:
  - Ensure that the payload exists and contains both name and nodes.
  - Verify that nodes is an array with at least one item.
  - Check that each node includes the required fields: id, name, type, position.
  - If missing, initialize connections, settings, parameters, and typeVersion.
- Output if error:

```
{ "success": false, "message": "<error description>" }
```

- Output if valid:

```
{
  "success": true,
  "apiWorkflow": {
    "name": payload.name,
    "nodes": payload.nodes,
    "connections": payload.connections,
    "settings": payload.settings
  }
}
```

**3. Validation Successful?**
- Type: IF node
- Condition: `$json.success === true`
- Branches: true → proceed to Create Workflow; false → route to Validation Error

**4. Create Workflow**
- Type: HTTP Request (POST)
- URL: http://127.0.0.1:5678/api/v1/workflows
- Authentication: Header Auth with internal credentials
- Body: the apiWorkflow object generated earlier
- Options: continueOnFail: true (to handle failures in the next IF)

**5. API Successful?**
- Type: IF node
- Condition: `$response.statusCode <= 299`
- Branches: true → proceed to Success Response; false → route to API Error

**6. Success Response**
- Type: SET node
- Output:

```json
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "{{ $json.data[0].id }}",
  "workflowName": "{{ $json.data[0].name }}",
  "createdAt": "{{ $json.data[0].createdAt }}",
  "url": "http://localhost:5678/workflow/{{ $json.data[0].id }}"
}
```

**7. API Error**
- Type: SET node
- Output:

```json
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{{ JSON.stringify($json) }}",
  "statusCode": "{{ $response.statusCode }}"
}
```

**8. Validation Error**
- Type: SET node
- Output:

```json
{ "success": false, "message": "{{ $json.message }}" }
```

**Example Webhook Request**

```bash
curl --location --request POST 'http://localhost:5678/webhook/create-workflow' \
--header 'Content-Type: application/json' \
--data-raw '{
  "name": "My Dynamic Workflow",
  "nodes": [
    {
      "id": "start-node",
      "name": "Start",
      "type": "n8n-nodes-base.manualTrigger",
      "typeVersion": 1,
      "position": [100, 100],
      "parameters": {}
    },
    {
      "id": "set-node",
      "name": "Set",
      "type": "n8n-nodes-base.set",
      "typeVersion": 1,
      "position": [300, 100],
      "parameters": {
        "values": {
          "string": [
            {
              "name": "message",
              "value": "Hello from a webhook-created workflow!"
            }
          ]
        }
      }
    }
  ],
  "connections": {
    "Start": {
      "main": [
        [
          {
            "node": "Set",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  },
  "settings": {}
}'
```

**Expected Success Response**

```json
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "abcdef1234567890",
  "workflowName": "My Dynamic Workflow",
  "createdAt": "2025-05-31T12:34:56.789Z",
  "url": "http://localhost:5678/workflow/abcdef1234567890"
}
```

**Validation Error Response**

```json
{ "success": false, "message": "The 'name' field is required in the workflow" }
```

**API Error Response**

```json
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{ ...full API response details... }",
  "statusCode": 401
}
```
by Nikan Noorafkan
🧾 **Template: Extract Ad Creatives from Google's Ads Transparency Center**

This n8n workflow pulls ad creatives from Google's Ads Transparency Center using SerpApi, filtered by a specific domain and region. It extracts, filters, categorizes, and exports ads into neatly formatted CSV files for easy analysis.

👤 **Who's it for?**
- **Marketing Analysts** researching competitive PPC strategies
- **Ad Intelligence Teams** monitoring creatives from specific brands
- **Digital Marketers** gathering visual and copy trends
- **Journalists & Watchdogs** reviewing ad activity transparency

✅ **Features**
- **Fetch creatives** using SerpApi's google_ads_transparency_center engine
- **Filter results** to include only ads with an exact match to your target domain
- **Categorize** by ad format: text, image, or video (a sketch of this step follows below)
- **Export CSVs**: generates a downloadable file for each format under the /files/ directory

🛠 **How to Use**
1. Edit the "Set Domain & Region" node: domain (e.g. example.com) and region (SerpApi numeric region code → see codes).
2. Add your SerpApi API key in the "Get Ads Page 1" node's credentials section.
3. Run the workflow: click "Test workflow" to initiate the process.
4. Download your results: navigate to /files/ to find text_{domain}_ads.csv, image_{domain}_ads.csv, and video_{domain}_ads.csv.

📌 **Notes**
- Only the first page (up to 50 creatives) is fetched; pagination is not included.
- Sticky Notes inside the workflow offer helpful internal annotations.
- CSV files include creative-level details: ad copy, images, video links, etc.
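The filter-and-categorize step amounts to a few lines in a Code node. This sketch assumes field names like `advertiser_domain` and `format` on each creative, which may differ in the actual SerpApi response; verify against a sample payload before reuse.

```javascript
// Hypothetical Code-node sketch: keep only exact-domain matches and
// split creatives by format. The advertiser_domain and format field
// names are assumptions; check the real SerpApi response keys.
const target = 'example.com'; // normally taken from the Set Domain & Region node
const buckets = { text: [], image: [], video: [] };

for (const item of $input.all()) {
  const ad = item.json;
  if (ad.advertiser_domain !== target) continue; // exact match only
  const format = ['text', 'image', 'video'].includes(ad.format) ? ad.format : 'text';
  buckets[format].push(ad);
}

// one item per format, ready to feed three CSV export branches
return Object.entries(buckets).map(([format, ads]) => ({
  json: { format, count: ads.length, ads },
}));
```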
by Jimleuk
This n8n template showcases the new HTTP tool released in version 1.47.0. Overall, the tool helps simplify AI Agent workflows where custom sub-workflows were performing the same simple HTTP requests.

**Comparisons**

**1. AI agent that can scrape webpages**
Remake of https://n8n.io/workflows/2006-ai-agent-that-can-scrape-webpages/
Changes:
- Replaces Execute Workflow Tool and Subworkflow
- Replaces Response Formatting

**2. Allow your AI to call an API to fetch data**
Remake of https://n8n.io/workflows/2094-allow-your-ai-to-call-an-api-to-fetch-data/
Changes:
- Replaces Execute Workflow Tool and Subworkflow
- Replaces Manual Query Params Definitions
- Replaces Response Formatting