by Anthony
**What this workflow does**

LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON dump of Chrome extension IDs, extracted from LinkedIn pages, and builds a clean Google Sheet listing those extensions. For each extension ID, it scrapes Google search and extracts the first search result to identify the extension.

**Setup**

- Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
- Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
- Create an n8n header auth credential for the Google SERP API

**Some context and discussion**

https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/

Follow the author and get the final Google Sheet with 1,300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
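The extraction step could look roughly like this in an n8n Code node. This is a sketch: the `organic` result field and the `serpResponse` input name are assumptions about the SERP API's payload, so adjust them to your provider's actual response shape.

```javascript
// Map each extension ID to the first organic search result.
// Field names are illustrative; match them to your SERP API's payload.
const results = [];
for (const item of $input.all()) {
  const extensionId = item.json.extensionId;
  const serp = item.json.serpResponse; // raw response from the SERP API node
  const first = (serp.organic || [])[0] || {};
  results.push({
    json: {
      extensionId,
      title: first.title || 'unknown',
      url: first.link || `https://chromewebstore.google.com/detail/${extensionId}`,
    },
  });
}
return results;
```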
by Akram Kadri
**Who is this for?**

This workflow is designed for YouTubers who want to update their video descriptions in bulk without manually editing each one. It's especially useful for creators who include a standard set of links in their descriptions and need to insert a new link between existing ones across multiple videos.

**What problem does this workflow solve?**

Manually updating video descriptions for multiple videos can be tedious and time-consuming. If you have a section in your video descriptions that contains important links, adding a new one in a specific position (e.g., between two existing links) can be a challenge. This workflow automates that process, allowing you to insert a specific string between two predefined rows in all of your video descriptions at once.

**What this workflow does**

1. Fetches all videos from your YouTube channel.
2. Iterates through each video to retrieve its existing description.
3. Identifies two predefined rows in the description.
4. Inserts a new row between the two specified rows (see the sketch after this section).
5. Updates the video description with the modified text.

**Setup**

1. Connect your YouTube account to n8n and grant the necessary permissions.
2. Define your variables in the "Set String to Insert" node:
   - **rowBefore**: The existing row after which the new row will be inserted.
   - **rowToInsert**: The new text or link to insert.
   - **rowAfter**: The existing row before which the new row will be inserted.
3. Run the workflow using the manual trigger.
4. Review the updated descriptions to ensure accuracy.

**How to customize this workflow to your needs**

- **Change the insertion criteria** by modifying the rowBefore and rowAfter values.
- **Insert multiple rows** by adjusting the JavaScript code in the Code node.
- **Extend the workflow** by adding conditions (e.g., only updating descriptions of videos with certain tags).
- **Filter specific videos** instead of updating all by modifying the "Get All Videos" node.

This workflow ensures that all your YouTube descriptions stay updated and consistent with minimal effort.
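The insertion logic in the Code node could look roughly like this. A minimal sketch, assuming the description arrives as `description` and the three variables come from the "Set String to Insert" node; it matches rows exactly, so trim whitespace if your descriptions vary.

```javascript
// Insert rowToInsert between rowBefore and rowAfter in each description.
const { rowBefore, rowToInsert, rowAfter } = $('Set String to Insert').first().json;

return $input.all().map((item) => {
  const lines = (item.json.description || '').split('\n');
  const beforeIdx = lines.indexOf(rowBefore);

  // Only insert when rowBefore is found, rowAfter follows it directly,
  // and the new row is not already present (keeps reruns idempotent).
  if (beforeIdx !== -1 && lines[beforeIdx + 1] === rowAfter && !lines.includes(rowToInsert)) {
    lines.splice(beforeIdx + 1, 0, rowToInsert);
  }

  return { json: { ...item.json, description: lines.join('\n') } };
});
```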
by Kevin
**Monitor Postgres Data Freshness and Email Alert If Stale**

This template monitors a set of tables inside a Postgres database to ensure they're getting updated. If a table hasn't been updated in 3 days (configurable), an email alert is sent containing the tables that are stale.

**Requirements**

- You must have a Postgres database containing one or more tables that you'd like to monitor.
- Each table to monitor must have a date or timestamp column that tracks when data was pushed. For example, this might be:
  - A timestamp column if your table holds event/timeseries data
  - A last_updated column if your rows are expected to be modified

**Usage**

1. Use this template.
2. Add your Postgres and email credentials.
3. Adjust the Produce tables + date columns node to produce pairs of [table, date_column] that should be monitored for freshness. 💁‍♂️ Note that a timestamp column also works.
4. (Optional) Adjust the Remove fresh tables node for your desired staleness window (default is 3 days, but you can adjust as you please).
5. (Optional) Customize the Send alerts node to call whichever alerting workflow you please (I recommend my alerting workflow for easiest plug-and-play).

**How it works**

This template works by:

1. Pulling the most recent row for each table (see the sketch after this section)
2. Calculating how out-of-date each table is, in days
3. Dropping fresh tables that have been updated within the past 3 days
4. Sending an email alert with the stale tables that haven't been updated within the past 3 days
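The staleness math is simple; here is a sketch of it as an n8n Code node. The `[table, date_column]` pairs and the input field names are illustrative, and each pair would be backed by a query like `SELECT MAX(created_at) AS last_update FROM events;` issued by the Postgres node.

```javascript
// Compute days-stale per monitored table and keep only stale ones.
const monitored = [
  ['events', 'created_at'],     // illustrative table/column pairs
  ['users', 'last_updated'],
];

const staleDays = 3; // the configurable staleness window
const stale = [];
for (const [table, column] of monitored) {
  // `lastUpdate` would come from the Postgres node's MAX(date_column) result.
  const lastUpdate = new Date($json[`${table}_last_update`]);
  const daysStale = (Date.now() - lastUpdate.getTime()) / 86_400_000;
  if (daysStale > staleDays) {
    stale.push({ table, column, daysStale: Number(daysStale.toFixed(1)) });
  }
}
return stale.map((s) => ({ json: s }));
```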
by Prakash
**Who is this for?**

This workflow is ideal for:

- **Developers** who want to stay updated on issues without constantly checking GitHub.
- **Managers** tracking issue progress in a Telegram group.
- **DevOps teams** that need automated notification alerts for new or updated issues.

**What problem does this workflow solve?**

Keeping track of GitHub issues manually can be tedious. Teams often miss critical updates because notifications are buried in emails or dashboards. This workflow automates the process by fetching new or open GitHub issues and instantly sending notifications to a specified Telegram chat.

**What this workflow does**

This workflow connects GitHub and Telegram to provide real-time issue notifications:

1. **Fetch GitHub Issues** – Retrieves new or open issues from a selected GitHub repository.
2. **Format the Issue Details** – Extracts key information like issue title, number, status, and URL (see the sketch after this section).
3. **Send to Telegram** – Posts the formatted issue details to a Telegram group or private chat.

**Setup Guide**

*Prerequisites*

Before setting up the workflow, ensure you have:

- **GitHub Personal Access Token**: Required to fetch issue details. Generate it under Developer Settings with repo or public_repo permissions.
- **Telegram Bot Token**: Create a bot via BotFather on Telegram and obtain the token.
- **Telegram Chat ID**: Find the chat ID where the bot should send messages.

*Step-by-Step Setup*

1. **Set Up GitHub Node**: Authenticate using your GitHub token, choose the repository you want to track, and configure filters (e.g., fetch only open issues).
2. **Format Issue Details**: Extract key details like title, issue number, assignee, and status, and customize the message structure for better readability.
3. **Send Message to Telegram**: Add the Telegram node and enter your bot token. Use the Chat ID to define the recipient, and format the message to include issue details and links.
4. **Schedule the Workflow (Optional)**: Use the Cron node to run this workflow periodically (e.g., every hour).

**How to Customize This Workflow**

- **Filter Issues by Labels**: Modify the GitHub node to fetch only issues with specific labels.
- **Include Additional Fields**: Add issue comments, priority, or assignee details in the message.
- **Send Alerts Based on Priority**: Use conditional logic to send high-priority issues to a different chat.
- **Trigger on Issue Events**: Instead of fetching periodically, use GitHub webhooks (if permitted in the repo) to trigger the workflow on issue creation or updates.

**Why Use This Workflow?**

- **Automates GitHub issue tracking** without manually checking repositories.
- **Instant notifications in Telegram** ensure quick response times.
- **Fully customizable** to fit different team workflows.
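A sketch of the "Format Issue Details" step as a Code node. The field names (`title`, `number`, `state`, `html_url`, `assignee`) follow the GitHub REST issues payload; the message layout itself is just one possibility.

```javascript
// Build one Telegram-ready text message per GitHub issue.
return $input.all().map((item) => {
  const issue = item.json;
  const assignee = issue.assignee ? issue.assignee.login : 'unassigned';
  const text = [
    `🐛 #${issue.number}: ${issue.title}`,
    `Status: ${issue.state} | Assignee: ${assignee}`,
    issue.html_url,
  ].join('\n');
  return { json: { text } };
});
```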
by n8n custom workflows
**Introduction**

The Namesilo Bulk Domain Availability workflow is a powerful automation solution designed to check the registration status of multiple domains simultaneously using the Namesilo API. This workflow efficiently processes large lists of domains by splitting them into manageable batches, adhering to API rate limits, and compiling the results into a convenient Excel spreadsheet. It eliminates the tedious process of manually checking domains one by one, saving significant time for domain investors, web developers, and digital marketers.

The workflow is particularly valuable during brainstorming sessions for new projects, when conducting domain portfolio audits, or when preparing domain acquisition strategies. By automating the domain availability check process, users can quickly identify available domains for registration without the hassle of navigating through multiple web interfaces.

**Who is this for?**

This workflow is ideal for:

- Domain investors and flippers who need to check multiple domains quickly
- Web developers and agencies evaluating domain options for client projects
- Digital marketers researching domain availability for campaigns
- Business owners exploring domain options for new ventures
- IT professionals managing domain portfolios

Users should have basic familiarity with n8n workflow concepts and a Namesilo account to obtain an API key. No coding knowledge is required, though an understanding of domain name systems would be beneficial.

**What problem is this workflow solving?**

Checking domain availability one by one is a time-consuming and tedious process, especially when dealing with dozens or hundreds of potential domains. This workflow solves several key challenges:

1. **Manual Inefficiency**: Eliminates the need to individually search for each domain through registrar websites.
2. **Rate Limiting**: Handles API rate limits automatically with built-in waiting periods.
3. **Data Organization**: Compiles availability results into a structured Excel file rather than scattered notes or multiple browser tabs.
4. **Bulk Processing**: Processes up to 200 domains per batch, with the ability to handle unlimited domains across multiple batches.
5. **Time Management**: Frees up valuable time that would otherwise be spent on repetitive manual checks.

**What this workflow does**

*Overview*

The workflow takes a list of domains, processes them in batches of up to 200 domains per request (to comply with API limitations), checks their availability using the Namesilo API, and compiles the results into an Excel spreadsheet showing which domains are available for registration and which are already taken.

*Process*

1. **Input Setup**: The workflow begins with a manual trigger and uses the "Set Data" node to collect the list of domains to check and your Namesilo API key.
2. **Domain Processing**: The "Convert & Split Domains" node transforms the input list into batches of up to 200 domains to comply with API limitations (see the sketch after this section).
3. **Batch Processing**: The workflow loops through each batch of domains.
4. **API Integration**: For each batch, the "Namesilo Requests" node sends a request to the Namesilo API to check domain availability.
5. **Data Parsing**: The "Parse Data" node processes the API response, extracting information about which domains are available and which are taken.
6. **Rate Limit Management**: A 5-minute wait period is enforced between batches to respect Namesilo's API rate limits.
7. **Data Compilation**: The "Merge Results" node combines all the availability data.
8. **Output Generation**: Finally, the "Convert to Excel" node creates an Excel file with two columns: Domain and Availability (showing "Available" or "Unavailable" for each domain).

**Setup**

1. **Import the workflow**: Download the workflow JSON file and import it into your n8n instance.
2. **Get a Namesilo API key**: Create a free account at Namesilo and obtain your API key from https://www.namesilo.com/account/api-manager
3. **Configure the workflow**: Open the "Set Data" node, enter your Namesilo API key in the "Namesilo API Key" field, and enter your list of domains (one per line) in the "Domains" field.
4. **Save and activate**: Save the workflow and run it using the manual trigger.

**How to customize this workflow to your needs**

- **Modify domain input format**: Adjust the code in the "Convert & Split Domains" node if your domain list comes in a different format.
- **Change batch size**: If needed, modify the batch size (currently set to 200) in the "Convert & Split Domains" node to accommodate different API limitations.
- **Adjust wait time**: If you have a premium API account with different rate limits, modify the wait time in the "Wait" node.
- **Enhance output format**: Customize the "Convert to Excel" node to add additional columns or formatting to the output file.
- **Add domain filtering**: Add a node before the API request to filter domains based on specific criteria (length, keywords, TLDs).
- **Integrate with other services**: Connect this workflow to domain registrars to automatically register available domains that meet your criteria.
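The batching logic could look roughly like this in a Code node. A sketch, assuming the domain list arrives as a newline-separated `Domains` field from the "Set Data" node; the output field name is illustrative.

```javascript
// Turn a newline-separated domain list into comma-separated batches of 200.
const raw = $json.Domains || '';
const domains = raw
  .split('\n')
  .map((d) => d.trim().toLowerCase())
  .filter(Boolean);

const batchSize = 200; // Namesilo accepts up to 200 domains per request
const batches = [];
for (let i = 0; i < domains.length; i += batchSize) {
  // One item per batch; the HTTP node can pass this straight to the API
  batches.push({ json: { domainList: domains.slice(i, i + batchSize).join(',') } });
}
return batches;
```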
by Hubschrauber
A single workflow with two flows/paths that combine to handle the backup sequence for Zigbee device configuration from Home Assistant / zigbee2mqtt. This provides a way to automate a periodic capture of the Zigbee coordinator and device pairings to speed up the recovery process when/if the Home Assistant instance needs to be rebuilt. Setting up similar automation without n8n (e.g., shell scripts and system timers) is considerably more challenging. n8n makes it easy, and this template should remove any other excuse not to do it.

**Flow 1**

- Triggered by Cron/Timer, set to whatever interval you want between backups
- Sends an MQTT message to request a zigbee2mqtt backup (delivered via a separate message)

**Flow 2**

- Triggered by the zigbee2mqtt backup message
- Extracts the zip file from the message and stores it somewhere, with a date-stamp in the filename, via SFTP

**Setup**

- Create an MQTT connection named "MQTT Account" with the appropriate protocol (mqtt), host, port (1883), username, and password.
- Create an SFTP connection named "SFTP Zigbee Backups" with the appropriate host, port (22), username, and password or key.

**Reference**

This article describes the MQTT parts.
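For orientation, the exchange uses zigbee2mqtt's bridge topics: Flow 1 publishes a request to `zigbee2mqtt/bridge/request/backup`, and Flow 2's MQTT trigger listens on `zigbee2mqtt/bridge/response/backup`, whose payload carries the backup as a base64 zip. The decoding step below is a sketch for an n8n Code node; the `data.zip` field follows zigbee2mqtt's documented response shape, but verify against your version.

```javascript
// Decode the zigbee2mqtt backup response into a binary zip for SFTP upload.
const payload = JSON.parse($json.message); // raw MQTT message from the trigger
const stamp = new Date().toISOString().slice(0, 10); // e.g. 2024-05-01
const fileName = `zigbee2mqtt-backup-${stamp}.zip`;

return [{
  json: { fileName },
  binary: {
    // Binary property consumed by the SFTP upload node
    data: {
      data: payload.data.zip, // already base64-encoded by zigbee2mqtt
      fileName,
      mimeType: 'application/zip',
    },
  },
}];
```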
by Hiroshi
**What this workflow does**

This workflow demonstrates how to send a message in Lark using a Lark bot. It begins with a manual trigger and then retrieves the necessary Lark token via a POST request. The token is used to authenticate and send a message to a specific chat using the Lark API. The Input node provides the required app_id, app_secret, chat_id, and message content. After obtaining the token, the message is sent with the Lark API's message/v4/send/ endpoint.

**Who this is for**

This n8n workflow is ideal for organizations, teams, and developers who need to automate message sending within Lark, especially those managing notifications, alerts, or team reminders. It can help users reduce manual messaging tasks by leveraging a Lark bot to deliver messages at specific intervals or based on particular conditions, enhancing team communication and responsiveness.

**Setup**

- Fill the Input node with your values.
- Exchange the bearer token in the Send Message node with your token.

Author: Hiroshi
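Roughly, the two HTTP calls look like this. A sketch using plain fetch calls rather than the template's HTTP Request nodes; the endpoints follow Lark's open platform docs, but double-check them for your tenant (the larksuite.com vs. feishu.cn host differs by region).

```javascript
const base = 'https://open.larksuite.com/open-apis';

// 1. Exchange app_id / app_secret for a tenant access token
const tokenRes = await fetch(`${base}/auth/v3/tenant_access_token/internal`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ app_id: $json.app_id, app_secret: $json.app_secret }),
});
const { tenant_access_token } = await tokenRes.json();

// 2. Send a text message to the chat via message/v4/send/
await fetch(`${base}/message/v4/send/`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${tenant_access_token}`,
  },
  body: JSON.stringify({
    chat_id: $json.chat_id,
    msg_type: 'text',
    content: { text: $json.message },
  }),
});
return [{ json: { sent: true } }];
```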
by phil
This workflow automates the backup of your n8n workflow data to Google Drive every day. It ensures that important configurations and execution logs are securely stored, reducing the risk of data loss and improving workflow resilience.

🔹 **Why Use This?**

✅ Automates routine backups effortlessly.
✅ Reduces manual intervention and potential data loss.
✅ Securely stores critical workflow configurations in Google Drive.

With this workflow, you can focus on innovation while n8n takes care of your backups. 🔐✨

🚀 **How It Works**

This workflow operates seamlessly with a combination of scheduled triggers, JSON data transformation, and secure cloud storage.

🛠 **Setup Steps**

1. Trigger the backup – Choose between manual execution or automated scheduling at 1:30 AM daily.
2. Data preparation – Your workflow parameters define the backup location and organize files effectively.
3. Transformation & Encoding – The data is processed and converted into a JSON file in base64 format (see the sketch after this section).
4. Cloud Storage – The backup is securely uploaded to your designated Google Drive folder.

🔧 **Customization Options**

You can modify various aspects of the backup workflow to better suit your needs:

1️⃣ **Adjusting Backup Frequency**
By default, the workflow runs daily at 1:30 AM. To change this:
- Open the Trigger Node in n8n.
- Modify the Cron Expression or select a different frequency (e.g., hourly, weekly, or custom intervals).

2️⃣ **Selecting Specific Workflows to Backup**
Instead of backing up all workflows, you can filter which ones to include:
- Add a Filter Node before exporting data.
- Define specific workflow IDs or names to include in the backup.

3️⃣ **Changing the Backup Destination**
The default destination is Google Drive, but you can change this:
- Replace the Google Drive Node with a different storage provider (e.g., Dropbox, AWS S3, or local storage via FTP/SFTP).
- Configure authentication for the new destination.

4️⃣ **Modifying Data Format**
By default, the workflow stores data in JSON format. If you need a different format:
- Convert JSON to CSV using the Spreadsheet File Node.
- Store backups in a compressed format (ZIP) by adding a Compression Node.

5️⃣ **Encrypting the Backup for Extra Security**
For added protection:
- Use the Crypto Node to encrypt the JSON file before uploading.
- Set up an Access-Controlled Folder in Google Drive with limited permissions.

✅ **Verify That Your Backup Works**

Before relying on this workflow for your automated backups, make sure it works correctly by performing a quick test:

1. Manually trigger the workflow in n8n and check if the backup file appears in your Google Drive.
2. Open Google Drive, navigate to the backup folder, and download the JSON file.
3. Verify its content by checking if the data matches your workflow's execution logs.
4. Try to import the JSON file back into n8n using the "Import File" function to ensure the workflow structure is intact.
5. Alternatively, copy and paste a test file into Google Drive and confirm that it appears correctly in your workflow logs.

This quick test will confirm that your backup is running smoothly and that your data is retrievable whenever needed.

📁 **How to Find Your Google Drive Directory ID**

To ensure that the backup is uploaded to the correct folder, you need to retrieve your Google Drive Directory ID. Follow these simple steps:

1. Open Google Drive.
2. Navigate to the folder where you want to store your backups.
3. Click on the folder and check the URL in your browser.
4. The Directory ID is the long string of characters at the end of the URL after /folders/.

📌 Example: If your folder URL is
https://drive.google.com/drive/folders/14oUlH_LW_NT0Xb2woZWvuzRncV-bhla
then your Directory ID is
14oUlH_LW_NT0Xb2woZWvuzRncV-bhla

Copy this Directory ID and use it in the workflow's parameters to ensure the backup is saved in the correct location.

Phil | Inforeole
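The transformation and encoding step could look roughly like this in a Code node. A sketch, assuming the exported workflow data arrives as JSON items; n8n's `this.helpers.prepareBinaryData` is an alternative way to build the same binary property.

```javascript
// Wrap the exported workflow JSON as a base64 binary file for Google Drive.
const stamp = new Date().toISOString().slice(0, 10);
const fileName = `n8n-backup-${stamp}.json`;
const content = JSON.stringify($input.all().map((i) => i.json), null, 2);

return [{
  json: { fileName },
  binary: {
    data: {
      data: Buffer.from(content, 'utf-8').toString('base64'),
      fileName,
      mimeType: 'application/json',
    },
  },
}];
```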
by Hunyao
**What it does**

Captures token usage and cost from your AI Agent/LLM. Logs model, tokens, cost, tool use, and conversation I/O to Google Sheets for simple observability and billing.

**Perfect for**

- Developers adding usage monitoring to AI agents.
- Teams needing cost transparency in prototypes.

**How it works**

1. A Chat Trigger collects user input for the AI Agent.
2. A Set node injects metadata like workflow, execution, and client IDs.
3. A LangChain Code node returns a configured Chat model with a callback that reads usage metadata.
4. The callback computes input, output, and total costs based on per-million token prices you define (see the sketch after this section).
5. It appends token metrics to a Google Sheet via the Google Sheets Tool.
6. The Agent records intermediate tool calls.
7. An If node checks whether a tool was used.
8. When tools are used, the workflow logs input, output, tool name, and metadata to an Observability sheet.

**How to use**

SELF-HOSTED N8N ONLY - the LangChain Code node is only available in the self-hosted version of n8n. It is not available in n8n cloud.

**Requirements**

- Self-hosted version of n8n

If you have any questions about running the workflow, see the attached video: https://youtu.be/JSulRS128MA
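For illustration, the cost math inside the callback might look like this. The per-million prices shown are placeholders you would replace with your model's actual rates, and the `input_tokens` / `output_tokens` field names assume usage metadata in that shape.

```javascript
// Compute input/output/total cost from token counts and per-million prices.
const PRICES_PER_MILLION = { input: 3.0, output: 15.0 }; // USD, example rates

function computeCost(usage) {
  const inputCost = (usage.input_tokens / 1_000_000) * PRICES_PER_MILLION.input;
  const outputCost = (usage.output_tokens / 1_000_000) * PRICES_PER_MILLION.output;
  return {
    inputTokens: usage.input_tokens,
    outputTokens: usage.output_tokens,
    inputCost,
    outputCost,
    totalCost: inputCost + outputCost,
  };
}

// e.g. computeCost({ input_tokens: 1200, output_tokens: 350 })
//   -> totalCost = 0.0036 + 0.00525 = 0.00885 USD
```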
by n8n Team
This workflow creates an Asana task when a new ticket is created in Zendesk. Subsequent comments on the ticket in Zendesk are added as comments to the task in Asana.

**Prerequisites**

- Zendesk account and Zendesk credentials.
- Asana account and Asana credentials.
- Asana workspace to create tasks in.

**How it works**

1. The workflow listens for new tickets in Zendesk.
2. When a new ticket is created, the workflow creates a new task in Asana.
3. The Asana GID is then saved in one of the ticket's fields (in setup we call this "Asana GID").
4. The next time a comment is added to the ticket, the workflow retrieves the Asana GID from the ticket's field and adds the comment to the task in Asana.

**Setup**

This workflow requires that you set up a webhook in Zendesk. To do so, follow the steps below:

1. In the workflow, open the On new Zendesk ticket node and copy the webhook URL.
2. In Zendesk, navigate to Admin Center > Apps and integrations > Webhooks > Actions > Create Webhook.
3. Add all the required details, which can be retrieved from the On new Zendesk ticket node. The webhook URL gets added to the "Endpoint URL" field, and the "Request method" should match what is shown in n8n.
4. Save the webhook.
5. In Zendesk, navigate to Admin Center > Objects and rules > Business rules > Triggers > Add trigger.
6. Give the trigger a name such as "New tickets".
7. Under "Conditions" in "Meet ALL of the following conditions", add "Status is New".
8. Under "Actions", select "Notify active webhook" and select the webhook you created previously.
9. In the JSON body, add the following:

```json
{
  "id": "{{ticket.id}}",
  "comment": "{{ticket.latest_comment_html}}"
}
```

10. Save the Zendesk trigger.

You will also need to set up a field in Zendesk to store the Asana GID. To do so, follow the steps below:

1. In Zendesk, navigate to Admin Center > Objects and rules > Tickets > Fields > Add field.
2. Use the number field option and give the field a name such as "Asana GID".
3. Save the field.
4. In n8n, open the Update ticket node and select the field you created in Zendesk.
by ist00dent
This n8n template enables you to instantly generate high-quality screenshots of any specified public URL by simply sending a webhook request. It's an indispensable tool for developers, content creators, marketers, or anyone needing on-demand visual captures of web pages without manual intervention, all while including crucial security measures.

🔧 **How it works**

1. **Receive URL Webhook**: This node acts as the entry point for the workflow. It listens for incoming POST requests and expects a JSON body containing a url property with the website you want to screenshot. You can trigger it from any application or service capable of sending an HTTP POST request.
2. **Validate URL for SSRF**: This is a crucial security step. This Function node validates the incoming url to prevent Server-Side Request Forgery (SSRF) vulnerabilities. It checks for valid http:// or https:// protocols and, more importantly, ensures the URL does not attempt to access internal/private IP addresses or localhost. If the URL is deemed unsafe or invalid, it flags it for an error response. (A sketch of this check appears at the end of this template description.)
3. **IF URL Valid**: This IF node checks the isValidUrl flag set by the previous validation step. If the URL is valid (true), the workflow proceeds to take the screenshot. If the URL is invalid or flagged for security (false), the workflow branches to Respond with Validation Error.
4. **Take Screenshot**: This node sends an HTTP GET request to the ScreenshotMachine API to capture an image of the validated URL. Remember to replace YOUR_API_KEY in the URL field of this node with your actual API key from ScreenshotMachine.
5. **Respond with Screenshot Data**: This node sends the data received directly from the Take Screenshot node back to the original caller of the webhook. This response typically includes information about the generated screenshot, such as the URL to the image file, success status, and other metadata from the ScreenshotMachine API.
6. **Respond with Validation Error**: If the IF URL Valid node determines the URL is unsafe or invalid, this node sends a descriptive error message back to the webhook caller, explaining why the request was denied due to security concerns or an invalid format.

🔒 **Security Considerations**

This template includes a dedicated Validate URL for SSRF node to mitigate Server-Side Request Forgery (SSRF) vulnerabilities. SSRF attacks occur when an attacker can trick a server-side application into making requests to an unintended location. Without validation, an attacker could potentially use your n8n workflow to scan internal networks, access sensitive internal resources, or attack other services from your n8n server.

The validation checks for:

- Only http:// or https:// protocols.
- Prevention of localhost or common private IP ranges (e.g., 10.x.x.x, 172.16.x.x - 172.31.x.x, 192.168.x.x).

While this validation adds a significant layer of security, always ensure your n8n instance is properly secured and updated.

👤 **Who is it for?**

This workflow is ideal for:

- **Developers**: Automate screenshot generation for testing, monitoring, or integrating visual content into applications.
- **Content Creators**: Quickly grab visuals for articles, presentations, or social media posts.
- **Marketing Teams**: Create dynamic visual assets for campaigns, ads, or competitive analysis.
- **Automation Enthusiasts**: Integrate powerful screenshot capabilities into existing automated workflows.
- **Website Owners**: Monitor how your website appears across different tools or over time.

📑 **Prerequisites**

To use this template, you will need:

- An n8n instance (cloud or self-hosted).
- An API Key from ScreenshotMachine. You can obtain one by signing up on their website: https://www.screenshotmachine.com/

📑 **Data Structure**

When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{ "url": "https://www.example.com" }
```

If the URL is valid, the workflow will return the JSON response directly from the ScreenshotMachine API. This response typically includes information about the generated screenshot, such as the URL to the image file, success status, and other metadata:

```json
{
  "status": "success",
  "hash": "...",
  "url": "https://www.screenshotmachine.com/...",
  "size": 12345,
  "mimetype": "image/jpeg"
}
```

If the URL is invalid or blocked by the security validation, the workflow will return an error response similar to this:

```json
{
  "status": "error",
  "message": "Access to private IP addresses is not allowed for security reasons."
}
```

⚙️ **Setup Instructions**

1. **Import Workflow**: In your n8n editor, click "File" > "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path**: Double-click the Receive URL Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /website-screenshot).
3. **Add ScreenshotMachine API Key**: Double-click the Take Screenshot node. In the 'URL' parameter, locate YOUR_API_KEY and replace it with your actual API key obtained from ScreenshotMachine. Example URL structure: http://api.screenshotmachine.com/?key=YOUR_API_KEY&url={{ $json.validatedUrl }}
4. **Activate Workflow**: Save and activate the workflow.

📝 **Tips**

- **Processing Screenshots**: You're not limited to just responding with the screenshot data! You can insert additional nodes after the Take Screenshot node (and before the Respond with Screenshot Data node) to further process or utilize the generated image. Common extensions include:
  - **Saving to Cloud Storage**: Use nodes for Amazon S3, Google Drive, or Dropbox to store the screenshots automatically, creating an archive.
  - **Sending via Email**: Attach the screenshot to an email notification using an Email or Gmail node for automated alerts or reports.
  - **Posting to Chat Platforms**: Share the screenshot directly in a Slack, Discord, or Microsoft Teams channel for team collaboration or visual notifications.
  - **Image Optimization**: Use an image processing node (if available via an API or a custom function) to resize, crop, or compress the screenshot before saving or sending.
- **Custom Screenshot Parameters**: The ScreenshotMachine API supports various optional parameters (e.g., width, height, quality, delay, fullpage).
  - Upgrade: Extend the Receive URL Webhook to accept these parameters in the incoming JSON body (e.g., {"url": "...", "width": 1024, "fullpage": true}).
  - Leverage: Dynamically pass these parameters to the Take Screenshot HTTP Request node's URL to customize your screenshots for different use cases.
- **Scheduled Monitoring**:
  - Upgrade: Combine this workflow with a Cron or Schedule node. Set it to run periodically (e.g., daily, hourly).
  - Leverage: Automatically monitor your website or competitors' sites for visual changes. You could then save screenshots to cloud storage and even trigger a comparison tool if a change is detected.
- **Automated Visual Regression Testing**:
  - Upgrade: After taking a screenshot, store it with a unique identifier. In subsequent runs, take a new screenshot, then use an external image comparison API or a custom function to compare the new screenshot with a baseline.
  - Leverage: Get automated alerts if visual elements on your website change unexpectedly, which is critical for quality assurance.
- **Dynamic Image Generation for Social Media/Marketing**:
  - Upgrade: Feed URLs (e.g., for new blog posts, product pages) into this workflow. After generating the screenshot, use it to create dynamic social media images or marketing assets.
  - Leverage: Streamline the creation of engaging visual content, saving design time.
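A minimal sketch of what the Validate URL for SSRF step might look like as a Function/Code node, using the standard URL class and the private-range checks listed above. The template's actual node may differ, and a production-grade check would also resolve the hostname first, since a DNS name can point at a private address.

```javascript
// Flag (rather than throw) so the IF node downstream can branch on isValidUrl.
const input = $json.body?.url || $json.url || '';
let isValidUrl = false;
let validationMessage = '';

try {
  const parsed = new URL(input);
  const host = parsed.hostname.toLowerCase();

  const isPrivate =
    host === 'localhost' ||
    host === '127.0.0.1' ||
    /^10\.\d+\.\d+\.\d+$/.test(host) ||
    /^192\.168\.\d+\.\d+$/.test(host) ||
    /^172\.(1[6-9]|2\d|3[01])\.\d+\.\d+$/.test(host);

  if (!['http:', 'https:'].includes(parsed.protocol)) {
    validationMessage = 'Only http:// and https:// URLs are allowed.';
  } else if (isPrivate) {
    validationMessage = 'Access to private IP addresses is not allowed for security reasons.';
  } else {
    isValidUrl = true;
  }
} catch (e) {
  validationMessage = 'Invalid URL format.';
}

return [{ json: { isValidUrl, validatedUrl: input, validationMessage } }];
```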
by Corentin Ribeyre
This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template.

**How it works**

This workflow can be divided into five steps:

1. The workflow initiates with a manual trigger (On clicking 'execute').
2. It reads your Google Sheet file.
3. It converts your file to an array.
4. It connects to your Icypeas account.
5. It performs an HTTP request to verify the emails.

**Set up steps**

- You will need a formatted Google Sheet file with email addresses.
- You will need a working Icypeas account to run the workflow and to get your API key, API secret, and user ID.
- You will need email addresses to verify.
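For a rough idea of the verification call, here is a hypothetical sketch for a Code node. The endpoint path, signing scheme, and header names are assumptions about Icypeas' HTTP API, so verify them against the current Icypeas docs; the crypto built-in must also be allowed in your instance (NODE_FUNCTION_ALLOW_BUILTIN=crypto).

```javascript
const crypto = require('crypto');

const API_KEY = 'your-api-key';      // from your Icypeas account
const API_SECRET = 'your-api-secret';
const path = '/api/email-verification'; // assumed endpoint path
const timestamp = new Date().toISOString();

// Assumed scheme: sign METHOD + PATH + TIMESTAMP with the API secret
const signature = crypto
  .createHmac('sha1', API_SECRET)
  .update(`post${path}${timestamp}`.toLowerCase())
  .digest('hex');

const results = [];
for (const item of $input.all()) {
  const res = await this.helpers.httpRequest({
    method: 'POST',
    url: `https://app.icypeas.com${path}`,
    headers: {
      Authorization: `${API_KEY}:${signature}`,
      'X-ROCK-TIMESTAMP': timestamp, // assumed header name
    },
    body: { email: item.json.email },
    json: true,
  });
  results.push({ json: res });
}
return results;
```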