by Anthony
## What this workflow does

LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON list of Chrome extension IDs, extracted from LinkedIn pages, and builds a clean Google Sheet listing those extensions. For each extension ID, it scrapes Google search results and extracts the first result to identify the extension (a sketch of this extraction step appears below).

## Setup

1. Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
2. Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
3. Create an n8n Header Auth credential for the Google SERP API

## Some context and discussion

https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/

Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
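For reference, here is a minimal n8n Code node sketch of the first-result extraction. It assumes the SERP API returns an `organic` array of results with `title` and `link` fields, and that the previous node passed along an `extensionId`; the actual property names depend on the provider, so treat these as placeholders.

```javascript
// Hedged sketch: pick the first organic result from a SERP API response.
// `organic`, `title`, `link`, and `extensionId` are assumed field names.
const results = [];

for (const item of $input.all()) {
  const organic = item.json.organic ?? [];
  const first = organic[0];

  results.push({
    json: {
      extensionId: item.json.extensionId, // assumed to be carried over from the previous node
      title: first ? first.title : 'No result found',
      link: first ? first.link : null,
    },
  });
}

return results;
```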
by Jaruphat J.
## Who is this for?

This workflow is ideal for businesses, accountants, and finance teams who receive bank slip images via LINE and want to automate the extraction of transaction details. It eliminates manual data entry and speeds up financial tracking.

## What problem does this workflow solve?

Many businesses receive bank transfer slips via LINE from customers, but manually recording transaction details in spreadsheets is time-consuming and error-prone. This workflow automates the entire process, extracting structured data from the bank slips and storing it in Google Sheets for seamless record-keeping.

## What this workflow does

- Receives bank slip images from a LINE BOT
- Extracts transaction details (sender, receiver, amount, transaction ID) using SpaceOCR
- Automatically logs extracted data into Google Sheets
- Works with standard bank slips and PromptPay transactions
- Eliminates manual data entry and reduces errors

## Setup Instructions

### 1. Prerequisites

- A LINE BOT with Messaging API enabled
- A SpaceOCR API key (get one from https://spaceocr.com/)
- A Google Sheets account to store extracted data
- A running n8n instance (Cloud or Self-hosted)

### 2. Set up Google Sheets

Create a Google Sheet with the following structure:

| Column | Field |
| --- | --- |
| A | Date |
| B | Time |
| C | Sender |
| D | Receiver |
| E | Bank Name |
| F | Amount |
| G | Transaction ID |

Ensure your Google Sheets API is enabled and connected to n8n. For an example of the required format, check this Google Sheets template: Google Sheets Template

### 3. Configure the n8n Workflow

#### 1. Webhook Node (receives the bank slip from the LINE BOT)

- Set the method
- Set the path

#### 2. HTTP Request Node (downloads the image from the LINE message)

- Retrieves the image URL from the LINE message payload

#### 3. SpaceOCR Node (extracts text from the bank slip)

- Input: the downloaded slip image
- API key: your SpaceOCR API key

#### 4. Google Sheets Node (saves transaction data)

- Select your Google Sheet
- Map the extracted data (sender, receiver, amount, etc.) to the respective columns (a sketch of this mapping step appears below)

### 4. Deploy & Test

1. Activate the workflow in n8n
2. Set the Webhook URL in the LINE Developers Console
3. Send a test bank slip image to the LINE BOT
4. Check Google Sheets for the extracted transaction data
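To make the mapping step concrete, here is a rough n8n Code node sketch. It assumes an OCR.space-style response (`ParsedResults[0].ParsedText`); SpaceOCR's exact response shape, and the regexes below, are assumptions you should adapt to the slip layouts you actually receive.

```javascript
// Hedged sketch: shape raw OCR text into the Google Sheets columns A-G.
const text = $input.first().json.ParsedResults?.[0]?.ParsedText ?? '';

// Hypothetical regexes, tuned to nothing in particular; adjust to your slips.
const amount = (text.match(/Amount[:\s]+([\d,.]+)/i) || [])[1] ?? '';
const txId = (text.match(/(?:Transaction|Ref)[\s#:]+(\w+)/i) || [])[1] ?? '';

return [{
  json: {
    Date: new Date().toISOString().slice(0, 10),
    Time: new Date().toISOString().slice(11, 19),
    Sender: '',        // fill from parsed text
    Receiver: '',      // fill from parsed text
    'Bank Name': '',   // fill from parsed text
    Amount: amount,
    'Transaction ID': txId,
  },
}];
```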
by n8n custom workflows
## Introduction

The Namesilo Bulk Domain Availability workflow is an automation solution for checking the registration status of multiple domains simultaneously using the Namesilo API. It efficiently processes large lists of domains by splitting them into manageable batches, adhering to API rate limits, and compiling the results into a convenient Excel spreadsheet. It eliminates the tedious process of manually checking domains one by one, saving significant time for domain investors, web developers, and digital marketers.

The workflow is particularly valuable during brainstorming sessions for new projects, when conducting domain portfolio audits, or when preparing domain acquisition strategies. By automating the availability check, users can quickly identify available domains for registration without navigating multiple web interfaces.

## Who is this for?

This workflow is ideal for:

- Domain investors and flippers who need to check multiple domains quickly
- Web developers and agencies evaluating domain options for client projects
- Digital marketers researching domain availability for campaigns
- Business owners exploring domain options for new ventures
- IT professionals managing domain portfolios

Users should have basic familiarity with n8n workflow concepts and a Namesilo account to obtain an API key. No coding knowledge is required, though an understanding of the domain name system is beneficial.

## What problem is this workflow solving?

Checking domain availability one by one is time-consuming and tedious, especially when dealing with dozens or hundreds of potential domains. This workflow solves several key challenges:

- **Manual inefficiency:** Eliminates the need to search for each domain individually through registrar websites.
- **Rate limiting:** Handles API rate limits automatically with built-in waiting periods.
- **Data organization:** Compiles availability results into a structured Excel file rather than scattered notes or multiple browser tabs.
- **Bulk processing:** Processes up to 200 domains per batch, with the ability to handle unlimited domains across multiple batches.
- **Time management:** Frees up valuable time that would otherwise be spent on repetitive manual checks.

## What this workflow does

### Overview

The workflow takes a list of domains, processes them in batches of up to 200 domains per request (to comply with API limitations), checks their availability using the Namesilo API, and compiles the results into an Excel spreadsheet showing which domains are available for registration and which are already taken.

### Process

1. **Input Setup:** The workflow begins with a manual trigger and uses the "Set Data" node to collect the list of domains to check and your Namesilo API key.
2. **Domain Processing:** The "Convert & Split Domains" node transforms the input list into batches of up to 200 domains to comply with API limitations (a sketch of this batching logic appears after the customization section below).
3. **Batch Processing:** The workflow loops through each batch of domains.
4. **API Integration:** For each batch, the "Namesilo Requests" node sends a request to the Namesilo API to check domain availability.
5. **Data Parsing:** The "Parse Data" node processes the API response, extracting information about which domains are available and which are taken.
6. **Rate Limit Management:** A 5-minute wait period is enforced between batches to respect Namesilo's API rate limits.
7. **Data Compilation:** The "Merge Results" node combines all the availability data.
8. **Output Generation:** Finally, the "Convert to Excel" node creates an Excel file with two columns, Domain and Availability (showing "Available" or "Unavailable" for each domain).

## Setup

1. **Import the workflow:** Download the workflow JSON file and import it into your n8n instance.
2. **Get a Namesilo API key:** Create a free account at Namesilo and obtain your API key from https://www.namesilo.com/account/api-manager
3. **Configure the workflow:** Open the "Set Data" node, enter your Namesilo API key in the "Namesilo API Key" field, and enter your list of domains (one per line) in the "Domains" field.
4. **Save and activate:** Save the workflow and run it using the manual trigger.

## How to customize this workflow to your needs

- **Modify the domain input format:** Adjust the code in the "Convert & Split Domains" node if your domain list comes in a different format.
- **Change the batch size:** Modify the batch size (currently set to 200) in the "Convert & Split Domains" node to accommodate different API limitations.
- **Adjust the wait time:** If you have a premium API account with different rate limits, modify the wait time in the "Wait" node.
- **Enhance the output format:** Customize the "Convert to Excel" node to add additional columns or formatting to the output file.
- **Add domain filtering:** Add a node before the API request to filter domains based on specific criteria (length, keywords, TLDs).
- **Integrate with other services:** Connect this workflow to domain registrars to automatically register available domains that meet your criteria.
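For reference, here is a minimal sketch of the batching logic the "Convert & Split Domains" node performs. It assumes the "Set Data" node exposes the list as a newline-separated string in a `Domains` field; adapt the field name to your workflow.

```javascript
// Hedged sketch: split a newline-separated domain list into chunks of up to 200,
// since the availability call accepts a comma-separated list of domains.
const raw = $input.first().json.Domains ?? '';
const BATCH_SIZE = 200;

const domains = raw
  .split('\n')
  .map((d) => d.trim().toLowerCase())
  .filter(Boolean);

const batches = [];
for (let i = 0; i < domains.length; i += BATCH_SIZE) {
  batches.push({
    json: { domains: domains.slice(i, i + BATCH_SIZE).join(',') },
  });
}

return batches;
```

Each returned item then feeds the batch loop, with the comma-separated `domains` string passed to the Namesilo availability request.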
by Sleak
## Who is this template for?

This workflow template is designed for people seeking alerts when specific changes are made to any web page. Leveraging agentic AI, it analyzes the page every day and autonomously decides whether to send you an e-mail notification.

## Example use cases

- Track price changes on [competitor's website]. Notify me when the price drops below €50.
- Monitor new blog posts on [industry leader's website] and summarize key insights.
- Check [competitor's job page] for new job postings related to software development.
- Watch for new product launches on [e-commerce site] and send me a summary.
- Detect any changes in the terms and conditions of [specific website].
- Track customer reviews for [specific product] on [review site] and extract key themes.

## How it works

When you click 'test workflow' in the editor, a new browser tab opens where you can fill in the details of your espionage assignment. Be as concise as possible when instructing the AI: make the instruction specific and to the point (see the examples above).

After submission, the flow starts by extracting both the relevant website URL and an optimized prompt. OpenAI's structured outputs feature is used, followed by a Code node that parses the results for further use (see the sketch below). From there, the endless loop of daily checks begins:

1. Initial scrape
2. 1-day delay
3. Second scrape
4. AI agent decides whether or not to notify you
5. Back to step 1

You can cancel an espionage assignment at any time in the executions tab.

## Set up steps

1. Insert your OpenAI API key in the structured outputs node (the second one)
2. Create a Firecrawl account and connect your Firecrawl API key in both 'Scrape page' nodes
3. Connect your OpenAI account in the AI agent's model node
4. Connect your Gmail account in the AI agent's Gmail tool node
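As a reference, here is a minimal version of what that parsing Code node might look like. It assumes the structured-outputs node returns the model's JSON in `message.content` and that the schema has `url` and `prompt` keys; the exact path and field names vary between node versions and schemas.

```javascript
// Hedged sketch: parse the structured-output response into plain fields.
const content = $input.first().json.message?.content ?? '{}';
const parsed = typeof content === 'string' ? JSON.parse(content) : content;

return [{
  json: {
    url: parsed.url,       // the website to monitor (assumed schema key)
    prompt: parsed.prompt, // the optimized monitoring instruction (assumed schema key)
  },
}];
```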
by ömer
# Generate and Publish AI Content to LinkedIn and X (Twitter) with n8n

## Overview

This n8n workflow automates the generation and publishing of AI-powered social media content across LinkedIn and X (formerly Twitter). By leveraging AI, this workflow helps social media managers, marketers, and content creators streamline their posting process.

## Who is this for?

- Social media managers
- Content creators
- Digital marketers
- Businesses looking to automate content generation

## Features

- **AI-powered content creation** tailored for LinkedIn and X (Twitter)
- **Automated publishing** to both platforms
- **Structured output parsing** to ensure consistency
- **OAuth2 authentication** for secure posting
- **Merge and confirmation steps** to track successful postings

## Setup Instructions

### Prerequisites

Before using this workflow, ensure you have:

- An n8n instance set up
- API credentials for:
  - Google Gemini AI (for content generation)
  - X Developer Account with OAuth2 authentication
  - LinkedIn Developer Account with OAuth2 authentication
- A form submission service integrated with n8n

## Workflow Breakdown

### 1. Trigger: Form Submission

- A user submits a form containing the post title.
- The form is secured with Basic Authentication.
- The submitted title is passed to the AI Agent.

### 2. AI Content Generation

The Google Gemini Chat Model processes the title and generates:

- LinkedIn post content
- Twitter (X) post content
- Hashtags
- Call-to-action (LinkedIn)
- Character limit check (Twitter); see the sketch below

### 3. Parsing AI Output

- A structured output parser converts the AI-generated content into JSON format.
- Ensures correct formatting for LinkedIn and Twitter (X).

### 4. Publishing to Social Media

**X (Twitter) Posting**

- Extracts the Twitter post from the AI output.
- Publishes it via an OAuth2-authenticated X (Twitter) account.

**LinkedIn Posting**

- Extracts the LinkedIn post from the AI output.
- Publishes it via an OAuth2-authenticated LinkedIn account.

### 5. Merging Post Results

Merges the response data from LinkedIn and Twitter after publishing.

### 6. Confirmation Step

Displays a final confirmation form once the posts are successfully published.

## Benefits

- **Save time** by automating content creation and publishing.
- **Ensure consistency** across platforms with structured AI-generated posts.
- **Secure authentication** using OAuth2 for LinkedIn and Twitter.
- **Increase engagement** with AI-optimized hashtags and CTAs.

This workflow enables seamless social media automation, helping professionals post engaging AI-powered content effortlessly. 🚀
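To illustrate the character limit check mentioned above, here is a minimal n8n Code node sketch. The field names `twitterPost` and `linkedinPost` are assumptions about the parser's output, not the template's actual keys.

```javascript
// Hedged sketch: fail fast if the generated tweet exceeds X's 280-character limit.
const { twitterPost, linkedinPost } = $input.first().json;

if ((twitterPost ?? '').length > 280) {
  throw new Error(`Tweet is ${twitterPost.length} characters; X allows at most 280.`);
}

return [{ json: { twitterPost, linkedinPost } }];
```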
by FORK SOFTWARE TECHNOLOGIES INC.
## Overview

This n8n workflow is designed to monitor the USDT ERC-20 balance of a specific wallet. It periodically checks and processes transaction data through the Etherscan API, making it ideal for users who need an automated way to track ERC-20 wallet activity.

## Features

- **Automatic monitoring:** Executes every 5 minutes to capture new transactions.
- **Customizable filters:** Tune tracking with parameters like transaction duration and wallet addresses.
- **Data aggregation:** Compiles transaction data into a single, structured list.
- **Formatted outputs:** Presents processed data in an organized format.
- **Telegram tracking:** Reports wallet balances via Telegram bot notifications.

## Requirements

- **n8n setup:** A self-hosted or cloud-based n8n instance.
- **Basic understanding:** Basic knowledge of n8n workflows and nodes.

## Installation and Configuration

1. **Import the workflow:** Load the provided JSON workflow into your n8n instance.
2. **Configure the User Data node:**
   - Enter your ERC-20 wallet address in the "Your Wallet Address" field.
   - Enter your Etherscan API key in the "Your Etherscan API Key" field.
   - Enter the USDT ERC-20 contract address in the "Your ERC-20 USDT Contract Address" field (0xdAC17F958D2ee523a2206206994597C13D831ec7). You can monitor another token by entering a different contract address.
3. **Configure the Telegram node:**
   - In Telegram, search for "BotFather", use /newbot to create your bot, and copy the API key BotFather provides.
   - Search for "Get My ChatID", start the conversation, and copy your Chat ID.
   - Use this information to configure the Telegram node.
4. **Schedule Trigger node:** By default, the workflow triggers every 5 minutes; adjust this to your needs.
5. **Test the workflow:** Execute it manually to ensure everything works as expected.

## How It Works

1. **Schedule Trigger:** Starts the workflow at predetermined intervals.
2. **Edit Fields:** Sets the wallet address, Etherscan API key, and USDT ERC-20 contract address.
3. **Edit Telegram Settings:** Uses the bot API key and Chat ID configured above.
4. **Etherscan Data Import:** Collects transaction data for the wallet via the Etherscan API (see the sketch below).
5. **Final Results:** Organizes and formats the transaction data for review.
6. **Telegram Message Sending:** If the balance changed, sends a formatted message about the change; otherwise it reports that the balance is unchanged. You can configure it to stay silent when there is no change.
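For context, this is roughly the underlying call the Etherscan request performs, written as standalone JavaScript (Node 18+, run as an ES module). The endpoint and parameters follow Etherscan's documented account/tokenbalance API; the wallet address and API key are placeholders.

```javascript
// Check a wallet's USDT balance via Etherscan's account/tokenbalance endpoint.
const wallet = '0xYourWalletAddress';   // placeholder
const apiKey = 'YourEtherscanApiKey';   // placeholder
const usdt = '0xdAC17F958D2ee523a2206206994597C13D831ec7'; // USDT ERC-20 contract

const url =
  'https://api.etherscan.io/api?module=account&action=tokenbalance' +
  `&contractaddress=${usdt}&address=${wallet}&tag=latest&apikey=${apiKey}`;

const res = await fetch(url).then((r) => r.json());
// `result` is the balance in the token's smallest unit; USDT uses 6 decimals.
console.log(`USDT balance: ${Number(res.result) / 1e6}`);
```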
by Luke
Built this for a dedicated Slack outage-notifications channel. Works well on both desktop and mobile.

This is for:

- IT administrators & small MSPs looking to streamline M365 alerts from one or multiple mailboxes into a single or specific Slack channels
- IT admins who prefer ChatOps over management-by-email

## What does it do

- Scans for M365 outage alert emails (every 1 min)
- Checks whether the alert impacts a specific user region (if the alert calls it out; countries have to be set manually)
- Summarizes the incident using OpenAI gpt-4o-mini (a cheap model; you can swap in a local Ollama model)
- Sends a Slack Block to your outage channel with the incident link (can be extended); see the sample payload below
- Deletes the original alert email after successful delivery

## Credentials

- **Outlook:** Create an Outlook credential (OAuth2.0) pointing to the mailbox (regular or shared) where M365 service alerts are received
- **Slack:** Create a Slack bot credential with access to the channel you want updates posted to
- **OpenAI:** Create an OpenAI credential with access to the gpt-4o-mini model. Use Projects in OpenAI so you can set a per-project budget without impacting other projects; review the OpenAI documentation for more info on managing Projects in the API portal. Expect this to consume no more than 1-2 cents per month on average.

## Setup

1. Download & import the workflow
2. Modify the first Outlook node (Check for 365 Service Alert) to use the Outlook credential
3. Modify the OpenAI node's system prompt to call out the countries your users reside in, e.g. "Assume the organization has users primarily in the U.S. and Australia. If those regions are affected, state: 'Your users may have been affected.' Otherwise, add: 'No impact expected for your user base.'" (swap U.S. & Australia for your desired countries)
4. Modify the Slack node (Post outage to Slack) to specify the channel updates will be posted to

## Workflow Diagram

## Sample Slack Output
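As a stand-in for the screenshot, here is a minimal sketch of the kind of Block Kit payload the Slack node might post. The incident ID, text, and link below are illustrative, not the workflow's actual output.

```javascript
// Illustrative Slack Block Kit payload for an outage notification.
const blocks = [
  {
    type: 'header',
    text: { type: 'plain_text', text: ':rotating_light: M365 Service Alert', emoji: true },
  },
  {
    type: 'section',
    text: {
      type: 'mrkdwn',
      text: '*Exchange Online (EX123456)*\nSome users may be unable to send email.\nYour users may have been affected.',
    },
  },
  {
    type: 'actions',
    elements: [
      {
        type: 'button',
        text: { type: 'plain_text', text: 'View incident', emoji: true },
        url: 'https://admin.microsoft.com/servicehealth', // illustrative link
      },
    ],
  },
];

return [{ json: { blocks } }];
```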
by Ranjan Dailata
## Who is this for?

The Capture Website Screenshots with Bright Data Web Unlocker and Save to Disk workflow is built for automation professionals and developers who need reliable, high-quality screenshots from any website, even those protected by anti-bot technologies. It is ideal for:

- **Compliance teams** capturing visual records of web content for legal or audit purposes.
- **Product managers** tracking visual changes across competitor landing pages.
- **Digital marketers** archiving campaign pages and offer variations.
- **Developers and QA teams** validating UI deployments or rendering issues.
- **Growth hackers and scrapers** who need to bypass bot protection and capture visual snapshots of restricted content.

## What problem is this workflow solving?

Websites today are heavily protected by anti-bot tools like Cloudflare, bot-detection scripts, and geo-restrictions. These protections often break traditional screenshot tools or prevent headless browsers from accessing content. This workflow:

- Bypasses anti-bot defenses using Bright Data Web Unlocker.
- Automatically captures screenshots without manual browser steps.
- Stores images locally for easy access or reporting.
- Operates headlessly and at scale, perfect for automations or scheduled jobs.

## What this workflow does

1. Sets the target URL, file name, and Bright Data zone name using the **Set URL, Filename and Bright Data Zone** node.
2. Sends an HTTP POST request to the Bright Data Web Unlocker API to capture a screenshot (see the sketch below).
3. Saves the screenshot image (.png) to a specified disk location using the **Write a file to disk** node.

## Pre-conditions

You need a Bright Data account and the setup described in the "Setup" section below.

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure a Header Auth credential (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
4. Ensure the URL, file name, and Bright Data zone name are correctly set in the **Set URL, Filename and Bright Data Zone** node.
5. Set the desired local path in the **Write a file to disk** node to save the screenshot.

## How to customize this workflow to your needs

- **Change the target URL:** Modify the value in the **Set URL, Filename and Bright Data Zone** node to capture different websites.
- **Set dynamic filenames:** Use n8n expressions to generate filenames based on date/time or URL.
- **Specify custom save paths:** Adjust the path in the **Write a file to disk** node to store screenshots in your preferred directory.
- **Enhance with notifications:** Add nodes to send alerts or log activity after each screenshot is taken.
- **Integrate with external systems:** Send screenshots to cloud storage (e.g. AWS S3, Google Drive) or feed them into monitoring/reporting tools.
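For orientation, here is a standalone sketch (Node 18+, ES module) of the kind of screenshot request the HTTP Request node sends. The endpoint and body fields follow Bright Data's public Web Unlocker docs as I understand them; verify field names against the current documentation before relying on this.

```javascript
import { writeFileSync } from 'node:fs';

// Request a screenshot of a page through Bright Data Web Unlocker.
const res = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_WEB_UNLOCKER_TOKEN', // placeholder
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'your_web_unlocker_zone', // the zone name set in the workflow
    url: 'https://example.com',
    format: 'raw',
    data_format: 'screenshot', // ask for a PNG of the rendered page instead of HTML
  }),
});

// The response body is the PNG itself; in the workflow, "Write a file to disk" persists it.
writeFileSync('screenshot.png', Buffer.from(await res.arrayBuffer()));
```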
by Airtop
# Automating LinkedIn Profile Discovery with Verification

## Use Case

Accurately identifying and verifying a person's LinkedIn profile is essential for prospecting, recruiting, and contact enrichment. This automation ensures high accuracy by combining search logic with optional profile validation.

## What This Automation Does

This automation locates and verifies a LinkedIn profile using the following inputs:

- **Person_info:** Any identifying information about the person (e.g., name, company, email).
- **Airtop_profile:** Your Airtop Profile authenticated on LinkedIn, used for verifying the profile.

## How It Works

1. Extracts a likely LinkedIn URL by performing a Google search using the provided person info.
2. Validates the result (if an Airtop Profile is provided): visits the LinkedIn profile and verifies the match by checking the content (e.g., experience, role) against the person info.
3. Returns a verified LinkedIn profile URL, or "NA" if no valid profile is found.

## Setup Requirements

- Airtop API key
- Optional but recommended: an Airtop Profile authenticated on LinkedIn

## Next Steps

- **Combine with email lookup:** Use email-to-profile tools upstream to gather inputs.
- **CRM integration:** Automatically append LinkedIn profiles to contact records.
- **Automate outreach:** Use the verified URLs for personalized LinkedIn engagement workflows.

Read more about how to find and verify LinkedIn profiles.
by Juan Carlos Cavero Gracia
# Image Carousel Publisher for Instagram and TikTok

## Description

This automation template is designed for content creators, digital marketers, and social media managers looking to streamline their image carousel posting workflow. It automates the process of uploading multiple images as carousels to Instagram and slideshows to TikTok, making your visual content management more efficient across platforms.

## Who Is This For?

- **Content creators & influencers:** Simplify posting image collections and focus more on creating visual content.
- **Digital marketers:** Ensure consistent carousel posts across multiple platforms with minimal manual effort.
- **Social media managers:** Automate repetitive image uploading tasks and maintain visual engagement.

## What Problem Does This Workflow Solve?

Manually uploading image carousels to different platforms is time-consuming and inconsistent. This workflow addresses these challenges by:

- **Automating multi-image uploads:** Processes multiple images and prepares them for platform-specific formats.
- **Supporting cross-platform publishing:** Simultaneously posts your image carousels to Instagram and TikTok slideshows.
- **Maintaining visual consistency:** Ensures your visual stories remain consistent across platforms.
- **Streamlining batch processing:** Handles the technical complexity of multi-image uploads with a single workflow trigger.

## How It Works

1. **Image selection:** Trigger the workflow with your selected images.
2. **Image processing:** The workflow automatically processes and prepares your images for both platforms.
3. **Content distribution:** Uploads the images as a carousel to Instagram and as a slideshow to TikTok.
4. **Platform optimization:** Formats the uploads according to each platform's requirements.

## Setup

1. **API token generation:**
   - Visit upload-post.com and create an account
   - Navigate to the API settings section
   - Generate a new API token and copy it for the next steps
2. **Platform configuration:**
   - In the "Upload to Instagram" node: paste your API token, configure your Instagram account settings, and set your preferred posting parameters
   - In the "Upload to TikTok" node: add the same API token, set up your TikTok account credentials, and adjust posting preferences
3. **Content parameters setup:**
   - Rename the "HTTP Request" node to "Social Media Upload Request"
   - Configure your account information: username, account ID, content title format, and posting schedule (if applicable)
4. **Image source configuration:**
   - Set up your image source directory
   - Configure image format requirements
   - Test with sample images before going live

## About upload-post.com

Upload-post.com is a third-party service that acts as a bridge between your workflow and social media platforms. It provides:

- Secure API endpoints for multi-platform posting
- Image format validation and optimization
- Queue management for scheduled posts
- Analytics and posting status tracking
- Cross-platform compatibility handling

## Requirements

- **Accounts:** upload-post.com account with access to Instagram and TikTok publishing.
- **API keys:** upload-post.com API token.
- **Images:** Properly formatted images that meet Instagram and TikTok specifications:
  - Instagram: up to 10 images per carousel, 1:1 to 4:5 aspect ratio
  - TikTok: compatible with slideshow format, 9:16 aspect ratio recommended

Use this template to enhance your visual storytelling, maintain consistency across social platforms, and engage your audience with compelling image carousels and slideshows.
by Praveena
## Idea

The idea for this app came about because I wanted to build a unique gift for my niece, who gets excited about her birthday (which I'm going to miss this year). The web app has a simple countdown (in HTML and JS), but more importantly there is an AI agent that answers specific questions and knows her preferences.

## How it works

Questions from the app are sent via a webhook to n8n (see the fetch sketch below), which pulls a preferences file (her likes, dislikes, personality) from Postgres and has an AI Agent answer and respond. The current status is stored back in Postgres (especially the status of the cat and the universe happenings) before responding.

## Features

- Integrated AI chatbot via n8n webhook
- Persistent conversation history
- Minimizable chat interface
- Fallback support for offline testing
- "Where's Mittens?": a query to track her lost cat across the multiverse
- Multiverse updates, with the most recent update stored

## Prerequisites

- A PostgreSQL database. Alternatively, use any other database, but change the n8n nodes accordingly.
- An LLM API key.

## Step-by-Step Instructions

1. Export this n8n workflow.
2. Modify the LLM API key. I used OpenAI's GPT-4.1.
3. For the web app scaffolding, you will need Node, HTML, and JavaScript. I've created a mini version using Node and JS with the web app and n8n connection settings here: https://github.com/productiser/FiBirthdayAgent
4. Run the PostgreSQL database script (one table for memory and context storage):

```sql
CREATE TABLE fifi_world_context (
  id TEXT PRIMARY KEY,               -- e.g., 'agent_fifi'
  cat_location TEXT,                 -- e.g., "Bubble Nebula"
  cat_activity TEXT,                 -- e.g., "Playing laser tag with moon mice"
  fifi_preferences JSONB,            -- e.g., likes/dislikes/foods/shows
  world_history TEXT,                -- summary of narrative events
  last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```

5. Modify the system prompt to your needs.

## Built With

- n8n (self-hosted)
- Self-hosted web app, hosted on Vercel
- Total spend: <£1 (AI costs only)
- Total time: <1 day

## Support

Watch this video for a web app overview and how it looks: https://youtu.be/e7PlrTdvwoM

Contact me at info@pankstr.com / superllmuser@gmail.com for any queries. Hope you enjoy!!
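For anyone rebuilding the front end, here is a browser-side sketch of the webhook call. The webhook path, payload fields, and `answer` response key are placeholders; match them to your own Webhook and Respond to Webhook nodes (the GitHub repo above has the full version).

```javascript
// Hedged sketch: send a chat question to the n8n webhook and return the reply.
async function askFifi(question) {
  const res = await fetch('https://your-n8n-host/webhook/fifi-chat', { // placeholder path
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question, sessionId: 'fifi-birthday' }), // sessionId keys the memory
  });
  const data = await res.json();
  return data.answer ?? 'Mittens is off exploring the multiverse, try again in a moment!';
}
```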
by Pavel Duchovny
Building agentic AI workflows often requires multiple moving parts: memory management, document retrieval, vector similarity, and orchestration. Until now, these pieces had to be custom-wired. But with the new native n8n nodes for MongoDB Atlas, we reduce that overhead dramatically. With just a few clicks you can:

- Store and recall long-term memory from MongoDB
- Query vector embeddings stored in Atlas Vector Search
- Use these results in your LLM chains and automation logic

This example presents an ingestion flow and an AI Agent flow focused on travel planning. The points of interest we want the agent to know about are ingested into the vector store, and the AI Agent uses the vector store tool to fetch relevant context about those points of interest when it needs to.

## Prerequisites

- A MongoDB Atlas project and cluster
- A valid OpenAI API key for embeddings (another provider can be used)
- A Gemini API key for the LLM (another provider can be used)

## How it works

There are two main flows.

**Ingestion flow:** Receives a document from a webhook (see the example payload below) and uses the MongoDB Atlas Vector Store node to embed the document title and description into the points_of_interest collection. Embeddings are stored in a field named embedding. The embeddings used are OpenAI's, but any supported embedder can be used.

**AI Agent flow:** An AI Agent node with chat memory stored in MongoDB Atlas and a Vector Search node as a tool:

- **Chat Message Trigger:** Chatting with the AI Agent stores the conversation via the MongoDB Chat Memory node.
- **Vector Search Tool:** When data is needed, such as a location search or details, the agent calls the "Vector Search" tool, which uses an Atlas Vector Search index created on the points_of_interest collection:

```json
// index name: "vector_index"
// If you change the embedding provider, make sure numDimensions matches the model.
{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 1536,
      "similarity": "cosine"
    }
  ]
}
```

## Additional Resources

- MongoDB Atlas Vector Search
- n8n Atlas Vector Search docs
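To make the ingestion flow concrete, here is an example call to the ingestion webhook with one point of interest. The webhook path and field names are assumptions; align them with your Webhook node and the fields your embedding step reads.

```javascript
// Post one point of interest to the ingestion webhook (path is a placeholder).
const pointOfInterest = {
  title: 'Sagrada Família',
  description: "Gaudí's unfinished basilica in Barcelona; book tower access in advance.",
};

await fetch('https://your-n8n-host/webhook/ingest-poi', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(pointOfInterest),
});
```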