by Nskha
# n8n Creators Template: Creator Profile Stats Updater

This n8n workflow template automates updating a creator's profile statistics, including total workflows, complex workflows, approved workflows, pending workflows, total nodes, and total views. It uses a series of nodes to fetch the data, process it, and update an SVG file hosted on GitHub so that it reflects the latest stats.

## Workflow Overview

- **Schedule Trigger**: Triggers the workflow execution at specified intervals.
- **Config**: Sets up configuration details such as the creator username and the colors for text, icons, border, and card.
- **Get Workflows**: Fetches the workflows associated with the creator from the n8n API.
- **Workflows Data**: Processes the fetched data to calculate the various statistics.
- **Get User**: Fetches user details from the n8n API.
- **Download Image**: Downloads the creator's profile image.
- **Extract From File**: Extracts binary data from the downloaded image file.
- **SVG**: Generates an SVG file with the updated stats and visual representation.
- **GitHub**: Commits the updated SVG file to the specified GitHub repository.
- **Final**: Prepares the final data set for further processing or output.
- **Sticky Note**: Provides a visual note or reminder within the workflow editor.

## Embed & Live Preview

Since the output is an .SVG file, you can host it anywhere. Treat it like a normal image: you can embed it in any site, forum, or page that supports images, for example with a Markdown image reference (see the example snippet at the end of this template description). The image can also be served through a CDN with caching.

## Setup Instructions

1. **GitHub credentials**: Ensure you have GitHub credentials set up in your n8n instance so the workflow can commit changes to your repository.
2. **Configure trigger**: Adjust the Schedule Trigger node to set the desired execution interval for the workflow.
3. **Set configuration**: Customize the Config node with your GitHub username and your preferred aesthetic options for the SVG.
4. **Deploy workflow**: Import the workflow into your n8n instance and deploy it.

## Customization Options

- **Text and icon colors**: Customize the colors used in the SVG by modifying the respective fields in the Config node.
- **Profile image size**: Adjust the image size in the Download Image node URL if needed.
- **Commit messages**: Modify the commit messages in the GitHub nodes to suit your version control conventions. The template uses the `$now` function to include the current time in the message, which always produces a different commit value.

## Requirements

- n8n (self-hosted or Cloud version compatible with 2024 releases and up)
- GitHub account and repository
- Basic understanding of n8n workflow configuration

## Support and Contributions

For support, please refer to the n8n community forum or the official n8n documentation. You are welcome to reuse this workflow and reshare it with your own edits (such as a new design or colors) under your name.
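As mentioned in the Embed & Live Preview section above, the generated SVG can be referenced like any other image. A minimal Markdown example, where the repository path and file name are placeholders for wherever the workflow commits your SVG:

```markdown
![Creator profile stats](https://raw.githubusercontent.com/<your-user>/<your-repo>/main/profile-stats.svg)
```

The same raw URL can also be placed behind a CDN or caching proxy if you want faster, cached delivery.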
by Tony Duffy
# IoT device control with MQTT and webhook

This workflow is for users who want a practical example of how to control IoT systems using the MQTT protocol in an n8n environment. The template provides typical n8n MQTT and Webhook node implementations and the configuration settings needed to set IoT device inputs and outputs.

## How it works

A webpage with IoT control 'on' and 'off' buttons is presented to the user. When a button is clicked, its value is sent via a webhook to trigger the active workflow. The workflow's Set node prepares the received value as a message payload and passes it to the MQTT node, which publishes the topic with the payload to a cloud-based MQTT broker. A remote ESP32 microcontroller subscribes to the broker and reads the payload contained in the topic. The ESP32 then toggles a GPIO pin depending on the payload value.

## The IoT control webpage

The webpage is a simple HTML page containing the clickable 'on' and 'off' buttons. It also contains the GET webhook URL that sends the selected value to the n8n workflow, which in this case runs locally. The webhook URL format is http://localhost:5678/webhook/pin-control?value=action. The webpage code is in IOT-control.html.

## IoT device

The IoT device is an ESP32 microcontroller running on a remote network. To keep it simple, GPIO2 is selected as the control output: when the received value is "on", GPIO2 goes high and an LED on the ESP32 turns on; it turns off when the received value is "off". The program for the ESP32 IoT control is 'main.py' (a rough sketch of what it can look like is shown after this template description). You will need a MicroPython interpreter uploaded to the ESP32 for the program to run automatically. The code can easily be edited and modified to accommodate further attached IoT devices.

## How to customise this workflow to your needs

**ESP32**: You will need a working ESP32 with a MicroPython interpreter installed. The code main.py is provided; it can be loaded and edited with a Python IDE (Thonny was used for this example). Use a free MQTT broker to get started; "broker.emqx.io" is used in the code.

**IoT control webpage**: The webpage is plain HTML and can easily be edited to extend functionality. The embedded webhook is configured for n8n production mode: http://localhost:5678/webhook/pin-control?value=action. If you want to run the page in test mode, use the following URL instead: http://localhost:5678/webhook-test/pin-control?value=action

**n8n workflow**: The workflow is a good demonstration of how to control IoT devices using n8n. Following these steps will give you a good insight into microcontroller automation.
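The template ships its own main.py, so the following is only an illustrative sketch of the subscriber side, assuming MicroPython with the umqtt.simple module. The topic name and client id are placeholders, and the actual file provided with the template may differ:

```python
# main.py (illustrative sketch) - subscribe to the broker and toggle GPIO2
from umqtt.simple import MQTTClient
from machine import Pin
import time

BROKER = "broker.emqx.io"        # free public broker used in the example
TOPIC = b"n8n/pin-control"       # placeholder topic name
led = Pin(2, Pin.OUT)            # GPIO2 drives the on-board LED

def on_message(topic, msg):
    # Payload published by the n8n MQTT node: b"on" or b"off"
    if msg == b"on":
        led.value(1)
    elif msg == b"off":
        led.value(0)

client = MQTTClient("esp32-iot-demo", BROKER)
client.set_callback(on_message)
client.connect()
client.subscribe(TOPIC)

while True:
    client.check_msg()           # non-blocking poll for incoming messages
    time.sleep(0.1)
```

A real deployment also needs Wi-Fi setup (network.WLAN) and some reconnect handling before this loop; those details are omitted here for brevity.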
by Yaron Been
Automated pipeline to collect and analyze investor data from Crunchbase, tracking investment patterns, funding history, and portfolio companies for market analysis and lead generation.

## 🚀 What It Does

- **Investor Profiling**: Collects comprehensive data on investors and VC firms
- **Investment Pattern Analysis**: Tracks funding history and investment preferences
- **Portfolio Monitoring**: Keeps tabs on investor portfolios and new investments
- **Data Enrichment**: Enhances raw data with additional context and metrics

## 🎯 Perfect For

- Startup founders seeking investors
- Market research analysts
- Investment professionals
- Business development teams
- Competitive intelligence

## ⚙️ Key Benefits

- ✅ Comprehensive investor profiles
- ✅ Real-time investment tracking
- ✅ Market trend analysis
- ✅ Data-driven investment decisions
- ✅ Time-saving automation

## 🔧 What You Need

- Crunchbase API access
- n8n instance
- Storage solution (database or spreadsheet)

## 📊 Data Points Collected

- Investor/firm details
- Investment history
- Portfolio companies
- Funding rounds participated in
- Investment focus areas
- Contact information (when available)

## 🛠️ Setup & Support

**Quick Setup**: Deploy in 30 minutes with our step-by-step configuration guide.

📺 Watch Tutorial | 💼 Get Expert Support | 📧 Direct Help

Transform your investor research with automated data collection and analysis. Spend less time gathering data and more time making strategic decisions.
by Corentin Ribeyre
This template can be used to verify email addresses with Icypeas. Make sure you have an active account before using this template.

## How it works

The workflow can be divided into four steps:

1. It starts with a manual trigger (On clicking 'execute').
2. It reads your Google Sheets file.
3. It connects to your Icypeas account.
4. It performs an HTTP request to scan the domains/companies.

## Set up steps

- You will need a formatted Google Sheets file with company/domain names.
- You will need a working Icypeas account to run the workflow and obtain your API key, API secret, and User ID.
- You will need domain/company names to scan.
by Kees Bosch - Browserflow
# Auto find & invite LinkedIn Leads

This n8n template automates LinkedIn lead generation by scraping profiles, filtering out existing connections, and sending connection requests — all in a controlled, looped workflow. Ideal for outreach campaigns, recruitment, or lead generation efforts.

## ⚠️ Disclaimer – Community Node Notice

This template uses a verified community node available inside the n8n cloud environment. To use it, go to "Nodes", search for "Browserflow for Linkedin", and click Install. It is officially verified and accessible directly from n8n cloud. If you want to run this template locally, go to Settings, open Community Nodes, and search for n8n-nodes-browserflow. After installing, you can start using the actions in this node.

## 🛠️ How to Use

1. **Trigger: Manual Start** – Initiates the workflow manually via the "Test workflow" button, giving you full control.
2. **Scrape LinkedIn Profiles** – Uses Browserflow automation to extract profile links from a LinkedIn search or keyword query.
3. **Split Out Results** – Converts the list of profiles into individual items for single-profile processing.
4. **Loop Through Each Profile** – Ensures each LinkedIn profile is handled one at a time, avoiding simultaneous actions.
5. **Check Existing Connection** – Verifies whether you are already connected with the lead on LinkedIn.
6. **Conditional Logic** – ✅ Already connected → skip to the next profile; ❌ not connected → continue to the next step.
7. **Send Connection Invite** – Sends a LinkedIn connection request, optionally with a personalized message.

## 📦 Requirements

- n8n (cloud or self-hosted)
- Installed community node: Browserflow for Linkedin
- LinkedIn account
- Valid Browserflow account (you can set up a free 7-day trial at https://browserflow.io)

## ⚙️ Setup Instructions

1. **Install the Browserflow community node** – Search for "Browserflow for Linkedin" and install it.
2. **Get your API key** – Get your API key at https://browserflow.io.
3. **Set up your Browserflow account** – After registering, set up Browserflow and connect it with LinkedIn using the wizard at https://browserflow.io.
4. **Connect with Browserflow by creating a credential** – Click on a Browserflow action and set up a connection with Browserflow by adding your API key to a credential.

## 🧩 Customization Tips

- **Targeting**: Adjust the Browserflow actions to scrape specific roles, industries, or locations.
- **Messaging**: You can add a message to the connection invite, but note that LinkedIn limits the number of messages that can be sent each month. Use variables in the message for personalization (e.g., {firstName}).
- **Trigger**: Replace the manual trigger with a Cron node for scheduled outreach.
- **Integration**: Combine with CRM tools (e.g., HubSpot, Notion, Airtable) to sync leads, or integrate with AI agents.
by Lucas Perret
This workflow enriches new accounts in Pipedrive using the Datagma API by adding data about the ICP (ideal customer profile). Instead of Pipedrive, you can use any other CRM. In this example, the ideal buyers are heads of sales/business development.

## Prerequisites

- Pipedrive account and Pipedrive credentials

## How it works

1. The Pipedrive Trigger node starts the workflow when a new company is created.
2. An HTTP Request node queries data from Datagma.
3. A Pipedrive node updates the Pipedrive contact with the new data from Datagma.
4. The Item Lists node simplifies the data returned by Datagma that contains lists (arrays), letting you reshape it for further processing without writing custom JavaScript in Function nodes.
5. The IF node checks whether the lead matches the ICP.
6. An HTTP Request node searches for emails in Datagma.
7. A Set node prepares the data for merging.
8. The Merge node combines data from multiple streams.
9. A Pipedrive node adds a new person in Pipedrive.
by AlQaisi
# Template Information

**Who is this template for?** This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data.

🎥 Quick set up video

**How it works**: The template uses a series of nodes to fetch email information from LinkedIn profiles. It starts with a Schedule Trigger node that sets the interval for the workflow. The Conditional Check node verifies that fields such as Name, Gender, Job Title, Summary, and LinkedIn URL are not empty. The HTTP Request node sends a POST request to the specified URL with the API key and profile information. The Data Merge node merges the collected data. The Field Editing node modifies the fields as needed. Finally, the Google Sheets Update node updates the Google Sheet with the gathered information.

## Set Up Instructions

1. Make sure you have the necessary credentials and permissions for accessing LinkedIn and Google Sheets.
2. Set up the API key required for the HTTP Request node.
3. Configure the Google Sheets Update node with the appropriate document ID and sheet name.
4. Check and adjust the field mappings in the Field Editing node according to your needs.
5. Run the workflow and monitor the updates in your Google Sheets document.

## Overview

The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here.

Copy this template to get started: Google Sheets

## Using the Prospeo.io LinkedIn Email Finder API with cURL

To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, use the following command (an equivalent Python example appears at the end of this template description):

```bash
curl -X POST \
  -H "Content-Type: application/json" \
  -H "X-KEY: your_api_key" \
  -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
  "https://api.prospeo.io/linkedin-email-finder"
```

Replace "your_api_key" with your actual API key and update the "url" field in the JSON data with the LinkedIn profile URL for which you want to find the email address. To get access to this API and obtain your API key, sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once you have subscribed, you will receive an API key that you can use to authenticate your requests to the API endpoint.

## Node Descriptions

- **Schedule Trigger**: Triggers the workflow on a defined schedule interval, in this case based on minutes. (Schedule Trigger Node Documentation)
- **Google Sheets Read**: Reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. (Google Sheets Node Documentation)
- **Conditional Check**: Checks multiple conditions based on the input data and acts accordingly. (Conditional Node Documentation)
- **HTTP Request**: Sends an HTTP POST request to a specified URL with headers and body parameters. (HTTP Request Node Documentation)
- **No Operation, do nothing**: Placeholder node that does not perform any operation.
- **Data Merge**: Merges data based on the specified mode and combination settings. (Merge Node Documentation)
- **Field Editing**: Edits fields by setting specific values for each field based on the input data. (Set Node Documentation)
- **Google Sheets Update**: Updates data in a Google Sheets document and sheet based on specified columns and values. (Google Sheets Node Documentation)
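For reference, here is the same Prospeo request expressed in Python with the requests library. The endpoint, headers, and body mirror the cURL example above; "your_api_key" is a placeholder for your actual key:

```python
import requests

API_KEY = "your_api_key"  # placeholder: your Prospeo API key

response = requests.post(
    "https://api.prospeo.io/linkedin-email-finder",
    headers={
        "Content-Type": "application/json",
        "X-KEY": API_KEY,
    },
    json={"url": "https://www.linkedin.com/in/john-doe/"},
)
response.raise_for_status()
print(response.json())  # email data returned by Prospeo
```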
by Artur
## Overview

This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via MongoDB, and sends new job opportunities to Slack.

Key features:
- **Automated job retrieval** from Upwork via the Apify API
- **Duplicate filtering** using MongoDB, so only unique jobs are stored
- **Slack notifications** for new job postings
- **Runs every 20 minutes** during working hours (9 AM – 5 PM)

This workflow requires an active Apify subscription to function, as it uses the Apify Upwork API to fetch job listings.

## Who is This For?

This workflow is ideal for:
- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

## What Problem Does This Solve?

Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:
- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

## How the Workflow Works

1. **Schedule Trigger (every 20 minutes)** – Triggers the workflow at 20-minute intervals and ensures job searches only run during working hours (9 AM – 5 PM).
2. **Query Upwork for jobs** – Uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python").
3. **Find existing jobs in MongoDB** – Searches MongoDB to check whether a job (based on title and budget) already exists.
4. **Filter out duplicate jobs** – The Merge node compares Upwork jobs with the MongoDB data, and the IF node filters out jobs that are already stored in the database (a sketch of this deduplication idea is shown after this template description).
5. **Save only new jobs in MongoDB** – The Insert node adds only new job listings to the MongoDB collection.
6. **Send a Slack notification** – If a new job is found, a Slack message is sent with the job details.

## Setup Guide

Required API keys:
- **Upwork Scraper (Apify token)** – Get your token from Apify
- **MongoDB credentials** – Set up MongoDB in n8n using your connection string
- **Slack API token** – Connect Slack to n8n and set the channel ID (default: #general)

Configuration steps:
1. Modify the search keywords in the 'Assign Parameters' node (startUrls).
2. Adjust the working hours in the 'If Working Hours' node.
3. Set your Slack channel in the Slack node.
4. Ensure MongoDB is connected properly.
5. Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates constantly.

## How to Customize the Workflow

- **Change keywords**: Update the startUrls in the 'Assign Parameters' node to track different job categories.
- **Change 'If Working Hours'**: Modify the conditions in the IF node to filter times based on your needs.
- **Modify Slack notifications**: Adjust the Slack message format to include additional job details.

## Why Use This Workflow?

- Automated job tracking without manual searches
- Prevents duplicate entries in MongoDB
- Instant Slack notifications for new job opportunities
- Customizable – adapt the workflow to different job categories

## Next Steps

- Run the workflow and test it with a small set of keywords.
- Expand job categories for better coverage.
- Enhance notifications by integrating Telegram, email, or a dashboard.

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
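The deduplication step can be pictured outside n8n as a small script: for each scraped job, look for an existing document with the same title and budget, and insert it only if nothing is found. A minimal sketch using pymongo, where the database name, collection name, and the scraped-jobs list are purely illustrative:

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # your MongoDB connection string
jobs = client["upwork"]["jobs"]                    # illustrative database/collection names

scraped_jobs = [                                   # normally returned by the Apify scraper
    {"title": "n8n workflow developer", "budget": 500},
    {"title": "Python data pipeline", "budget": 1200},
]

for job in scraped_jobs:
    # A job counts as a duplicate if a document with the same title and budget exists.
    if jobs.find_one({"title": job["title"], "budget": job["budget"]}) is None:
        jobs.insert_one(job)
        print(f"New job stored: {job['title']}")   # the workflow would notify Slack here
```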
by Martijn Smit
This workflow template helps Todoist users get a weekly overview of their completed tasks via email, making it easier to review the past week.

## Why use this workflow?

Todoist doesn't provide completed-task reports or filters in its built-in reports or in the n8n app. This workflow solves that by using Todoist's public API to fetch your completed tasks.

## How it works

- Runs every Friday afternoon (or manually).
- Uses the Todoist public API to retrieve completed tasks.
- Excludes specific projects you set (e.g., a grocery list).
- Sends an email summary, grouping tasks by the day they were completed (a sketch of this grouping logic is shown after this template description).

## Set up steps

1. Copy your Todoist API token (found here).
2. Create a Todoist API credential in n8n.
3. Create an SMTP credential in n8n. Alternatively, use a preferred email service like Brevo, Mailjet, etc.
4. Import this workflow template.
5. In the Get completed tasks via Todoist API step, select your Todoist API credential.
6. In the Send Email step, select your SMTP credential and set the sender and recipient email addresses.
7. Run the workflow manually and check your inbox!

## Ignoring specific projects

If you do not want your grocery list, workouts, or other tasks from specific Todoist projects showing up in your weekly summary, modify the step called Optional: Ignore specific projects and change this line:

`const ignoredProjects = ['2335544024'];`

This should be an array with the id of each project you'd like to ignore. You can find a list of your projects (including their ids) by visiting this link: https://api.todoist.com/rest/v2/projects
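Outside n8n, the fetch-exclude-group logic looks roughly like the sketch below. The completed-tasks endpoint shown (the Sync API's completed/get_all) and the item field names are assumptions about Todoist's API, so double-check them against the official documentation; the token and ignored project ids are placeholders:

```python
import requests
from collections import defaultdict

TOKEN = "your_todoist_api_token"        # placeholder
IGNORED_PROJECTS = {"2335544024"}       # same idea as the ignoredProjects array above

# Assumption: completed tasks are exposed via the Sync API's completed/get_all endpoint.
resp = requests.get(
    "https://api.todoist.com/sync/v9/completed/get_all",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
items = resp.json().get("items", [])

# Group completed tasks by the day they were completed, skipping ignored projects.
by_day = defaultdict(list)
for item in items:
    if str(item.get("project_id")) in IGNORED_PROJECTS:
        continue
    day = str(item.get("completed_at", ""))[:10]   # e.g. "2024-05-17"
    by_day[day].append(item.get("content", ""))

for day in sorted(by_day):
    print(day)
    for task in by_day[day]:
        print(f"  - {task}")
```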
by AlQaisi
Example: @SubAlertMe_Bot

## Summary

This automated image analysis and response workflow streamlines the process of analyzing images sent via Telegram and returning responses based on the analysis results. The workflow chains together a series of nodes to automate image processing end to end.

## Use Cases

This workflow suits scenarios where real-time image analysis and response are needed, including:

- Providing immediate feedback on images shared within Telegram groups.
- Enabling automated content moderation based on the analysis of image content.
- Facilitating rapid categorization and tagging of images based on the analysis results.

## Workflow Setup

To implement this workflow, you need:

- Access to the n8n platform for workflow orchestration.
- A Telegram account for receiving images and sending replies.
- An OpenAI account for the image analysis.
- Telegram and OpenAI credentials configured in the n8n environment.
- Familiarity with creating and connecting nodes in an n8n workflow.

## Node Description

- **Get the Image (Telegram Trigger)**: Triggers on receipt of an image via Telegram and extracts the essential information from the received message to initiate further actions.
- **Merge all fields To get data from trigger**: Merges all relevant data fields extracted from the trigger node into a consolidated item.
- **Analyze Image (OpenAI)**: Uses OpenAI to analyze the received image, processing the image data in base64 format to derive insights from the visual content (a sketch of this kind of call is shown after this template description).
- **Aggregate all fields**: Compiles all items into a single data set for subsequent processing.
- **Send Content for the Analyzed Image (Telegram)**: Sends the analyzed content back to the Telegram chat as a text message.
- **Switch**: Makes decisions based on predefined conditions; it evaluates whether the incoming message payload actually contains an image and routes the workflow accordingly.

## Conclusion

Automating image analysis with this workflow improves efficiency and makes Telegram interactions more responsive. Users can adapt it to optimize their own image analysis and communication tasks.
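The Analyze Image step is handled by the OpenAI node inside n8n, but the underlying call can be sketched with the openai Python package as below. The model name is an assumption (any OpenAI model with vision support works the same way), the API key is a placeholder, and the image is read from disk instead of from the Telegram trigger's binary data:

```python
import base64
from openai import OpenAI

client = OpenAI(api_key="your_openai_api_key")  # placeholder

# The Telegram trigger provides the image as binary data; here it is read from disk.
with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)  # the text sent back to the Telegram chat
```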
by Manuel
## Who is this template for?

This workflow template is designed for anyone with a Gmail address who wants to forward all Netflix emails, including temporary login codes, effortlessly to friends and family.

## How it works

- Scans your Gmail inbox every minute for new emails from Netflix.
- Forwards all Netflix emails to all desired email addresses via the email provider Mailjet.

## Setup Steps

1. Connect your Google Mail account to n8n following the official n8n instructions.
2. Add all recipients you want to the recipients array in the "Set all recipients" node.
3. Create and connect your Mailjet account to n8n following the official n8n instructions. Note: you cannot use a Gmail address as the sender address, as Mailjet does not support this. I recommend using your own email address from a custom domain; this works perfectly.
by felipe biava cataneo
## What this template does

This template uses the Groq LLaVA v1.5 7B API, which offers fast inference for multimodal models with vision capabilities for understanding and interpreting visual data from images. Users send an image and receive a description of the image from the model (a sketch of the underlying API call is shown after this template description).

## Setup

1. Open the Telegram app and search for the BotFather user (@BotFather).
2. Start a chat with the BotFather.
3. Type /newbot to create a new bot.
4. Follow the prompts to name your bot and get a unique API token.
5. Save your access token and username.

Once you have set up your bot, you can send it an image and get back a description.
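The n8n workflow handles the model call through its own nodes; the sketch below only illustrates the underlying request, assuming Groq's OpenAI-compatible chat completions endpoint. The model id is an assumption based on the model named above and may have changed, and the API key is a placeholder:

```python
import base64
import requests

GROQ_API_KEY = "your_groq_api_key"            # placeholder
MODEL = "llava-v1.5-7b-4096-preview"          # assumed model id; check Groq's current model list

# Image received from the Telegram user, read here from disk for simplicity.
with open("photo.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    "https://api.groq.com/openai/v1/chat/completions",  # Groq's OpenAI-compatible endpoint
    headers={"Authorization": f"Bearer {GROQ_API_KEY}"},
    json={
        "model": MODEL,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Describe this image."},
                    {
                        "type": "image_url",
                        "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"},
                    },
                ],
            }
        ],
    },
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])  # description sent back to the user
```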