by Simon
Address Validation Workflow

About
This workflow automates the process of validating and correcting client shipping addresses in Billbee, ensuring accurate delivery information. It's ideal for e-commerce businesses looking to save time and reduce errors in their order fulfillment process. The workflow uses Billbee, an order management platform for small to medium-sized online retailers, and the Endereco API for address validation.

Who Is This For?
- **E-Commerce Businesses**: Streamline order fulfillment by automatically correcting common shipping address errors.
- **Warehouse Teams**: Reduce manual work and ensure packages are shipped to the correct address.
- **Small to Medium-Sized Retailers**: Businesses using Billbee to manage orders and requiring efficient, automated solutions for address validation.

How It Works
1. Trigger: The workflow starts via a Billbee webhook when an order is imported.
2. Fetch Data: Retrieve the client's shipping address using the Order ID.
3. Validate Address: Send the address to the Endereco API for validation and correction (e.g., house number errors).
4. Conditional Actions:
   - Valid address: Update the address in Billbee.
   - Invalid address: Tag the order with "Validation Error."
5. Track Status: Add tags in Billbee for processed orders.

Setup Steps
1. API Keys: Obtain Billbee Developer/User API keys and an Endereco API key.
2. Billbee Rule: Create an automation rule:
   - Trigger: Order imported.
   - Action: Call External URL with the OrderId to trigger the n8n workflow.
3. Optional: Use a secondary trigger (e.g., order state changes to "gepackt") for manual corrections.

Customization Options
- Filter Delivery Addresses: Customize filters to exclude specific delivery types, such as pickup shops ("Postfiliale," "Paketshop," or "Packstation"). Filters can be adjusted within Billbee or in the workflow (see the sketch after this section).
- Error Handling: Configure additional actions for orders that fail validation, such as notifying your team or flagging orders for manual review.
- Order Tags: Define custom tags in Billbee to better track order statuses (e.g., "Address Corrected," "Validation Error").
- Trigger Types: Use additional triggers such as changes to order states (e.g., "gepackt" or "In Fulfillment") for manual corrections or validations.
- Address Fields: Modify the workflow to focus on specific address components, such as postal codes, city names, or country codes.
- Validation Rules: Adjust Endereco API settings or add custom logic to refine validation criteria based on your business needs.

API Documentation
- **Endereco**: Endereco API Docs
- **Billbee**: Billbee API Docs
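If you prefer to implement the pickup-shop filter inside the workflow rather than in Billbee, a Code node along these lines can flag such orders before they reach the Endereco call. This is a minimal sketch: the Billbee field names (Id, ShippingAddress, Street, Zip, City, CountryISO2) are assumptions and should be checked against the actual order payload returned by your order-fetching node.

```javascript
// n8n Code node sketch (JavaScript): flag pickup-shop deliveries so an IF node
// can route them past the Endereco validation step. The Billbee field names
// used below are assumptions — verify them against your actual order data.
const pickupKeywords = ['postfiliale', 'paketshop', 'packstation'];

return $input.all().map((item) => {
  const order = item.json;
  const addr = order.ShippingAddress || {};
  const street = String(addr.Street || '');

  const isPickup = pickupKeywords.some((kw) => street.toLowerCase().includes(kw));

  return {
    json: {
      orderId: order.Id,
      street,
      zip: addr.Zip,
      city: addr.City,
      countryCode: addr.CountryISO2,
      skipValidation: isPickup, // route these straight to tagging instead of validation
    },
  };
});
```

A downstream IF node can then branch on `skipValidation` so only regular home-delivery addresses are sent to Endereco.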
by Dustin
Short and simple: this workflow syncs (adds and deletes) your Liked Songs to a custom playlist that can be shared.

Setup:
1. Create an app on the Spotify Developer Dashboard.
2. Create Spotify credentials: just click on one of the Spotify nodes in the workflow, click "create new credentials" and follow the guide.
3. Create the Spotify playlist that you want to sync to.
4. Copy the exact name of your playlist, go into the "Edit set Vars" node and replace the value "CHANGE MEEEE" with your playlist name.
5. Set your Spotify credentials on every Spotify node (these should be marked with yellow and red notes).
6. Do you use Gotify?
   - No: delete the Gotify nodes (all the way to the right end of the workflow).
   - Yes: customize the Gotify nodes to your needs.
by Angel Menendez
Analyze Emails for Security Insights

Who is this for?
This workflow is ideal for security teams, IT Ops professionals, and managed service providers (MSPs) responsible for monitoring and validating email traffic. It's especially useful for organizations that need to identify potential phishing attempts, spam, or compromised accounts by analyzing email headers and IP reputation.

What problem is this workflow solving?
This workflow helps identify malicious or suspicious emails by verifying email authentication headers (SPF, DKIM, DMARC) and analyzing the reputation of the originating IP address. By automating these checks, it reduces manual analysis time and flags potential threats efficiently.

What this workflow does
- **Email Monitoring**: Polls a specified Microsoft Outlook folder for new emails in real-time.
- **Header Analysis**: Retrieves and processes email headers to extract critical information such as authentication results and the sender's IP address (see the parsing sketch after this section).
- **IP Reputation Check**: Leverages external APIs (IP Quality Score and IP-API) to analyze the originating IP for potential spam or malicious activity.
- **Authentication Validation**: Validates SPF, DKIM, and DMARC headers, determining if the email passes industry-standard authentication protocols.
- **Data Aggregation and Reporting**: Combines all analyzed data into a unified format, ready for reporting or integration into downstream systems.
- **Webhook Integration**: Outputs the findings via a webhook, enabling integration with alerting tools or security information and event management (SIEM) platforms.

Setup
1. Connect to Outlook: Configure the Microsoft Outlook trigger node with valid OAuth2 credentials and specify the email folder to monitor for new messages.
2. API Keys (Optional): Obtain an API key for IP Quality Score (https://ipqualityscore.com) and ensure the IP-API endpoint is accessible. This step is optional, as ipqualityscore.com provides a limited number of free lookups each month. See more details here.
3. Webhook Configuration: Set up a webhook endpoint to receive the output of the workflow.
4. Optional Adjustments: Customize polling intervals in the trigger node; modify header filters or extend the validation logic as needed.

How to customize this workflow to your needs
- **Add Alerts**: Use the Respond to Webhook node to trigger notifications in Slack, email, or any other communication channel.
- **Integrate with SIEM**: Forward the workflow output to SIEM tools like Splunk or the ELK Stack for further analysis.
- **Modify Validation Rules**: Update the SPF, DKIM, or DMARC logic in the Set nodes to align with your organization's security policies.
- **Expand IP Analysis**: Add more APIs or services to enrich IP reputation data, such as VirusTotal or AbuseIPDB.

This workflow provides a robust foundation for email security monitoring and can be tailored to fit your organization's unique requirements. With its modular design and integration options, it's a versatile tool to enhance your cybersecurity operations.
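For reference, the header analysis step can be prototyped with a Code node like the one below. This is a minimal sketch, assuming a previous node exposes the raw headers as an array of `{ name, value }` objects on `item.json.headers`; the real Outlook output may differ, so adjust the property path before relying on it.

```javascript
// n8n Code node sketch (JavaScript): read SPF/DKIM/DMARC results and a candidate
// originating IP from raw headers. The item.json.headers shape is an assumption.
return $input.all().map((item) => {
  const headers = item.json.headers || [];
  const get = (name) =>
    headers
      .filter((h) => h.name.toLowerCase() === name.toLowerCase())
      .map((h) => h.value);

  const authResults = (get('Authentication-Results')[0] || '').toLowerCase();
  const pick = (mech) => {
    const m = authResults.match(new RegExp(`${mech}=([a-z]+)`));
    return m ? m[1] : 'none';
  };

  // Rough heuristic: take the first IPv4 address found in the Received chain.
  // Production use may need to walk the chain and skip internal relays.
  const received = get('Received').join(' ');
  const ipMatch = received.match(/\b(?:\d{1,3}\.){3}\d{1,3}\b/);

  return {
    json: {
      spf: pick('spf'),
      dkim: pick('dkim'),
      dmarc: pick('dmarc'),
      senderIp: ipMatch ? ipMatch[0] : null,
    },
  };
});
```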
by Angel Menendez
Analyze Emails for Security Insights

Who is this for?
This workflow is ideal for IT professionals, security analysts, and organizations looking to enhance their email security practices. It is particularly useful for those who need to analyze Gmail email headers for IP tracking, spoofing detection, and sender reputation assessment.

What problem is this workflow solving?
Email spoofing and phishing attacks are significant cybersecurity threats. By analyzing email headers, this workflow provides detailed insights into the email's origin, authentication status, and the reputation of the sending IP address. It helps detect potential spoofing attempts and assess the trustworthiness of incoming emails.

What this workflow does
This n8n workflow automates the process of analyzing email headers received in Gmail. It performs the following key functions:
1. Triggering and Email Header Extraction: It monitors Gmail inboxes for new emails and extracts their headers for analysis.
2. Authentication Analysis: It validates SPF, DKIM, and DMARC authentication results to ensure the email adheres to industry-standard security protocols.
3. IP Analysis: The workflow extracts the originating IP address and evaluates its reputation and geographic details using external APIs.
4. Reputation Scoring: It integrates with IP Quality Score to detect spam activity and assess the sender's reputation.
5. Consolidation and Webhook Response: All results are aggregated into a single JSON response (see the aggregation sketch after this section), making it easy to integrate with third-party platforms or tools for further automation.

Setup
1. Authenticate Gmail: Configure the Gmail Trigger node with your Gmail account credentials.
2. API Keys (Optional): Obtain an API key for IP Quality Score (https://ipqualityscore.com) and ensure the IP-API endpoint is accessible. This step is optional, as ipqualityscore.com provides a limited number of free lookups each month. See more details here.
3. Activate the Workflow: Ensure the workflow is active to process incoming emails in real-time.

How to customize this workflow to your needs
- **Add Alerts**: Use the Gmail - Respond to Webhook node to trigger notifications in Slack, email, or any other communication channel.
- **Integrate with SIEM**: Forward the workflow output to SIEM tools like Splunk or the ELK Stack for further analysis.
- **Modify Validation Rules**: Update the SPF, DKIM, or DMARC logic in the Set nodes to align with your organization's security policies.
- **Expand IP Analysis**: Add more APIs or services to enrich IP reputation data, such as VirusTotal or AbuseIPDB.

This workflow provides a robust foundation for email security monitoring and can be tailored to fit your organization's unique requirements. With its modular design and integration options, it's a versatile tool to enhance your cybersecurity operations.
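The consolidation step can look roughly like the Code node below. This is a sketch under stated assumptions: the node names in the `$('...')` expressions are hypothetical placeholders for your own upstream nodes, and the field names (query, country, isp, fraud_score, recent_abuse) should be mapped to whatever your IP-API and IP Quality Score responses actually contain.

```javascript
// n8n Code node sketch (JavaScript): merge authentication, geo and reputation
// results into one object for the webhook response. Node names and field names
// below are assumptions — rename them to match your workflow.
const auth = $('Validate Auth Headers').first().json;   // hypothetical node name
const geo = $('IP-API Lookup').first().json;             // hypothetical node name
const reputation = $('IP Quality Score').first().json;   // hypothetical node name

return [
  {
    json: {
      messageId: auth.messageId,
      authentication: { spf: auth.spf, dkim: auth.dkim, dmarc: auth.dmarc },
      ip: {
        address: geo.query,
        country: geo.country,
        isp: geo.isp,
        fraudScore: reputation.fraud_score,
        recentAbuse: reputation.recent_abuse,
      },
      // Illustrative verdict rule — tune thresholds to your own policy.
      verdict:
        auth.spf === 'pass' &&
        auth.dkim === 'pass' &&
        auth.dmarc === 'pass' &&
        reputation.fraud_score < 75
          ? 'likely-legitimate'
          : 'needs-review',
    },
  },
];
```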
by Dat Proto
Introduction
This workflow backs up all of your existing workflows to a single GitHub repository. The backup folders are named based on the current backup date, using the default format "yyyy/MM/dd" (set up in the "Create sub path" node). Throughout the backup process, n8n informs the user via Discord with clear messages about started, successful, and failed backups. The workflow runs automatically on a schedule (see the "Schedule Trigger" node).

Tech Stack
The following nodes / services / libraries are used in this workflow:

Nodes:
- Discord: To send messages to the configured channel.
- N8N: To get all workflows' information.
- Github: To store backup data.
- Code: To run the data comparison (existing vs. latest workflow data — see the sketch after this section).
- Wait: To avoid the Discord message rate limit.

External libraries:
- Underscore.js: A JavaScript library that provides many common utility functions, saving time when using the Code node.

Guideline
1. Open the "Config" node and set up the following information:
   - repo_owner: Your GitHub username.
   - repo_name: The repository where you want to store the workflow backup data.
2. Open the "Create sub path" node and change the naming and path format of the backup folder(s).
3. Set up custom messages in the Discord nodes:
   - Starting Message: n8n informs the user when the workflow starts.
   - Inform Success Flows: After each successful backup, n8n notifies the user.
   - Inform Failed Flows: After each failed backup, n8n notifies the user so they can take appropriate action.
   - Completed Notifications: At the end, the workflow gives the user a summary.
4. Set up the "Schedule Trigger" node to change the default automated backup time.

Screenshots
Discord output
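The comparison in the Code node can be sketched as below. This is a minimal example, assuming each incoming item carries the latest workflow JSON on `item.json.workflow` and the previously backed-up version (if any) on `item.json.existing`; both property names are placeholders, and `require('underscore')` only works on self-hosted instances that allow external modules in the Code node.

```javascript
// n8n Code node sketch (JavaScript): decide per workflow whether the GitHub copy
// is up to date. Input property names are assumptions — rename to match your nodes.
const _ = require('underscore'); // requires external modules to be allowed for Code nodes

return $input.all().map((item) => {
  const latest = item.json.workflow;
  const existing = item.json.existing || null;

  let status;
  if (!existing) {
    status = 'create';                                   // no copy in the repo yet
  } else if (_.isEqual(existing.nodes, latest.nodes) &&
             _.isEqual(existing.connections, latest.connections)) {
    status = 'skip';                                     // nothing changed since last backup
  } else {
    status = 'update';                                   // content differs, push a new version
  }

  return {
    json: {
      name: latest.name,
      status,
      content: JSON.stringify(latest, null, 2),          // file body for the Github node
    },
  };
});
```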
by Angel Menendez
Who is this for?
This workflow is designed for IT teams, service desk personnel, and incident management professionals who need a streamlined way to monitor and report on recent ServiceNow incidents directly within Slack.

What problem is this workflow solving? / Use Case
Manually monitoring incidents in ServiceNow can be time-consuming, and keeping teams updated about new or specific incidents often involves additional manual effort. This workflow automates the process of querying recent incidents from ServiceNow based on user-defined parameters and delivering formatted results directly to Slack. It ensures faster response times and improved incident visibility.

What this workflow does
This workflow integrates Slack and ServiceNow to provide an automated system for retrieving and presenting incident details.
1. Slack User Interaction: Users initiate the workflow via a Slack modal form, selecting incident parameters like priority and state.
2. ServiceNow Query: The workflow queries ServiceNow for incidents matching the selected criteria.
3. Results Delivery: Incident results are sent back to Slack as a message formatted using Block Kit (see the formatting sketch after this section). If no results are found, the workflow notifies the user with a detailed message, either in a Slack channel or via direct message.
4. Error Handling: If no channel is selected or any issues arise, the workflow ensures a graceful fallback with appropriate notifications.

Setup Instructions
1. Slack Setup: Integrate Slack with n8n using a Slack app. Configure the modal form to accept parameters like priority and state. Check out this video for setting up a modal Slack app on YouTube.
2. ServiceNow Integration: Use ServiceNow credentials to connect with n8n. Ensure appropriate permissions for querying incidents.
3. n8n Workflow Configuration: Import this workflow into n8n. Verify all node configurations, particularly those for ServiceNow API queries and Slack outputs. Set up webhook URLs for Slack event handling.
4. Testing: Trigger the workflow from Slack to test modal inputs and incident queries. Confirm the output is correctly formatted and delivered to the intended Slack channel or user.

How to Customize this Workflow to Your Needs
- Modify the ServiceNow query logic to include additional filters or fields.
- Adjust the Slack Block Kit formatting to match your organization's preferred notification style.
- Use conditional logic to add more advanced handling for specific priorities or states.
- Expand the workflow to include escalation steps, such as notifying a specific team or creating follow-up tasks.

Workflow Highlights
- **Slack Modal Form**: Allows users to specify search criteria for incidents interactively.
- **Dynamic Results Delivery**: Automatically sends results to a Slack channel or direct message based on user input.
- **Error Handling**: Provides fallback notifications when no incidents are found or user inputs are incomplete.
- **Customizable Integration**: Easily adaptable to fit different organizational needs, including advanced filtering and formatting options.
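If you want to adjust the Block Kit formatting, a Code node along these lines shows the general shape. It is a sketch that assumes incidents arrive one per item with the standard incident columns (number, short_description, priority, state); adjust the property names to whatever your ServiceNow query returns.

```javascript
// n8n Code node sketch (JavaScript): turn ServiceNow incident records into a
// Slack Block Kit payload. Field names are the standard incident columns —
// adjust if your query selects different fields.
const incidents = $input.all().map((i) => i.json);

const blocks = [
  {
    type: 'header',
    text: { type: 'plain_text', text: `ServiceNow incidents found: ${incidents.length}` },
  },
];

for (const inc of incidents.slice(0, 20)) { // stay well under Slack's 50-block limit
  blocks.push({
    type: 'section',
    text: {
      type: 'mrkdwn',
      text: `*${inc.number}* - ${inc.short_description}\nPriority: ${inc.priority} | State: ${inc.state}`,
    },
  });
  blocks.push({ type: 'divider' });
}

return [{ json: { blocks } }];
```

The resulting `blocks` array can be passed directly to the Slack node's blocks field.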
by Boriwat Chanruang
Who is this for?
This workflow is designed for:
- **Content creators**, artists, or hobbyists looking to experiment with AI-generated art.
- **Small business owners** or **marketers** using LEGO-style designs for branding or promotions.
- **Developers** or **AI enthusiasts** wanting to automate image transformations through messaging platforms like LINE.

What problem is this workflow solving?
- Simplifies the process of creating custom AI-generated LEGO-style images.
- Automates the manual effort of transforming user-uploaded images into AI-generated artwork.
- Bridges the gap between messaging platforms (LINE) and advanced AI tools (DALL·E).
- Provides a seamless system for users to upload an image and receive an AI-transformed output without technical expertise.

What this workflow does
1. Image Upload via LINE: Users send an image to the LINE chatbot.
2. AI-Powered Prompt Creation: GPT generates a prompt to describe the uploaded image for LEGO-style conversion.
3. AI Image Generation: DALL·E 3 processes the prompt and creates a LEGO-style isometric image.
4. Image Delivery: The generated image is returned to the user in LINE (see the reply sketch after this section).

Setup
Prerequisites
- **LINE Developer Account** with API credentials.
- Access to the OpenAI API with DALL·E and GPT-4 capabilities.
- A configured n8n instance to run this workflow.

Steps
1. Environment Setup: Add your LINE API token and OpenAI credentials as environment variables (LINE_API_TOKEN, OPENAI_API_KEY) in n8n.
2. Configure LINE Webhook: Point the LINE webhook to your n8n instance.
3. Connect OpenAI: Set up OpenAI API credentials in the workflow nodes for GPT-4 and DALL·E.
4. Test Workflow: Upload a sample image in LINE and ensure it returns the LEGO-style AI image.

How to customize this workflow to your needs
- **Localization**: Modify response messages in LINE to fit your audience's language and tone.
- **Integration**: Add nodes to send notifications through other platforms like Slack or email.
- **Image Style**: Replace the LEGO-style image prompt with other artistic styles or themes.

Advanced Use Cases
- Art Contests: Users upload images and receive AI-enhanced outputs for community voting or branding.
- Marketing Campaigns: Quickly generate creative visual content for ads and promotions using customer-submitted photos.
- Education: Use the workflow to teach students about AI-generated art and automation through a hands-on approach.

Tips for Optimization
- **Error Handling**: Add fallback nodes to handle invalid images or API errors gracefully.
- **Logging**: Implement a logging mechanism to track requests and outputs for debugging and analytics.
- **Scalability**: Use queue-based systems or cloud scaling to handle large volumes of image requests.

Enhancements
- Add sticky notes in n8n to provide inline instructions for configuring each node.
- Create a tutorial video or documentation for first-time users to set up and customize the workflow.
- Include advanced filters to allow users to select from multiple styles beyond LEGO (e.g., pixel art, watercolor).

This workflow enables seamless interaction between messaging platforms and advanced AI capabilities, making it highly versatile for various creative and business applications.
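The image delivery step can be prototyped with a Code node like the one below. It is a sketch under stated assumptions: the node name "LINE Webhook" and the `imageUrl` property are placeholders for your own nodes, while the `replyToken` path and the image message shape follow the LINE Messaging API documentation.

```javascript
// n8n Code node sketch (JavaScript): build the LINE reply body once DALL·E has
// returned an image URL. Node name and imageUrl property are placeholders.
const replyToken = $('LINE Webhook').first().json.events[0].replyToken;
const imageUrl = $input.first().json.imageUrl;

return [
  {
    json: {
      replyToken,
      messages: [
        {
          type: 'image',
          originalContentUrl: imageUrl, // LINE requires publicly reachable HTTPS URLs
          previewImageUrl: imageUrl,
        },
      ],
    },
  },
];
```

An HTTP Request node can then POST this body to the LINE reply endpoint with your channel access token.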
by Oneclick AI Squad
This automated n8n workflow sets up a complete MERN Stack development environment on a Linux server by installing core technologies, development tools, package managers, global npm packages, deployment tools, build tools, and security configurations. It creates a dedicated developer user and configures essential settings for MERN projects.

What is MERN Stack Setup?
MERN Stack setup involves installing and configuring Node.js, MongoDB, Express.js, and React, along with additional tools and packages, to create a fully functional development environment for building MERN-based applications on a Linux system.

Good to Know
- The workflow is triggered manually and takes 10-15 minutes to complete.
- A dedicated developer user with proper permissions is created.
- Firewall configuration secures the development ports.
- An environment variables template is provided.
- All tools are installed and ready for immediate use.

How It Works
1. **Set Parameters** - Configures server host, user, password, setup type, Node.js version, MongoDB version, username, and user password.
2. **System Preparation** - Prepares the system for installation.
3. **Install Node.js** - Installs Node.js (v20 by default) with npm.
4. **Install MongoDB** - Installs MongoDB (v7.0 by default) with Compass & Shell.
5. **Install Git & GitHub CLI** - Installs Git and the GitHub CLI.
6. **Install Development Tools** - Installs VS Code, Docker, Docker Compose, Postman, Nginx, Redis, and PostgreSQL.
7. **Create Dev User** - Creates a development user account.
8. **Install Additional Tools** - Installs package managers (npm, Yarn, pnpm), global npm packages, deployment tools, build tools, and security tools.
9. **Final Configuration** - Configures the firewall, SSH keys, and an environment variables template.
10. **Setup Complete** - Marks the completion of the setup process.

How to Use
1. Import the workflow into n8n.
2. Configure the parameters in the Set Parameters node (server_host, server_user, server_password, setup_type, node_version, mongodb_version, username, user_password).
3. Run the workflow.
4. SSH into the server as the new developer user.
5. Start building MERN applications.

Requirements
- Linux server access with SSH
- Administrative privileges (root access)

Customizing This Workflow
- Adjust the setup_type parameter to customize the installation scope.
- Modify node_version or mongodb_version to use different versions.
- Change the username and user_password for the developer account.
by Sarfaraz Muhammad Sajib
This automation workflow captures incoming chat messages from your Tawk.to live chat widget and sends alert emails via Gmail to notify your support team instantly. It is designed to help you respond promptly to visitors and improve your customer support experience.

Prerequisites
- **Tawk.to account**: You must have an active Tawk.to account with a configured live chat widget on your website.
- **Gmail account**: A Gmail account with API access enabled and configured in n8n for sending emails.
- **n8n instance**: Access to an n8n workflow automation instance where you will import and configure this workflow.

Step-by-Step Setup Instructions

1. Configure Tawk.to Webhook
- Log in to your Tawk.to dashboard.
- Navigate to Administration > Webhooks.
- Click Add Webhook and enter the following:
  - URL: Your n8n webhook URL from the Receive Tawk.to Request node (e.g., https://your-n8n-instance.com/webhook/a4bf95cd-a30a-4ae0-bd2a-6d96e6cca3b4)
  - Method: POST
  - Events: Select the chat message event (e.g., Visitor Message or Chat Message Received)
- Save the webhook configuration.

2. Configure Gmail Credentials in n8n
- In your n8n instance, go to Credentials.
- Add a new Gmail OAuth2 credential:
  - Follow Google's instructions to create a project, enable the Gmail API, and obtain a client ID and secret.
  - Authenticate and authorize n8n to send emails via your Gmail account.

3. Import and Activate Workflow
- Import the provided workflow JSON into n8n.
- Verify the Receive Tawk.to Request webhook node path matches the webhook URL configured in Tawk.to.
- Enter the email address you want the alerts sent to in the Send alert email node's sendTo parameter.
- Activate the workflow.

Workflow Explanation
- Receive Tawk.to Request: This webhook node listens for POST requests from Tawk.to containing chat message data.
- Format the message: Extracts relevant data from the incoming payload, such as chat ID, visitor name, country, and message text, and assigns them to new fields for easy use downstream (see the sketch after this section).
- Send alert email: Uses the Gmail node to send a notification email to your support team with all relevant chat details formatted as a clear, concise text email.

Customization Guidance
- **Email Recipient**: Update the sendTo field in the Send alert email node to specify your support team's email address.
- **Email Content**: Modify the message template in the Send alert email node's message parameter to suit your tone or include additional details like timestamps or chat URLs.
- **Additional Processing**: You can extend the workflow by adding nodes for logging chats, triggering Slack notifications, or storing messages in a database.

By following these instructions, your support team will receive immediate email alerts whenever a new chat message arrives on your website, improving response times and customer satisfaction.
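The formatting step can be sketched as a Code node like this. It is a minimal example: the property paths (chatId, visitor.name, visitor.country, message.text) are assumptions based on typical Tawk.to webhook payloads, so inspect a real request received by the "Receive Tawk.to Request" node and adjust accordingly.

```javascript
// n8n Code node sketch (JavaScript): pull out the fields used in the alert email.
// Property paths below are assumptions — verify against a real Tawk.to payload.
const body = $input.first().json.body || $input.first().json;
const visitor = body.visitor || {};
const message = body.message || {};

return [
  {
    json: {
      chatId: body.chatId,
      visitorName: visitor.name || 'Unknown visitor',
      visitorCountry: visitor.country || 'Unknown',
      messageText: message.text || '',
      // Pre-built plain-text body for the "Send alert email" node.
      emailText: [
        'New chat message received on your website:',
        `Chat ID: ${body.chatId}`,
        `Visitor: ${visitor.name || 'Unknown'} (${visitor.country || 'n/a'})`,
        `Message: ${message.text || ''}`,
      ].join('\n'),
    },
  },
];
```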
by David Ashby
🛠️ SIGNL4 Tool MCP Server

Complete MCP server exposing all SIGNL4 Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance.
2. Activate the workflow to start your MCP server.
3. Copy the webhook URL from the MCP trigger node.
4. Connect AI agents using the MCP URL.

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every SIGNL4 Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the example after this section)
• Native Integration: Uses the official n8n SIGNL4 Tool node with full error handling

📋 Available Operations (2 total)
Every possible SIGNL4 Tool operation is included:

🔧 Alert (2 operations)
• Send an alert
• Resolve an alert

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native SIGNL4 Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every SIGNL4 Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
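To illustrate how the $fromAI() placeholders work, a parameter on one of the tool nodes can carry an expression along these lines. The key name and description below are illustrative placeholders, not values taken from this workflow; check the pre-built nodes for the exact expressions they ship with.

```javascript
// Illustrative n8n expression on a tool node parameter (e.g. the alert text field).
// The AI agent supplies the value at runtime; 'message' and the description are placeholders.
{{ $fromAI('message', 'Human-readable alert text describing the incident', 'string') }}
```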
by Shannon Atkinson
Template Description
WDF Top Keywords: This workflow is designed to streamline keyword research by automating the process of generating, filtering, and analyzing Google and YouTube keyword data. Ensure compliance with local regulations and API terms of service when using this workflow.

📌 Purpose
The WDF Top Keywords workflow automates collecting, processing, and managing keyword data for both Google and YouTube platforms. By leveraging multiple data sources and APIs, it ensures an efficient and scalable approach to identifying high-impact keywords for SEO, content creation, and marketing campaigns.

Key Features
- Automates the generation of keyword suggestions using autocomplete APIs.
- Integrates with NocoDB to store and manage keyword data.
- Filters keywords based on monthly search volume and cost-per-click (CPC).
- Supports bulk import of keyword data into structured databases.
- Outputs both Google and YouTube keyword insights, enabling informed decision-making.

🎯 Target Audience
This workflow is ideal for:
- Digital marketers aiming to optimize ad campaigns with data-driven insights.
- SEO specialists looking to identify high-potential keywords efficiently.
- Content creators seeking trending and relevant topics for their platforms.
- Agencies managing keyword research for multiple clients.

⚙️ How It Works
1. Trigger: The workflow runs on demand or at scheduled intervals.
2. Keyword Generation: Retrieves base keywords from NocoDB and generates autocomplete suggestions for Google and YouTube (see the sketch after this section).
3. Data Processing: Filters and formats keyword data based on specific criteria (e.g., search volume, CPC) and consolidates results for efficient storage and analysis.
4. Storage and Output: Saves data into structured NocoDB tables for tracking and reuse, and bulk imports monthly search volume statistics for detailed analysis.

🛠️ Key APIs and Tools Used
- **NocoDB**: Stores and organizes base and processed keyword data.
- **DataForSEO API**: Provides search volume and keyword performance metrics.
- **Google Autocomplete API**: Suggests relevant Google search terms.
- **YouTube Autocomplete API**: Suggests trending YouTube keywords.
- **Social Flood Docker Instance**: Serves as the local integration hub.

Setup Instructions
Required tools:
- NocoDB
- n8n
- DataForSEO account
- Social Flood Docker instance

Create the following NocoDB tables:
- Base Keyword Search
- Second Order Google Keywords
- Second Order YouTube Keywords
- Search Volume

This template empowers users to handle complex keyword research tasks effortlessly, saving time and providing actionable insights. Share this template to enhance your workflow efficiency!
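For context, the keyword generation step can be prototyped directly with a Code node like the one below. This is a sketch only: the template itself routes these calls through the Social Flood instance, the suggest endpoint used here is unofficial (undocumented and subject to change), and the `keyword` input field is an assumed name.

```javascript
// n8n Code node sketch (JavaScript): fetch autocomplete suggestions for one base
// keyword. Shown only to illustrate the data shape — the workflow uses Social Flood.
const keyword = $input.first().json.keyword; // assumes the base keyword arrives on this field

const googleUrl =
  `https://suggestqueries.google.com/complete/search?client=firefox&q=${encodeURIComponent(keyword)}`;
const youtubeUrl = `${googleUrl}&ds=yt`; // same endpoint, scoped to YouTube

const [google, youtube] = await Promise.all([
  this.helpers.httpRequest({ url: googleUrl, json: true }),
  this.helpers.httpRequest({ url: youtubeUrl, json: true }),
]);

// Responses are shaped like [query, [suggestion, suggestion, ...]].
return [
  {
    json: {
      keyword,
      googleSuggestions: google[1] || [],
      youtubeSuggestions: youtube[1] || [],
    },
  },
];
```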
by Łukasz
Who is it for
This workflow is for anyone who is using n8n. It's especially helpful if you are a DevOps engineer and your n8n instance is self-hosted. If you care a lot about security and the number of failed executions, and you already use InfluxDB to monitor the status of your systems, this will fit perfectly into your stack.

How it works
This automation is fairly simple. It uses native n8n nodes to gather data from the instance itself, parses this data into a format compatible with InfluxDB input (see the sketch after this section), and finally sends it to InfluxDB for further processing.

Remember to set up
Setup is really simple: you just need to provide three variables. The first is your InfluxDB URL, the second is your InfluxDB organization, and the third is your InfluxDB bucket name. Of course, to set up the n8n nodes and gather data from them, you will also need your instance API key. And that's all.

How it looks in InfluxDB?
See below.

Schedule Audits
Audits don't need to run often, but I would recommend running them on a regular basis. This way you can see real data series in InfluxDB. Once a day should be enough, but it depends on your n8n usage, of course.

Visit my profile for other automations for businesses. And if you are looking for dedicated software development, do not hesitate to reach out! You can also see automations in my Sailing Byte's GitHub N8N repository.
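The parsing step essentially produces InfluxDB line protocol, which is then written via the v2 write API (POST to `{influx_url}/api/v2/write?org={org}&bucket={bucket}&precision=s` with an `Authorization: Token <token>` header). Below is a minimal Code node sketch; the input shape (`item.json.section`, `item.json.riskCount`) is an assumption, so map it to whatever your audit-parsing node actually produces.

```javascript
// n8n Code node sketch (JavaScript): turn audit findings into InfluxDB line protocol,
// ready to be sent as the raw body of an HTTP Request node. Input fields are assumptions.
const escapeTag = (v) => String(v).replace(/[ ,=]/g, '\\$&'); // line-protocol tag escaping

const timestamp = Math.floor(Date.now() / 1000); // seconds, matches precision=s

const lines = $input.all().map((item) => {
  const section = escapeTag(item.json.section || 'unknown');
  const riskCount = Number(item.json.riskCount) || 0;
  // measurement,tag=value field=value timestamp
  return `n8n_audit,section=${section} risk_count=${riskCount}i ${timestamp}`;
});

// One newline-separated string becomes the request body.
return [{ json: { body: lines.join('\n') } }];
```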