by rangelstoilov
This workflow sends your GitHub notifications to a Discord webhook. Since GitHub doesn't send push notifications to mobile devices for anything other than @mentions, this is a great workaround for receiving all your notifications on Discord. Using a GitHub trigger was not a good option, as there is no trigger for notifications, only for events (which don't work on org repos). Using an HTTP request against the notifications API works much better.

++TAGGING A USER IN THE MESSAGE:++ Replace ** with your Discord ID to get tagged when notifications are sent. To find your own ID, type a backslash in any channel followed by your username and the 4-digit hash code (you can copy this by clicking on your username next to your profile picture). Example: \@username#9999. Enjoy!
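For reference, a mention in a webhook message is just `<@USER_ID>` in the message content. A minimal sketch of the Discord call outside n8n, with a hypothetical webhook URL and user ID:

```python
import requests  # pip install requests

# Hypothetical values - substitute your own webhook URL and Discord user ID.
WEBHOOK_URL = "https://discord.com/api/webhooks/123456789/abcdefg"
DISCORD_USER_ID = "111111111111111111"

payload = {
    # "<@ID>" renders as a mention, which is what triggers the push notification.
    "content": f"<@{DISCORD_USER_ID}> New GitHub notification: review requested"
}

resp = requests.post(WEBHOOK_URL, json=payload, timeout=10)
resp.raise_for_status()  # Discord answers 204 No Content on success
```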
by Eduard
This workflow demonstrates how easy it is to export an SQL query to Excel automatically! Before running the workflow, please make sure you have access to a remote SQL server (MS SQL, MySQL, PostgreSQL etc.) with a sample table:

```
Date,Band,ConcertName,Country,City,Location,LocationAddress
2023-05-28,Ozzy Osbourne,No More Tours 2 - Special Guest: Judas Priest,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain"
2023-05-08,Elton John,Farewell Yellow Brick Road Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain"
2023-05-26,Hans Zimmer Live,Europe Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain"
2023-07-07,Depeche Mode,Memento Mori World Tour 2023,Germany,Berlin,Olympiastadion Berlin,"Olympischer Platz 3, 14053 Berlin-Charlottenburg"
```

The detailed process is explained in the tutorial: https://blog.n8n.io/export-sql-to-excel
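Outside of n8n, the same query-to-Excel step can be sketched in a few lines of Python. This is only an illustration of the idea, assuming a MySQL server, the table above, and placeholder credentials:

```python
import pandas as pd
from sqlalchemy import create_engine  # pip install pandas sqlalchemy pymysql openpyxl

# Placeholder connection string - adjust the driver, host, and credentials.
engine = create_engine("mysql+pymysql://user:password@db.example.com/concerts")

# Run the query and write the result set straight to an .xlsx file.
df = pd.read_sql("SELECT Date, Band, ConcertName, City, Location FROM concerts", engine)
df.to_excel("concerts.xlsx", index=False)
```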
by n8n Team
This workflow demonstrates how to export SQL to XML and present the data nicely formatted using an XSL template. The upper part of the workflow starts with a webhook. It then fetches several random records from the SQL table and converts them into an XML string. Finally, an XML file is created that contains a link to the XML stylesheet file. The lower part of the workflow contains a helper webhook that reads an XSL template from a GitHub gist and serves it back via the Respond to Webhook node. This is required to comply with the CORS rules of modern browsers, which dictate that the XML data and the stylesheet file must come from the same domain.
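The stylesheet link in question is the standard `xml-stylesheet` processing instruction. A minimal sketch of what the served XML might look like (the `href` is a placeholder for the helper webhook's URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- href must resolve to the same origin as this XML, hence the helper webhook -->
<?xml-stylesheet type="text/xsl" href="https://your-n8n-host/webhook/style.xsl"?>
<records>
  <record><band>Depeche Mode</band><city>Berlin</city></record>
</records>
```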
by n8n Team
This workflow automatically imports data from a CSV file located at a specific URL and then updates a Google Sheets document with the imported data. Step by step, it does the following:

1. The workflow is started manually using the "When you click 'Execute Workflow'" node.
2. The CSV file is fetched from the specified URL https://opendata.ecdc.europa.eu/covid19/testing/csv/data.csv by the "Upload CSV" node.
3. The "Import CSV" node takes the fetched CSV file and converts it into JSON-formatted data.
4. The "Add Unique Field" node generates a unique key by combining the 'country_code' and 'year_week' fields from the JSON data; this key is used later in the Google Sheets document.
5. The 'Keep only DACH in 2023' node filters the data to keep only records where 'country_code' is either 'DE', 'AT', or 'CH' and 'year_week' starts with '2023'. Google's API limits the speed of read and write operations, so only a subset of the data is taken.
6. The filtered data is loaded into the specified Google Sheets document via the 'Load to Spreadsheet' node. The operation is set to 'appendOrUpdate', and the document ID and sheet name are specified. The previously generated 'unique_key' field is set as the key to match columns on.
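The key-building and filtering steps are easy to express in plain Python. A sketch using the field names from the description (the separator in the composite key is an assumption):

```python
records = [
    {"country_code": "DE", "year_week": "2023-W01", "new_cases": 100},
    {"country_code": "FR", "year_week": "2023-W01", "new_cases": 250},
]

for r in records:
    # Composite key that appendOrUpdate can use to match existing rows.
    r["unique_key"] = f"{r['country_code']}-{r['year_week']}"

# Mirror of the 'Keep only DACH in 2023' node.
dach_2023 = [
    r for r in records
    if r["country_code"] in ("DE", "AT", "CH") and r["year_week"].startswith("2023")
]
```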
by Elliot Scribner
> Disclaimer: this workflow template uses the n8n-nodes-couchbase community package. Community nodes are unverified and usage of them comes with some risks. See here for instructions on installing n8n community nodes.

This template is intended for those interested in learning more about Agentic AI workflow development, as well as those interested in learning how to use the Couchbase Search Vector Store node for practical applications. The workflow helps users decide on travel destinations based on descriptions of several points of interest loaded into Couchbase and retrieved using Vector Search.

How it Works

This template contains two workflows.

The Data Ingestion workflow uses the following nodes:
- Webhook node (to listen for HTTP requests)
- OpenAI Embeddings node (to generate embeddings on document insertion). Note: you'll need to configure OpenAI credentials for this node.
- Couchbase Vector node (configured for document insertion)
- Default Data Loader and Recursive Character Text Splitter

The Chat Application workflow uses the following nodes:
- Chat Trigger node
- AI Tools Agent node, connected to:
  - Gemini (as the Chat Model, for generating responses). Note: you will have to configure Gemini credentials for this node.
  - Simple Memory (as the Memory, to maintain conversation context)
  - Couchbase Search Vector node (as the Tool, for search)
  - OpenAI Embeddings node (as the Embedding model for the Couchbase Search Vector node, to convert queries to vectors). Note: you'll need to configure OpenAI credentials for this node.

Set up

Setting up this workflow is easy and only takes around 10 minutes.

Prerequisites
- A Couchbase cluster running the Search Service, and corresponding database access credentials. Be sure the Couchbase cluster allows the incoming IP address of n8n.
- A Vector Search index created using this index definition.
- A bucket (called travel-agent), scope (called vectors), and collection (called points-of-interest) in your cluster.
- An OpenAI API key.
- A Gemini API key.

Steps
1. Configure all necessary credentials (Couchbase, OpenAI, and Gemini).
2. Select your bucket, scope, and collection for each of the Couchbase vector nodes.
3. Ingest data, either using the cURL statements found on the sticky note within the workflow, or using this shell script to ingest 6 points of interest (see the sketch at the end of this section).
4. Open the chat and test out your travel agent!

Customization and Next Steps

This workflow template can be made more robust by enhancing the data model to include more information about each point of interest. For example, adding price ranges, ideal seasons to visit, activity types, and accommodation options can inform the LLM further about each destination, and in turn allow it to provide a more tailored response and be more helpful for travel planning. Alternatively, the data model could be entirely re-configured to suit a wide variety of other use cases. This template can serve as a building block for all sorts of AI Agent applications using RAG and is not limited to travel recommendations.
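For step 3 of the setup above (data ingestion), here is a minimal sketch of a single ingestion request. The webhook URL and payload fields are hypothetical; match them to your Webhook node's path and your Default Data Loader configuration:

```python
import requests  # pip install requests

# Hypothetical webhook URL - use the path configured on your Webhook node.
N8N_WEBHOOK = "https://your-n8n-host/webhook/ingest-poi"

point_of_interest = {
    "name": "Sagrada Familia",
    "city": "Barcelona",
    "description": "Gaudi's unfinished basilica, famous for its organic architecture.",
}

# Each POST becomes one embedded document in the points-of-interest collection.
resp = requests.post(N8N_WEBHOOK, json=point_of_interest, timeout=30)
resp.raise_for_status()
```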
by Davide
This workflow builds a conversational AI chatbot agent using the Claude 3.7 Sonnet model with Anthropic's new Web Search and Think tools. It enhances standard LLM capabilities with:

- **Real-time web search**, to answer up-to-date factual queries.
- A **"Think" function**, to support internal reasoning and memory-like behavior.
- A **memory buffer**, allowing the agent to maintain conversation history.
- A **system prompt** defining clear ethical, functional, and formatting rules for interaction.

When a user sends a message (trigger), the chatbot evaluates the query, optionally performs a web search if needed, processes the result using Claude, and responds accordingly.

✅ Advantages

- 🧠 **Enhanced Reasoning Abilities**: The Think tool allows the agent to simulate deep thought processes or contextual memory storage, improving conversational intelligence.
- 🌐 **Real-Time Knowledge via Web Search**: The integrated web_search tool enables the agent to fetch the latest information from the internet, making it ideal for dynamic or news-driven use cases.
- 🧾 **Contextual Responses with Memory Buffer**: The memory buffer allows the agent to maintain state across messages, improving dialogue flow and continuity.
- 🛡️ **Built-in Ethical Guidelines**: The system prompt enforces privacy, factual integrity, neutrality, and ethical response generation, making the agent safe for public or enterprise use.

How It Works

1. Chat Trigger: The workflow begins when a chat message is received via a webhook, which triggers the AI Agent to process the user's query.
2. AI Agent Processing: The AI Agent analyzes the query to determine whether it requires information from the website or from external sources. For website-related queries, it uses the provided context; for external information, it employs the web_search tool to fetch up-to-date data from the internet. The Think tool is used for internal reasoning or caching thoughts without altering data.
3. Language Model: The Anthropic Chat Model (Claude 3.7 Sonnet) generates responses based on the analyzed query, incorporating website context or web search results.
4. Memory: A simple memory buffer retains context from previous interactions to maintain continuity in conversations.
5. Output: The final response is delivered to the user, excluding internal processes like web searches or reasoning steps.

Set Up Steps

1. Configure nodes:
   - Chat Trigger: Set up the webhook to receive user messages.
   - AI Agent: Define the system message and rules for handling queries.
   - Anthropic Chat Model: Select the Claude 3.7 Sonnet model and configure parameters like maxTokensToSample.
   - Memory: Initialize the memory buffer to store conversation context.
   - Tools: For web_search, configure the HTTP request to the Anthropic API, including headers and authentication (a sketch of this request follows below); for Think, set up the tool for internal reasoning.
2. Connect nodes: Link the Chat Trigger to the AI Agent, then connect the Anthropic Chat Model, Memory, and Tools (web_search and Think) to the AI Agent.
3. Credentials: Ensure the Anthropic API credentials are correctly configured for both the chat model and the web_search tool.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
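For the web_search configuration, a rough sketch of the HTTP call to Anthropic's Messages API. This is an illustration, not the node's exact settings, and the web search tool type string should be verified against Anthropic's current documentation:

```python
import os
import requests  # pip install requests

resp = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": os.environ["ANTHROPIC_API_KEY"],
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-7-sonnet-20250219",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "What happened in AI news this week?"}],
        # Server-side web search tool; check the type string in Anthropic's docs.
        "tools": [{"type": "web_search_20250305", "name": "web_search", "max_uses": 5}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["content"])
```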
by Yaron Been
Description

This workflow automatically searches multiple flight booking websites to find the cheapest flights for your desired routes. It leverages web scraping to compare prices across platforms, helping you save money on air travel.

Overview

This workflow automatically searches multiple flight booking websites to find the cheapest flights for your desired routes. It uses Bright Data to scrape flight prices and can notify you when prices drop below your target threshold.

Tools Used

- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping flight prices from booking websites.
- **Notification Services:** Email, SMS, or other messaging platforms.

How to Install

1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: Configure your preferred notification method.
4. Customize: Set your routes, date ranges, and price thresholds.

Use Cases

- **Frequent Travelers:** Find the best deals for your regular routes.
- **Travel Agencies:** Monitor flight prices for client bookings.
- **Budget Travelers:** Get notified when flights to your dream destination become affordable.

Connect with Me

- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #travel #flights #brightdata #dealalerts #webscraping #flightdeals #cheapflights #travelhacks #budgettravel #travelplanning #airfare #flightprices #travelautomation #n8nworkflow #workflow #nocode #traveltech #flightbooking #savemoney #traveltools #flightcomparison #bestflightdeals #travelsmarter #automatedtravel
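The threshold check itself is a one-liner once the scraped prices are normalized. A tiny sketch with hypothetical field names (the real schema depends on your Bright Data configuration):

```python
scraped_flights = [
    {"route": "BER-JFK", "price_eur": 412},
    {"route": "BER-JFK", "price_eur": 289},
]

TARGET_PRICE = 300  # your alert threshold

for deal in (f for f in scraped_flights if f["price_eur"] <= TARGET_PRICE):
    # In the workflow, this branch feeds the notification node (email, SMS, ...).
    print(f"Price drop on {deal['route']}: {deal['price_eur']} EUR")
```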
by LukaszB
n8n Workflow Backup to Google Drive – Automated Export of All Your Workflows

This workflow is designed to automatically create backups of all your workflows in n8n and store them as individual .json files in Google Drive. It's a fully automated system that helps developers, agencies, or automation teams ensure their automation logic is always safe, versioned, and ready to restore or share.

What is this for?

If you're building and managing multiple automations inside n8n, losing a workflow due to accidental deletion or misconfiguration can cost you hours of work. This template solves that by exporting all your workflows into separate files and storing them in a dated Google Drive folder. It helps with disaster recovery, version tracking, and team collaboration, without any manual exporting.

How this works

Once triggered (manually or via a schedule), the workflow performs the following steps:

- Creates a new folder in your Google Drive, named with today's date (e.g. "Workflow Backups Monday 16-05-2025").
- Connects to your n8n instance using the internal API and retrieves a list of all existing workflows (see the sketch below).
- Iterates over each workflow and converts it into a .json file using the built-in file conversion node.
- Uploads each individual .json file to the newly created folder in Google Drive.
- Optionally, finds and deletes old backup folders to keep your Google Drive clean and avoid clutter.

You get a clean, timestamped folder with all your flows, ready to restore, send, or store securely. You can trigger it manually or schedule it (e.g., to run weekly on Monday mornings).

How to set it up

1. Import the provided workflow JSON into your n8n instance.
2. Set up your credentials:
   - Replace the placeholder "Google demo" with your actual Google Drive OAuth2 credentials in all Google Drive nodes.
   - Replace the placeholder "n8n demo" with your n8n API credentials so the workflow can fetch your flows.
3. Go to the node "Create new folder" and replace the folder ID with your own destination folder in Google Drive where backups should be stored.
4. (Optional) Enable the "Schedule Trigger" to run the backup automatically once a week or on your preferred interval.

You're ready to go: test it with the Manual Trigger first and check your Google Drive for results.
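The n8n half of this is just the public REST API. A local sketch of the same fetch-and-save loop, assuming a placeholder instance URL and an API key in the environment (large instances would also need cursor pagination):

```python
import json
import os
import requests  # pip install requests

N8N_URL = "https://your-n8n-host"  # placeholder - your instance URL
headers = {"X-N8N-API-KEY": os.environ["N8N_API_KEY"]}

# Same call the workflow makes: list every workflow on the instance.
workflows = requests.get(
    f"{N8N_URL}/api/v1/workflows", headers=headers, timeout=30
).json()["data"]

# One .json file per workflow, mirroring what the template uploads to Drive.
for wf in workflows:
    with open(f"{wf['name']}.json", "w") as f:
        json.dump(wf, f, indent=2)
```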
by Akhil Varma Gadiraju
Bulk Contact Deletion from HubSpot via Uploaded Excel / CSV File

This workflow allows you to automate the deletion of HubSpot contacts based on email addresses provided in an uploaded Excel (.xlsx) file. It's ideal for bulk-cleaning outdated or invalid contact data.

✅ Prerequisites

Before using this workflow, ensure you have the following:

- A valid HubSpot App Token with permissions to search and delete contacts.
- An Excel (.xlsx) file with a column labeled emails containing the contact emails to be deleted.
- An n8n self-hosted or cloud environment with:
  - Webhook node enabled and accessible.
  - HubSpot node credentials configured.
- Basic familiarity with n8n node configuration for custom adjustments (optional).

📃 Sample Document Download

🧠 n8n Workflow: Delete HubSpot Contacts from an Uploaded Excel File

This n8n workflow allows you to upload an Excel file containing contact email addresses. It will check each one in HubSpot and delete the contact if it exists.

🔗 Workflow Overview

📥 1. Trigger via Webhook (POST)
The workflow starts when a .xlsx file is uploaded via an HTTP POST request to the webhook. This Excel file should contain a column with contact email addresses.

📄 2. Extract Data from Excel
The uploaded file is parsed, and its rows are converted into structured JSON items, making each email address available for further processing.

🧹 3. Normalize Data
The data is cleaned and normalized: for example, column headers (e.g., emails) are mapped into a standard email field, ensuring consistent downstream logic.

🔁 4. Loop Through Contacts
Each row (contact) is processed individually using batch looping. This allows for fine-grained error handling and sequential processing.

🔎 5. Search for Contact in HubSpot
For each contact, a search query is made in HubSpot based on the email address. The workflow only fetches the first result (if any).

🧪 6. Check if Contact Exists
An IF condition checks whether the contact was found (i.e., whether a HubSpot contact ID exists):
✅ Yes → proceed to delete the contact.
❌ No → skip deletion and continue with the next one.

🗑️ 7. Delete Contact
If a contact exists, it is deleted from HubSpot using its internal contact ID. (A sketch of the underlying API calls follows below.)

🛠️ 8. Optional Placeholder for Post-Processing
A placeholder node named "Replace Me" is included for any custom logic you may want to add after the deletion step, such as logging, notifications, or writing to external storage.

✅ Use Cases

- Bulk delete old or bounced email addresses from HubSpot.
- Clean up contacts based on external suppression lists.
- Automate regular CRM hygiene processes.

💡 Suggested Enhancements

- ✍️ Log results to Google Sheets or a database
- 📬 Send a completion report via email or Slack
- 🔁 Add retry logic for temporary API failures
- 🔍 Validate email format before making requests

📎 Requirements

- n8n (self-hosted or cloud)
- HubSpot App Token (set up in n8n credentials)
- Excel file (.xlsx) with a column for email

📦 Files

No external files are required. All logic is contained within the n8n workflow.

🚀 Getting Started

1. Deploy the workflow in n8n.
2. Copy the webhook URL and use it in your app or API client (like Postman).
3. Upload an Excel file containing contact emails via a POST request.
4. Watch as it searches for and deletes matches in HubSpot.
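Steps 5–7 map onto two HubSpot CRM v3 endpoints. A minimal sketch of that search-then-delete pair outside n8n (token handling and error handling kept deliberately simple):

```python
import os
import requests  # pip install requests

HEADERS = {"Authorization": f"Bearer {os.environ['HUBSPOT_APP_TOKEN']}"}

def delete_contact_by_email(email: str) -> None:
    # Step 5: search for the contact by its email property.
    search = requests.post(
        "https://api.hubapi.com/crm/v3/objects/contacts/search",
        headers=HEADERS,
        json={
            "filterGroups": [{"filters": [
                {"propertyName": "email", "operator": "EQ", "value": email}
            ]}],
            "limit": 1,  # the workflow only fetches the first result
        },
        timeout=30,
    ).json()

    # Step 6: skip if no contact was found.
    if not search.get("results"):
        return

    # Step 7: delete by internal contact ID.
    contact_id = search["results"][0]["id"]
    requests.delete(
        f"https://api.hubapi.com/crm/v3/objects/contacts/{contact_id}",
        headers=HEADERS,
        timeout=30,
    ).raise_for_status()
```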
by Zacharia Kimotho
This workflow is aimed at generating keywords for SEO and articles. To get started, use the workflow as it is: you just call the webhook URL with a query parameter, as in q={{ $keywords }}. For example, you can call it using ?q=keyword research. This will give you back a list of keywords as an array. This system can be used by SEO pros, content marketers, and social media marketers to generate relevant keywords for their users' needs.
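Calling the webhook is a single GET request. A sketch with a placeholder webhook URL (use the one shown on your Webhook node):

```python
import requests  # pip install requests

resp = requests.get(
    "https://your-n8n-host/webhook/keywords",  # placeholder URL
    params={"q": "keyword research"},          # becomes ?q=keyword%20research
    timeout=30,
)
print(resp.json())  # an array of suggested keywords
```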
by Matheus Weckwerth
Flow Start: The flow starts upon receiving an HTTP GET call.

1. Webhook: Receives the HTTP GET call and triggers the flow.
2. Database: Connects to the database (Customer Datastore) to retrieve all necessary information (getAllPeople).
3. Data Processing:
   - Variable Insertion: The retrieved data is inserted into a variable.
   - Variable Aggregation: The variables are aggregated and prepared for use in FlutterFlow.
4. Webhook Response: Sends the response back through the webhook with the processed data ready for use in FlutterFlow.
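The aggregation step prepares a single JSON body for the webhook response, since FlutterFlow parses one response object rather than a stream of items. A rough sketch of what that shape might look like (field names hypothetical):

```python
# One JSON object per person, as returned by the datastore node.
people = [
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "Linus", "email": "linus@example.com"},
]

# Aggregation: collapse the item list into a single payload so the
# webhook response returns one JSON body that FlutterFlow can parse.
response_body = {"people": people, "count": len(people)}
```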
by ConvertAPI
Who is this for?
For developers and organizations that need to protect PDF files with a password.

What problem is this workflow solving?
Protecting PDF files with a password.

What this workflow does
- Downloads the PDF file from the web.
- Protects the PDF file with a password.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Change the password in the UserPassword parameter.
- Optionally, additional Body Parameters can be added for the converter.
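For reference, the same download-protect-save sequence sketched against ConvertAPI's HTTP interface. The endpoint path and response shape should be verified against the endpoint list mentioned above; the source URL and password here are placeholders:

```python
import base64
import requests  # pip install requests

SECRET = "your-convertapi-secret"  # from your ConvertAPI account

# 1. Download the source PDF from the web.
pdf = requests.get("https://example.com/sample.pdf", timeout=30).content

# 2. Protect it with a user password (pdf -> protect converter).
resp = requests.post(
    f"https://v2.convertapi.com/convert/pdf/to/protect?Secret={SECRET}",
    files={"File": ("sample.pdf", pdf, "application/pdf")},
    data={"UserPassword": "my-strong-password"},
    timeout=60,
)
resp.raise_for_status()

# 3. Store the result locally; ConvertAPI returns the file inline as base64.
file_data = resp.json()["Files"][0]["FileData"]
with open("protected.pdf", "wb") as f:
    f.write(base64.b64decode(file_data))
```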