by Viktor Klepikovskyi
Advanced Retry and Delay Logic

This template provides a robust solution for handling API rate limits and temporary service outages in n8n workflows. It overcomes the limitations of the default node retry settings, which cap retries at 5 and the delay between them at 5 seconds. By using a custom loop built from a Set, an If, and a Wait node, this workflow gives you complete control over the number of retries and the delay between them.

Instructions:
- Replace the placeholder HTTP Request node with your target node (the one that might fail).
- In the initial Set Fields node, modify the max_tries value to set the total number of attempts for your workflow.
- Adjust the delay_seconds value to define the initial delay between retries.
- Optionally, configure the Edit Fields node to implement exponential backoff by adjusting the delay_seconds expression (e.g., {{$json.delay_seconds * 2}}). A sketch of this loop logic follows below.

For a more detailed breakdown and tutorial of this template, you can find additional information here.
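For reference, here is a minimal sketch (written as an n8n Code node, run once for all items) of the counter and backoff state that the Set, If, and Wait nodes maintain. The field names max_tries, delay_seconds, and tries mirror the template's Set Fields node, but treat the exact names as assumptions and match them to your own workflow.

```javascript
// Minimal sketch of the retry-loop state, assuming the field names used by the
// template's Set Fields node (max_tries, delay_seconds) plus a tries counter.
const item = $input.all()[0].json;

const tries = (item.tries ?? 0) + 1;              // attempts made so far
const maxTries = item.max_tries ?? 10;            // total attempts allowed
const shouldRetry = tries < maxTries;             // feeds the If node's condition

// Exponential backoff: double the delay before the next attempt,
// equivalent to the {{$json.delay_seconds * 2}} expression in Edit Fields.
const delaySeconds = (item.delay_seconds ?? 5) * 2;

return [{ json: { ...item, tries, shouldRetry, delay_seconds: delaySeconds } }];
```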
by WeblineIndia
This n8n workflow automates the process of converting a newly stored PDF file from Google Drive into an HTML file and saving it back to Google Drive. The workflow is triggered whenever a new PDF is uploaded to a specific folder, ensuring seamless conversion and storage without any manual intervention. It provides an efficient, automated solution for converting PDFs to HTML, eliminating the need for manual file handling and ensuring a smooth document transformation process. It is particularly useful for scenarios where PDFs need to be dynamically converted and stored in an organised manner for web usage, archiving, or further processing.

Prerequisites: Before setting up this workflow, ensure the following:
- PDF.co API Key: Sign up at PDF.co and obtain an API key for PDF to HTML conversion.
- Proper Authentication: Ensure authentication is configured for Google Drive in n8n.

Customisation Options:
- Modify the API request to convert PDFs to other formats like Text, CSV, or XML.
- Extend the IF Node to reject files based on size or other properties.
- Send a notification once the conversion is complete using an Email or Telegram Node.

Steps:

Step 1: Google Drive Trigger Node (Watch for New Files)
- Click "Add Node" and search for Google Drive.
- Select "Google Drive Trigger" and add it to the workflow.
- Authenticate with your Google Account.
- Select the folder to monitor.
- Set the trigger to activate whenever a new file is added.
- Click "Execute Node" to test, then click "Save".

Step 2: IF Node (Check if File is a PDF)
- Click "Add Node" and search for IF.
- Add a condition to check if the file extension is .pdf.
- If true, send the file to the next step; if false, stop the workflow.
- Click "Execute Node" to test, then click "Save".

Step 3: HTTP Request Node (Convert PDF to HTML)
- Click "Add Node" and search for HTTP Request.
- Set the Method to POST.
- Enter the PDF.co API endpoint for PDF to HTML conversion.
- In the Headers, add the API key obtained from PDF.co.
- Send the binary PDF data as the request body.
- Click "Execute Node" to test, then click "Save".

Step 4: Function Node (Convert Response to Binary)
- Click "Add Node" and search for Function.
- Write a JavaScript function to transform the API response into a binary file (see the sketch after these steps).
- Click "Execute Node" to test, then click "Save".

Step 5: Google Drive Node (Save Converted HTML File)
- Click "Add Node" and search for Google Drive.
- Select "Upload File" as the action.
- Authenticate with your Google Account.
- Set the destination folder for storing the HTML file.
- Map the binary data from the Function Node.
- Click "Execute Node" to test, then click "Save".

Step 6: Connect & Test the Workflow
- Link the nodes in this order: Google Drive Trigger → IF Node → HTTP Request → Function Node → Google Drive Upload.
- Run the workflow manually.
- Upload a test PDF to Google Drive.
- Check Google Drive for the converted HTML file.

Who’s behind this? WeblineIndia’s AI development team. We've delivered 3500+ software projects across 25+ countries since 1999. From no-code automations to complex AI systems — our AI team builds tools that drive results. Looking to hire AI developers? Start with us.
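A minimal sketch of the Step 4 Function/Code node, assuming the HTTP Request node returned the converted HTML as plain text in a body property (for example when asking PDF.co for an inline response); adapt the property name and file name to whatever your response actually contains.

```javascript
// Wrap the HTML returned by the conversion API into n8n binary data so the
// Google Drive node in Step 5 can upload it as a file.
// Assumption: the previous node's JSON contains the HTML under `body`.
const html = $input.all()[0].json.body;

return [{
  json: { fileName: 'converted.html' },
  binary: {
    data: {
      data: Buffer.from(html, 'utf8').toString('base64'), // n8n stores binary as base64
      mimeType: 'text/html',
      fileName: 'converted.html',
    },
  },
}];
```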
by Jonathan
This workflow takes an alert from Syncro, determines whether it is an agent_offline_trigger type, works out whether it is a new alert or a close for an existing alert, and then submits it to OpsGenie. New alerts create a new alert in OpsGenie, and resolved alerts close the corresponding alert in OpsGenie. It doesn't require Google Sheets or any other lookup store, because OpsGenie lets you submit a unique ID (known as an alias) along with the alert, which can be referenced later when closing it (see the sketch below). The trigger type can be changed to suit your needs. You will need to create an API integration in OpsGenie. In Syncro, in addition to setting up the appropriate notification-to-webhook, you will also need a script that closes the agent_offline_trigger alert and an automated remediation to trigger that script when the asset goes offline (the script is queued and run when the asset comes back online).

> This workflow is part of an MSP collection; the original can be found here: https://github.com/bionemesis/n8nsyncro
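As a rough illustration of the alias mechanism, here is a sketch of the two OpsGenie calls the workflow ends up making; the Syncro payload field names (asset_id, asset_name) are assumptions, so map your own webhook fields.

```javascript
// Build a stable alias from the Syncro alert so OpsGenie can match the
// "create" and "close" requests to the same alert. Field names are assumed.
const alert = $input.all()[0].json;
const alias = `syncro-agent-offline-${alert.asset_id}`;   // hypothetical field

const createBody = {
  message: `Agent offline: ${alert.asset_name}`,          // hypothetical field
  alias,                                                  // OpsGenie deduplicates and closes by this
};

// New alert:      POST https://api.opsgenie.com/v2/alerts                     (body: createBody)
// Resolved alert: POST https://api.opsgenie.com/v2/alerts/{alias}/close?identifierType=alias
// Both calls use the header  Authorization: GenieKey <your OpsGenie API key>.
return [{ json: { alias, createBody } }];
```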
by n8n Team
This automated workflow watches a Typeform form; once a response is submitted, it is automatically created as a lead in Pipedrive. It supports custom fields (this workflow maps company size) and leaves notes in the note section based on the questions answered.

Prerequisites
- Typeform account, Typeform credentials, and a form for people to fill out
- Pipedrive account and Pipedrive credentials

Nodes
- Typeform node gets the data after the survey is completed
- Set node extracts data from the Typeform node and keeps only the relevant fields
- Function node maps the company size (a sketch follows below)
- Pipedrive node populates a pipeline with a deal and adds the custom fields
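A minimal sketch of the company-size mapping done in the Function node; the answer labels and Pipedrive option IDs below are placeholders, so substitute the option IDs of your own custom field.

```javascript
// Map Typeform's company-size answer to the option ID of a Pipedrive
// custom field. Labels and IDs are placeholders.
const sizeMap = {
  '1-10': 11,
  '11-50': 12,
  '51-200': 13,
  '201+': 14,
};

return $input.all().map((item) => ({
  json: {
    ...item.json,
    company_size_option: sizeMap[item.json.company_size] ?? null, // null if unmapped
  },
}));
```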
by Tom
This workflow shows a no-code approach to creating Salesforce accounts and contacts based on data coming from an Excel file. For Excel 365 (the online version of Microsoft Excel), check out this workflow instead.

To run the workflow:
- Make sure your Salesforce account is authenticated with n8n.
- Have a Microsoft Excel workbook with contacts and their account names ready. The workflow uses this example file, but you probably want to use your own data instead.
- Hit the Execute Workflow button at the bottom of the n8n canvas.

Here is how it works:
- The workflow first searches for existing Salesforce accounts by name.
- It then branches out depending on whether the account already exists in Salesforce or not.
- If an account does not exist yet, it will be created.
- The data is then normalised before both branches converge again (see the sketch below).
- Finally, the contacts are created or updated as needed in Salesforce.
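To illustrate the normalisation step, here is a sketch; the field names are assumptions based on typical Salesforce responses and spreadsheet columns, not the exact ones used in the template.

```javascript
// Give both branches (account found by search vs. account just created) the
// same shape, so the final contact upsert can treat every item identically.
return $input.all().map((item) => {
  const data = item.json;
  return {
    json: {
      accountId: data.Id ?? data.id,                 // search result vs. create result
      lastName: data['Last Name'] ?? data.lastName,  // spreadsheet column vs. mapped field
      email: data.Email ?? data.email,
    },
  };
});
```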
by rangelstoilov
This will send your GitHub notifications to a Discord webhook. Since GitHub doesn't send push notifications to mobile devices other than @mentions, this is a great workaround for receiving notifications on Discord. Using a GitHub trigger was not a good option, as there is no trigger for notifications, only events (which don't work on org repos); using an HTTP Request node against the notifications API works much better.

Tagging a user in the message: replace the placeholder in the message content with your Discord ID to get tagged when notifications are sent. To find your own ID, type a backslash followed by your username and the four-digit hash code in any channel; you can copy this by clicking on your username next to your profile picture. Example: \@username#9999

Enjoy!
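For reference, a sketch of the message body sent to the Discord webhook, with the user mention that makes Discord push a notification; the GitHub notification fields (subject.title, repository.full_name) come from the notifications API, and DISCORD_USER_ID is a placeholder for your own ID.

```javascript
// Build the Discord webhook payload; <@id> is Discord's user-mention syntax.
const DISCORD_USER_ID = '123456789012345678'; // placeholder, use your own ID

return $input.all().map((item) => ({
  json: {
    content: `<@${DISCORD_USER_ID}> ${item.json.repository.full_name}: ${item.json.subject.title}`,
  },
}));
```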
by Elliot Scribner
> Disclaimer: this workflow template uses the n8n-nodes-couchbase community package. Community nodes are unverified and usage of them comes with some risks. See here for instructions on installing n8n community nodes.

This template is intended for those interested in learning more about agentic AI workflow development, as well as those interested in learning how to use the Couchbase Search Vector Store node for practical applications. The workflow helps users decide on travel destinations based on descriptions of several points of interest loaded into Couchbase and retrieved using Vector Search.

How it Works
This template contains two workflows.

The Data Ingestion workflow uses the following nodes:
- Webhook node (to listen for HTTP requests)
- OpenAI Embeddings node (to generate embeddings on document insertion). Note: you’ll need to configure OpenAI credentials for this node.
- Couchbase Vector node (configured for document insertion)
- Default Data Loader and Recursive Character Text Splitter

The Chat Application workflow uses the following nodes:
- Chat Trigger node
- AI Tools Agent node connected to:
  - Gemini (as the Chat Model, for generating responses). Note: you will have to configure Gemini credentials for this node.
  - Simple Memory (as the Memory, to maintain conversation context)
  - Couchbase Search Vector node (as the Tool, for search)
  - OpenAI Embeddings node (as the embedding model for the Couchbase Search Vector node, to convert queries to vectors). Note: you’ll need to configure OpenAI credentials for this node.

Set up
Setting up this workflow is easy and only takes around 10 minutes.

Prerequisites:
- A Couchbase Cluster running the Search Service, and corresponding database access credentials. Be sure the Couchbase cluster allows the incoming IP address for n8n.
- A Vector Search Index created using this index definition.
- A bucket (called travel-agent), scope (called vectors), and collection (called points-of-interest) in your Cluster.
- OpenAI API Key
- Gemini API Key

Steps:
- Configure all necessary credentials (Couchbase, OpenAI, and Gemini).
- Select your bucket, scope, and collection for each of the Couchbase vector nodes.
- Ingest data, either using the cURL statements found on the sticky note within the workflow, or using this shell script to ingest 6 points of interest (a fetch-based equivalent is sketched below).
- Open the chat and test out your travel agent!

Customization and Next Steps
This workflow template can be made more robust by enhancing the data model to include more information about each point of interest. For example, adding price ranges, ideal seasons to visit, activity types, and accommodation options can help inform the LLM further about each destination, and in turn allow it to provide a more tailored response and be more helpful for travel planning. Alternatively, the data model could be entirely re-configured to suit a wide variety of other use cases. This template can serve as a building block for all sorts of AI agent applications using RAG and is not limited to travel recommendations.
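If you prefer scripting the ingestion instead of cURL, here is a minimal fetch-based sketch of one call to the Data Ingestion workflow's webhook; the URL path and document fields are assumptions, so match them to your own webhook and data model.

```javascript
// Send one point of interest to the ingestion webhook; the Data Ingestion
// workflow embeds it and stores it in the points-of-interest collection.
const response = await fetch('https://your-n8n-instance/webhook/travel-agent-ingest', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    name: 'Kyoto',                                                   // hypothetical fields
    description: 'Historic temples, gardens, and seasonal festivals.',
  }),
});
console.log(response.status, await response.text());
```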
by Davide
This workflow builds a conversational AI chatbot agent using the Claude 3.7 Sonnet model together with Anthropic’s new built-in tools: Web Search and Think. It enhances standard LLM capabilities with:
- Real-time web search, to answer up-to-date factual queries.
- A “Think” function, to support internal reasoning and memory-like behavior, as introduced by Anthropic.
- A memory buffer, allowing the agent to maintain conversation history.
- A system prompt defining clear ethical, functional, and formatting rules for interaction.

When a user sends a message (trigger), the chatbot evaluates the query, optionally performs a web search if needed, processes the result using Claude, and responds accordingly.

✅ Advantages
- 🧠 Enhanced Reasoning Abilities: the Think tool allows the agent to simulate deep thought processes or contextual memory storage, improving conversational intelligence.
- 🌐 Real-Time Knowledge via Web Search: the integrated web_search tool enables the agent to fetch the latest information from the internet, making it ideal for dynamic or news-driven use cases.
- 🧾 Contextual Responses with Memory Buffer: the memory buffer allows the agent to maintain state across messages, improving dialogue flow and continuity.
- 🛡️ Built-in Ethical Guidelines: the system prompt enforces privacy, factual integrity, neutrality, and ethical response generation, making the agent safe for public or enterprise use.

How It Works
- Chat Trigger: the workflow begins when a chat message is received via a webhook, which triggers the AI Agent to process the user's query.
- AI Agent Processing: the AI Agent analyzes the query to determine if it requires information from the website or external sources. For website-related queries, it uses the provided context; for external information, it employs the web_search tool to fetch up-to-date data from the internet; the Think tool is used for internal reasoning or caching thoughts without altering data.
- Language Model: the Anthropic Chat Model (Claude 3.7 Sonnet) generates responses based on the analyzed query, incorporating website context or web search results.
- Memory: a simple memory buffer retains context from previous interactions to maintain continuity in conversations.
- Output: the final response is delivered to the user, excluding internal processes like web searches or reasoning steps.

Set Up Steps
- Configure nodes:
  - Chat Trigger: set up the webhook to receive user messages.
  - AI Agent: define the system message and rules for handling queries.
  - Anthropic Chat Model: select the Claude 3.7 Sonnet model and configure parameters like maxTokensToSample.
  - Memory: initialize the memory buffer to store conversation context.
  - Tools: web_search: configure the HTTP request to the Anthropic API for web searches, including headers and authentication (a sketch of this request follows below); Think: set up the tool for internal reasoning.
- Connect nodes: link the Chat Trigger to the AI Agent, then connect the Anthropic Chat Model, Memory, and Tools (web_search and Think) to the AI Agent.
- Credentials: ensure the Anthropic API credentials are correctly configured for both the chat model and the web_search tool.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
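For the web_search tool configuration, the HTTP request looks roughly like the sketch below; the model name, tool type identifier, and header values are assumptions based on Anthropic's public documentation at the time of writing, so verify them against the current docs before use.

```javascript
// Sketch of the Messages API call the web_search tool node issues.
const response = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'x-api-key': 'YOUR_ANTHROPIC_API_KEY',      // same credential as the chat model
    'anthropic-version': '2023-06-01',
    'content-type': 'application/json',
  },
  body: JSON.stringify({
    model: 'claude-3-7-sonnet-20250219',
    max_tokens: 1024,
    messages: [{ role: 'user', content: 'What changed in the latest n8n release?' }],
    tools: [{ type: 'web_search_20250305', name: 'web_search', max_uses: 3 }],
  }),
});
console.log(await response.json());
```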
by LukaszB
n8n Workflow Backup to Google Drive – Automated Export of All Your Workflows

This workflow automatically creates backups of all your workflows in n8n and stores them as individual .json files in Google Drive. It's a fully automated system that helps developers, agencies, or automation teams ensure their automation logic is always safe, versioned, and ready to restore or share.

What is this for?
If you’re building and managing multiple automations inside n8n, losing a workflow due to accidental deletion or misconfiguration can cost you hours of work. This template solves that by exporting all your workflows into separate files and storing them in a dated Google Drive folder. It helps with disaster recovery, version tracking, and team collaboration — without any manual exporting.

How this works:
Once triggered (manually or via a schedule), the workflow performs the following steps:
- Creates a new folder in your Google Drive, named with today’s date (e.g. “Workflow Backups Monday 16-05-2025”).
- Connects to your n8n instance using the internal API and retrieves a list of all existing workflows (sketched below).
- Iterates over each workflow and converts it into a .json file using the built-in file conversion node.
- Uploads each individual .json file to the newly created folder in Google Drive.
- Optionally, finds and deletes old backup folders to keep your Google Drive clean and avoid clutter.

You get a clean, timestamped folder with all your flows — ready to restore, send, or store securely. You can trigger it manually or schedule it (e.g. to run weekly on Monday mornings).

How to set it up:
- Import the provided workflow JSON into your n8n instance.
- Set up your credentials: replace the placeholder “Google demo” with your actual Google Drive OAuth2 credentials in all Google Drive nodes, and replace the placeholder “n8n demo” with your n8n API credentials so the workflow can fetch your flows.
- Go to the node “Create new folder” and replace the folder ID with your own destination folder in Google Drive where backups should be stored.
- (Optional) Enable the “Schedule Trigger” to run the backup automatically once a week or on your preferred interval.

You’re ready to go — test it with the Manual Trigger first and check your Google Drive for results.
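For context, the call that retrieves the workflow list looks roughly like this; it follows the standard n8n public API conventions, so adjust the base URL to your own instance.

```javascript
// List all workflows via the n8n public API before exporting each one.
const response = await fetch('https://your-n8n-instance/api/v1/workflows', {
  headers: { 'X-N8N-API-KEY': 'YOUR_N8N_API_KEY' },
});
const { data } = await response.json();
// Each entry in `data` is a full workflow definition that the backup workflow
// converts to a .json file and uploads to the dated Google Drive folder.
console.log(data.map((wf) => wf.name));
```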
by Akhil Varma Gadiraju
Bulk Contact Deletion from HubSpot via Uploaded Excel / CSV File

This workflow allows you to automate the deletion of HubSpot contacts based on email addresses provided in an uploaded Excel (.xlsx) file. It's ideal for bulk-cleaning outdated or invalid contact data.

✅ Prerequisites
Before using this workflow, ensure you have the following:
- A valid HubSpot App Token with permissions to search and delete contacts.
- An Excel (.xlsx) file with a column labeled emails containing the contact emails to be deleted.
- An n8n self-hosted or cloud environment with the Webhook node enabled and accessible, and HubSpot node credentials configured.
- Basic familiarity with n8n node configuration for custom adjustments (optional).

📃 Sample Document: Download

🧠 n8n Workflow: Delete HubSpot Contacts from an Uploaded Excel File
This n8n workflow allows you to upload an Excel file containing contact email addresses. It will check each one in HubSpot and delete the contact if it exists.

🔗 Workflow Overview
- 📥 1. Trigger via Webhook (POST): the workflow starts when a .xlsx file is uploaded via an HTTP POST request to the webhook. This Excel file should contain a column with contact email addresses.
- 📄 2. Extract Data from Excel: the uploaded file is parsed, and its rows are converted into structured JSON items, making each email address available for further processing.
- 🧹 3. Normalize Data: the data is cleaned and normalized — for example, mapping column headers (e.g., emails) into a standard email field, ensuring consistent downstream logic.
- 🔁 4. Loop Through Contacts: each row (contact) is processed individually using batch looping. This allows for fine-grained error handling and sequential processing.
- 🔎 5. Search for Contact in HubSpot: for each contact, a search query is made in HubSpot based on the email address. The workflow only fetches the first result (if any).
- 🧪 6. Check if Contact Exists: an IF condition checks whether the contact was found (i.e., whether a HubSpot contact ID exists). If yes, proceed to delete the contact; if no, skip deletion and continue to the next.
- 🗑️ 7. Delete Contact: if a contact exists, it is deleted from HubSpot using its internal contact ID (the underlying API calls are sketched below).
- 🛠️ 8. Optional Placeholder for Post-Processing: a placeholder node named “Replace Me” is included for any custom logic you may want to add after the deletion step, such as logging, notifications, or writing to external storage.

✅ Use Cases
- Bulk delete old or bounced email addresses from HubSpot.
- Clean up contacts based on external suppression lists.
- Automate regular CRM hygiene processes.

💡 Suggested Enhancements
- ✍️ Log results to Google Sheets or a database
- 📬 Send a completion report via email or Slack
- 🔁 Add retry logic for temporary API failures
- 🔍 Validate email format before making requests

📎 Requirements
- n8n (self-hosted or cloud)
- HubSpot App Token (set up in n8n credentials)
- Excel file (.xlsx) with a column for email

📦 Files
No external files are required. All logic is contained within the n8n workflow.

🚀 Getting Started
- Deploy the workflow in n8n.
- Copy the webhook URL and use it in your app or API client (like Postman).
- Upload an Excel file containing contact emails via POST request.
- Watch as it searches and deletes matches in HubSpot.
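For reference, the search and delete steps map onto HubSpot's CRM v3 endpoints roughly as sketched below; this is a simplified stand-alone example, not the exact node configuration.

```javascript
// Search a contact by email, then delete it by ID if found.
const TOKEN = 'YOUR_HUBSPOT_APP_TOKEN';
const email = 'old.contact@example.com';

const search = await fetch('https://api.hubapi.com/crm/v3/objects/contacts/search', {
  method: 'POST',
  headers: { Authorization: `Bearer ${TOKEN}`, 'Content-Type': 'application/json' },
  body: JSON.stringify({
    filterGroups: [{ filters: [{ propertyName: 'email', operator: 'EQ', value: email }] }],
    limit: 1,
  }),
});
const { results } = await search.json();

if (results?.length) {
  // Delete the contact using its internal HubSpot ID.
  await fetch(`https://api.hubapi.com/crm/v3/objects/contacts/${results[0].id}`, {
    method: 'DELETE',
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
}
```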
by Zacharia Kimotho
This workflow generates keywords for SEO and articles. To get started, use the workflow as it is: just call the webhook URL with a query parameter, q={{ $keywords }}. For example, you can call it with ?q=keyword research. This returns a list of keywords as an array. The workflow can be used by SEO pros, content marketers, and social media marketers to generate relevant keywords for their needs.
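Example call, with the URL replaced by the webhook URL of your own n8n instance:

```javascript
// Ask the webhook for keywords related to "keyword research".
const res = await fetch(
  'https://your-n8n-instance/webhook/keywords?q=' + encodeURIComponent('keyword research'),
);
const keywords = await res.json(); // expected: an array of related keywords
console.log(keywords);
```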
by Matheus Weckwerth
Flow Start: the flow starts upon receiving an HTTP GET call.
- Webhook: receives the HTTP GET call and triggers the flow.
- Database: connects to the database (Customer Datastore) to retrieve all necessary information (getAllPeople).
- Data Processing:
  - Variable Insertion: the retrieved data is inserted into a variable.
  - Variable Aggregation: the variables are aggregated and prepared for use in FlutterFlow (see the sketch below).
- Webhook Response: sends the response back through the Webhook with the processed data ready for use in FlutterFlow.
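A minimal sketch of the aggregation step, assuming the demo Customer Datastore fields (name, email); it collapses all rows into a single item so the Webhook Response node can hand FlutterFlow one JSON object.

```javascript
// Aggregate the rows returned by getAllPeople into one response payload.
const people = $input.all().map((item) => ({
  name: item.json.name,   // assumed field
  email: item.json.email, // assumed field
}));

return [{ json: { count: people.length, people } }];
```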