by Harshil Agrawal
This workflow allows you to release a new version via a Telegram bot command and can be used in your Continuous Delivery pipeline. Telegram Trigger node: triggers the workflow when a message is sent to the bot. If you want to trigger the workflow from a different messaging platform or service, replace the Telegram Trigger node with that service's Trigger node. IF node: checks the incoming command; if the command is not deploy, the IF node returns false, otherwise true. Set node: extracts the version value from the Telegram message and sets it; this value is used later in the workflow. GitHub node: creates a new release, using the version from the Set node as the tag. NoOp node: adding this node is optional.
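As a minimal sketch, the command parsing done by the IF and Set nodes can be expressed like this. The `/deploy 1.2.3` message shape is an assumption; adapt the pattern to whatever command format your bot expects.

```javascript
// Parse a Telegram bot command such as "/deploy 1.2.3".
// Returns null for any other command (the IF node's "false" branch).
function parseDeployCommand(text) {
  const match = /^\/deploy\s+(\S+)/.exec(text.trim());
  if (!match) return null;        // not a deploy command
  return { version: match[1] };   // later used as the GitHub release tag
}
```
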
by Angel Menendez
Temporary solution that uses the undocumented REST API to back up workflows to Google Drive. Please note that this workflow has issues: it does not support versioning, so it creates multiple copies of the workflows, and if you run it daily the folder will grow quickly. Once I figure out how to version in Google Drive, I'll update it here.
by Lorena
This workflow detects toxic language (such as profanity, insults, threats) in messages sent via Telegram. This blog tutorial explains how to configure the workflow nodes step-by-step. Telegram Trigger: triggers the workflow when a new message is sent in a Telegram chat. Google Perspective: analyzes the text of the message and returns a probability value between 0 and 1 of how likely it is that the content is toxic. IF: filters messages with a toxic probability value above 0.7. Telegram: sends a message in the chat with the text "I don't tolerate toxic language" if the probability value is above 0.7. NoOp: takes no action if the probability value is below 0.7.
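The IF-node decision above can be sketched as follows, assuming Perspective returns a toxicity probability between 0 and 1 (the exact response field name depends on the node's output and is not shown here).

```javascript
// Threshold check mirroring the IF node: scores above 0.7 trigger a reply,
// everything else falls through to the NoOp branch.
const THRESHOLD = 0.7;

function replyFor(toxicityScore) {
  return toxicityScore > THRESHOLD
    ? "I don't tolerate toxic language" // Telegram branch
    : null;                             // NoOp branch
}
```
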
by Harshil Agrawal
This workflow demonstrates how you can use Redis to add rate limiting to your API. The workflow uses the incoming API key to uniquely identify the user and uses it as a key in Redis. Every time a request is made, the value is incremented by one, and the IF node checks it against the threshold. Duplicate the following Airtable base to try out the workflow: https://airtable.com/shraudfG9XAvqkBpF
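The core logic can be sketched like this. A real deployment would issue Redis INCR (and typically EXPIRE, to reset the window); the in-memory Map below only stands in for Redis so the counting logic is visible, and the limit of 5 is an arbitrary example.

```javascript
const LIMIT = 5;           // max requests per window (example value)
const store = new Map();   // stand-in for Redis: apiKey -> request count

function allowRequest(apiKey) {
  const count = (store.get(apiKey) || 0) + 1; // Redis equivalent: INCR apiKey
  store.set(apiKey, count);
  return count <= LIMIT;   // the IF node's threshold check
}
```
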
by rpshu
-- Disclaimer: This template is mainly made for self-hosted users who can reach CSV files in their file system. Cloud users can replace the first few nodes with their file storage of choice, like Google Drive or Dropbox --

How to automatically import CSV files into Postgres

1. Project description
This workflow demonstrates how a CSV file can be automatically imported into an existing PostgreSQL database. Before running the workflow, please make sure the file /tmp/t1.csv exists on the server. The name of the test database is db01 (you can replace it). Then create table t1:

create table t1(id int, name varchar(10));

The content of the file is the following:

| id | name |
|----|------|
| 1  | a    |
| 2  | b    |
| 3  | c    |

2. Other
If you want to import a custom CSV file, use the following steps.

2.1. Create a table in the database. SQL commands: https://www.postgresql.org/docs/current/sql-createtable.html

2.2. Upload the CSV file to the n8n server and make sure it can be read.
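As a sketch of the intermediate step, the CSV content can be turned into row objects that a Postgres insert step consumes. This assumes a plain comma-separated file with a header row matching the t1 columns; it does not handle quoted fields.

```javascript
// Convert raw CSV text into one object per row, keyed by the header columns.
function csvToRows(csv) {
  const [header, ...lines] = csv.trim().split("\n");
  const cols = header.split(",");
  return lines.map(line => {
    const values = line.split(",");
    return Object.fromEntries(cols.map((c, i) => [c, values[i]]));
  });
}
// csvToRows("id,name\n1,a\n2,b") -> [{ id: "1", name: "a" }, { id: "2", name: "b" }]
```
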
by Lucas Walter
AI News Scraping System

This n8n workflow automates pulling in breaking AI-related headlines from curated RSS feeds, scraping their full content, and saving readable Markdown versions directly to Google Drive.

Use cases include:
- Creating a personal newsletter curation system
- Automating blog post research workflows
- Archiving news content for later summarization or AI use

How it Works
1. Scheduled Triggers: The workflow runs every 3–4 hours using multiple Schedule Trigger nodes. Each trigger targets a different news source (e.g., Google News, OpenAI Blog, Hugging Face).
2. Fetch and Parse Feeds: RSS feeds are fetched via the HTTP Request node. Items from the feed are split into individual entries using the Split Out node.
3. Scrape Article Content: Each article URL is sent to the Firecrawl API with a prompt to extract only the main content in Markdown. The scraping skips navigation, headers, footers, and ads.
4. Convert and Save: The extracted Markdown is converted into a .md file using the Convert to File node. The file is then uploaded to a Google Drive folder.

Good to Know
- This workflow uses the Firecrawl API for web scraping. Be sure to configure a Generic HTTP Header credential with your API key.
- Output files are saved in Markdown format.
- You can add more Schedule Trigger + HTTP Request pairs to extend this workflow to additional feeds.
Requirements
- Firecrawl API account for scraping
- Google Drive account (OAuth2 credentials must be configured in n8n)
- n8n instance (self-hosted or cloud)

Customization Ideas
- Replace or extend RSS feeds with sources relevant to your niche
- Load scraped news stories into a prompt to create new content like TikToks and Reels
- Add a summarization step using an LLM like GPT or Claude
- Send the Markdown files to Notion, Slack, or a blog CMS

Example Feeds

| Feed Name        | URL                                               |
|------------------|---------------------------------------------------|
| Google News (AI) | https://rss.app/feeds/v1.1/AkOariu1C7YyUUMv.json  |
| OpenAI Blog      | https://rss.app/feeds/v1.1/xNVg2hbY14Z7Gpva.json  |
| Hugging Face     | https://rss.app/feeds/v1.1/sgHcE2ehHQMTWhrL.json  |
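The HTTP Request node's call to Firecrawl can be sketched as a request builder. The endpoint path and body fields follow Firecrawl's v1 scrape API as I understand it, but should be checked against their current documentation; treat the field names as assumptions.

```javascript
// Build the scrape request the HTTP Request node would send for one article.
function buildScrapeRequest(articleUrl, apiKey) {
  return {
    method: "POST",
    url: "https://api.firecrawl.dev/v1/scrape",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      url: articleUrl,
      formats: ["markdown"],
      onlyMainContent: true, // skip navigation, headers, footers, and ads
    }),
  };
}
```
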
by jason
This is the workflow that I presented at the April 9, 2021 n8n Meetup. This workflow uses Baserow.io to store registration information, with n8n acting as both the web server and the data processor. To get this workflow working properly, you will either need to run it on n8n.cloud or use the on-premise version with n8n reachable externally. You will need an account on Baserow.io to store your subscriptions, with a table containing the following fields: GUID, First Name, Last Name, Email, Confirmed.
by Eska
Deadlock Match Stats Bot is an automated workflow for n8n designed to send detailed player statistics from the most recent Deadlock match directly to Telegram. When the user sends the /match command to the Telegram bot, the workflow performs the following steps: Loads the HTML content of the player's profile page from deadlocktracker.gg using a preconfigured Steam ID. Extracts the most recent match ID using a regular expression from the embedded JavaScript data. Loads the HTML page for the specified match. Parses the match page using cheerio to extract relevant data for each player, including their nickname, selected hero, and current rank. Formats the collected information into a single message and sends it to the Telegram chat that issued the command.
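The match-ID extraction step can be sketched as below. The exact shape of the JavaScript data embedded in deadlocktracker.gg's page is an assumption (shown here as a `"match_id"` key); adjust the regular expression to the real markup.

```javascript
// Pull the most recent match ID out of the profile page's embedded JS data.
function extractLatestMatchId(html) {
  const match = /"match_id"\s*:\s*(\d+)/.exec(html);
  return match ? match[1] : null;
}
```
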
by bangank36
This workflow automates the Mark as Fulfilled action in Squarespace for each order, ensuring a seamless fulfillment process without manual intervention.

How It Works
This workflow retrieves all pending Squarespace orders and processes their fulfillment automatically. The workflow follows these steps:
1️⃣ Get all pending orders using the HTTP Request node (since Squarespace does not have an n8n node)
2️⃣ Create a fulfillment request using the Fulfill Order node
The Filter Orders node can be used to filter which pending orders to process.

Step-by-step
The workflow can be run on demand or on a schedule. You can adjust these parameters within the Global and Filter nodes:

Global node (API settings)
- api-version (string, required) – The current API version (see the Squarespace Orders API documentation).
- modifiedAfter={a-datetime} (string, conditional) – Fetch orders modified after a specific date (ISO 8601 format).
- modifiedBefore={b-datetime} (string, conditional) – Fetch orders modified before a specific date (ISO 8601 format).
- cursor={c} (string, conditional) – Used for pagination; cannot be combined with other filters.
- fulfillmentStatus={status} (enum, optional) – Filter by fulfillment status: PENDING, FULFILLED, or CANCELED.
- maxPage – Set to -1 to enable infinite pagination and fetch all available orders.

Filter Orders node
Order filtering ensures only valid orders are fulfilled, particularly useful if:
- You sell digital downloads or gift cards exclusively.
- You use third-party fulfillment services for all products.

Requirements
Credentials: to use this workflow, you need a Squarespace API Key, retrieved from your Squarespace settings.

Who Is This For?
Squarespace store owners looking to automate their fulfillment process, and merchants selling digital or personalized products who need instant fulfillment.
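The cursor-based pagination described above can be sketched as follows. `fetchPage` stands in for the HTTP Request node, and the response shape (`result` array plus `pagination.nextPageCursor`) follows the Squarespace Orders API as I understand it; verify against the current docs.

```javascript
// Follow nextPageCursor until the pages run out, or until maxPage pages
// have been fetched. maxPage = -1 means "fetch everything".
async function getAllPendingOrders(fetchPage, maxPage = -1) {
  const orders = [];
  let cursor;
  let page = 0;
  do {
    const res = await fetchPage(cursor); // { result: [...], pagination: { nextPageCursor } }
    orders.push(...res.result);
    cursor = res.pagination && res.pagination.nextPageCursor;
    page += 1;
  } while (cursor && (maxPage === -1 || page < maxPage));
  return orders;
}
```
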
Explore More Templates
- Get all orders in Squarespace to Google Sheets
- Convert Squarespace Profiles to Shopify Customers in Google Sheets
- Fetch Squarespace Blog & Event Collections to Google Sheets
👉 Check out my other n8n templates
by Eric
This is a specific use case. The ElevenLabs guide for Cal.com bookings is comprehensive, but I was having trouble with the booking API request, so I built a simple workflow to validate the request and handle the booking creation.

Who's this for?
You have an ElevenLabs voice agent (or other external service) booking meetings in your Cal.com account, and you want more control over the book_meeting tool called by the voice agent.

How's it work?
A request is received by the Webhook trigger node, sent from the ElevenLabs voice agent or another source. The request body contains contact info for the user with whom a meeting will be booked in Cal.com. The workflow validates the input data against the fields required by Cal.com. If validation fails, a 400 Bad Request response is returned; if valid, the meeting is booked via the Cal.com API.

How do I use this?
Create a custom tool in the ElevenLabs agent setup, and connect it to the webhook trigger in this workflow. Add authorization for security. Instruct your voice agent to call this tool after it has collected the required information from the user.

Expected input structure
Note: Modify this according to your needs, but be sure to reflect your changes in all following nodes. What is required here depends on the required fields of your Cal.com event type. If you have multiple event types in Cal.com with varying required fields, you'll need to handle that in this workflow and provide appropriate instructions in your voice agent prompt.

"body": {
  "attendee_name": "Some Guy",
  "start": "2025-07-07T13:30:00Z",
  "attendee_phone": "+12125551234",
  "attendee_timezone": "America/New_York",
  "eventTypeId": 123456,
  "attendee_email": "someguy@example.com",
  "attendee_company": "Example Inc",
  "notes": "Discovery call to find synergies."
}

Modifications
Note: ElevenLabs doesn't handle webhook response headers or body, and only recognizes the response code.
In other words, if the workflow responds with 400 Bad Request, that's the only info the voice agent gets back; it doesn't get any details, e.g. "User email still needed". You can modify the structure of the expected webhook request body, and then you should reflect that structure change in all following nodes in the workflow. I.e., if you change attendee_name to attendeeFirstName and attendeeLastName, then you need to make this change in the following nodes that use these properties. You can also require, or make optional, other user data for the Cal.com event type, which would reduce or increase the data the voice agent must collect from the user. You can modify the authorization of this webhook to meet your security needs. ElevenLabs has some limitations you should be mindful of, but it also offers a secret feature which proves useful. An improvement to this workflow could include a GET request to a CRM or other database to get info on the user interacting with the voice agent. This could reduce some of the data collection needed from the voice agent, for example if you already have the user's email address. I believe you can also get the user's phone number if the voice agent is set up on a dial-in interface, so the agent wouldn't need to ask for it. This all depends on your use case. A savvy step might be prompting the voice agent to get an email, and using the email in this workflow to pull enrichment data from Apollo.io or similar ;-)
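The validation step can be sketched like this. The required-field list mirrors the example body above; which fields are actually mandatory depends on your Cal.com event type, so treat this list as an assumption to adjust.

```javascript
// Check the webhook body for required fields before attempting the booking.
// Returns the response the webhook should send back.
const REQUIRED = [
  "attendee_name",
  "start",
  "attendee_timezone",
  "eventTypeId",
  "attendee_email",
];

function validateBooking(body) {
  const missing = REQUIRED.filter(f => body[f] === undefined || body[f] === "");
  if (missing.length > 0) {
    return { statusCode: 400, error: `Missing required fields: ${missing.join(", ")}` };
  }
  return { statusCode: 200 };
}
```

Remember that ElevenLabs only sees the status code, so the error message mainly helps you while debugging.
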
by Jonathan
This workflow checks a Google Calendar at 8am on the first of each month to get anything that has been marked as a Holiday or Illness. It then merges the count for each person and sends an email with the list. To use this workflow you will need to set the credentials to use for the Google Calendar node and Send Email node. You will also need to select the calendar ID and fill out the information in the send email node. This workflow searches for Events that contain "Holiday" or "Illness" in the summary. If you want to change this you can modify it in the Switch node.
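The merge-and-count step can be sketched as below. The `person` field on each event is a placeholder for however your calendar identifies the attendee; only the summary matching ("Holiday" or "Illness") comes from the workflow description.

```javascript
// Tally Holiday/Illness events per person; other events are ignored.
function countBySummary(events) {
  const counts = {};
  for (const ev of events) {
    const m = /(Holiday|Illness)/.exec(ev.summary);
    if (!m) continue;
    const key = `${ev.person} - ${m[1]}`;
    counts[key] = (counts[key] || 0) + 1;
  }
  return counts;
}
```
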
by Baptiste Fort
Still reminding people about their tasks manually every morning? Let's be honest: who wants to start the day chasing teammates about what they need to do? What if Slack could do it for you, automatically, at 9 a.m. every day, without missing anything and without you lifting a finger? In this tutorial, you'll build a simple automation with n8n that checks Airtable for active tasks and sends reminders in Slack, daily.

Here's the flow you'll build:
Schedule Trigger → Search Records (Airtable) → Send Message (Slack)

STEP 1: Set up your Airtable base
Create a new base called Tasks, then add a table (for example: Projects, To-Do, or anything relevant) with the following fields:

| Field    | Type              | Example                     |
|----------|-------------------|-----------------------------|
| Title    | Text              | Finalize quote for Client A |
| Assignee | Text              | Baptiste Fort               |
| Email    | Email             | claire@email.com            |
| Status   | Single select     | In Progress / Done          |
| Due Date | Date (dd/mm/yyyy) | 05/07/2025                  |

Add a few sample tasks with the status In Progress so you can test your workflow later.

STEP 2: Create the trigger in n8n
In n8n, add a Schedule Trigger node and set it to run every day at 9:00 a.m.:
- Trigger interval: Days
- Days Between Triggers: 1
- Trigger at hour: 9
- Trigger at minute: 0
This is the node that kicks off the workflow every morning.

STEP 3: Search for active tasks in Airtable
This step is all about connecting n8n to your Airtable base and pulling the tasks that are still marked as "In Progress".

1. Add the Airtable node
In your n8n workflow, add the node Airtable → Search Records. You can find it by typing "airtable" in the node search.

2. Create your Airtable Personal Access Token
If you haven't already created your Airtable token, here's how:
🔗 Go to: https://airtable.com/create/tokens
Then:
- Name your token something like TASKS
- Under Scopes, check ✅ data.records:read
- Under Access, select only the base you want to use (e.g. Tasks)
- Click "Save token" and copy the personal access token

3. Set up the Airtable credentials in n8n
In the Airtable node:
- Click on the Credentials field and select Airtable Personal Access Token
- Click Create New and paste your token
- Give it a name like My Airtable Token and click Save

4. Configure the node
Now fill in the parameters:
- Base: Tasks
- Table: your task table (e.g. To-Do)
- Operation: Search
- Filter By Formula: {Status} = "In Progress"
- Return All: ✅ Yes (make sure it's enabled)
- Output Format: Simple

5. Test the node
Click "Execute Node". You should now see all tasks with Status = "In Progress" in the output (on the right-hand side of your screen).

STEP 4: Send each task to Slack
Now that we've fetched all the active tasks from Airtable, let's send them to Slack, one by one.

Add the Slack node
Drag a new node into your n8n workflow and select Slack → Message. Name it something like Send Slack Message. You can find it quickly by typing "Slack" into the node search bar.

Connect your Slack account
If you haven't already connected your Slack credentials:
- Go to n8n → Credentials
- Select Slack API and click Create new
- Paste your Slack Bot Token (from your Slack app's OAuth settings)
- Give it a clear name like Slack Bot n8n, choose the workspace, and save
Then, in the Slack node, choose this credential from the dropdown.

Configure the message
Set these parameters:
- Operation: Send
- Send Message To: Channel
- Channel: your Slack channel (e.g. #tous-n8n)
- Message Type: Simple Text Message

Message template
Paste the following inside the Message Text field:

New task for {{ $json["Assignee"] }}: {{ $json["Title"] }}
👉 Deadline: {{ $json["Due Date"] }}

Example output:
New task for Jeremy: Follow up with supplier X
👉 Deadline: 2025-07-04

Test it
Click Execute Node to verify the message is correctly sent in Slack.
If the formatting works, you’re ready to run it on schedule 🚀
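For reference, the per-task message built by the Slack node is equivalent to this small formatting function, using the English field names from Step 1.

```javascript
// Build the Slack reminder text for one Airtable task record.
function taskMessage(task) {
  return `New task for ${task["Assignee"]}: ${task["Title"]}\n👉 Deadline: ${task["Due Date"]}`;
}
```
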