by Angel Menendez
Who's it for

This workflow is ideal for AI developers running multi-agent systems in n8n who need to quantitatively evaluate tool usage behavior. If you're building autonomous agents and want to verify their decisions against ground-truth expectations, this workflow gives you plug-and-play observability.

What it does

This template uses n8n's built-in Evaluation Trigger and Evaluation nodes to assess whether an AI agent correctly used all the expected tools. It supports:

- Dataset-driven testing of agent behavior
- Logging the tools that were actually called and comparing them with the expected tools
- Assigning performance metrics (tool_called = true/false)
- Persisting output back to Google Sheets for further debugging

The workflow can be triggered by either the chat input or the dataset row evaluation. It routes through a multi-tool agent node powered by the LLM of your choice. The agent has access to tools such as web search, calculator, vector search, and summarizer tools. The workflow then validates tool-use decisions by extracting the intermediate steps from the agent (i.e., action + observation) and comparing the tools that were called with the expected tools (see the sketch at the end of this section). If the tools called during the workflow execution match the expected set, the row is a pass; otherwise, it is documented as a fail. The Evaluation nodes take care of that process.

How to set it up

1. Connect your Google Sheets OAuth2 credential.
2. Replace the document with your own test dataset.
3. Set your desired models and configure the different agent tools, such as the summarizer and vector store. The default vector store is Qdrant, so you must create this vector store with a few sample queries + web search results.
4. Run from either the chat trigger or the evaluation trigger to test.

Requirements

- Google Sheets OAuth2 credential
- OpenRouter / OpenAI credentials for AI agents and embeddings
- Firecrawl and Qdrant credentials for web + vector search

How to customize

- Edit the Search Agent system message to define tool selection behavior
- Add more metric columns in the Evaluation node for complex scoring
- Add new tool nodes and link them to the agent block
- Swap in your own summarizer
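For reference, here is a minimal sketch of the pass/fail comparison described above, written as an n8n Code node ("Run Once for Each Item"). It assumes the agent ran with "Return Intermediate Steps" enabled and that each dataset row provides a comma-separated expected_tools column; that column name is illustrative, not fixed by the template.

```javascript
// Sketch only. `intermediateSteps` comes from the AI Agent's
// "Return Intermediate Steps" option; `expected_tools` is an assumed
// comma-separated dataset column.
const steps = $json.intermediateSteps ?? [];
const calledTools = [...new Set(steps.map((step) => step.action.tool))];

const expectedTools = ($json.expected_tools ?? '')
  .split(',')
  .map((tool) => tool.trim())
  .filter(Boolean);

// Pass only if every expected tool was actually called during the run
const toolCalled = expectedTools.every((tool) => calledTools.includes(tool));

return { json: { calledTools, expectedTools, tool_called: toolCalled } };
```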
by Automate With Marc
Automated Daily Firecrawl Scraper with Telegram Alerts

Get structured insights scraped daily from the web using Firecrawl's AI extraction engine, then send them directly to your Telegram chat.

What this workflow does:

This workflow automatically scrapes specific structured data from any webpage every day at a scheduled time using the Firecrawl API, checks if results are returned, and then sends the formatted results to Telegram. For step-by-step video tutorials of n8n builds, check out my channel: https://www.youtube.com/@Automatewithmarc

How It Works:

- Schedule Trigger (Daily at 6PM): Starts the workflow every day at a set time.
- Firecrawl POST Request: Sends a custom extraction prompt and schema to Firecrawl, targeting any list of URLs you provide (see the body sketch at the end of this description).
- 30 Seconds Wait: Waits to give Firecrawl enough time to complete processing.
- GET Firecrawl Result: Fetches the extraction results using the request ID.
- Loop with IF Node: Checks whether data is returned. If not, waits another 15 seconds and retries.
- Format & Clean (Set Node): Prepares and formats the extracted result into a readable message.
- Telegram Message Node: Delivers the structured data directly to your Telegram channel or group.

Requirements:

- Firecrawl API Key (Header Auth)
- Telegram Bot Token & Chat ID

Use Cases:

- Extract structured data (like product info or events) from niche websites
- Automate compliance monitoring or intelligence gathering
- Create market alert bots with real-time info delivery

Customization Ideas:

- Swap Telegram with Gmail, Discord, or Slack
- Expand the schema to include more complex nested fields
- Add a Google Sheets node to log daily scraped data
- Integrate with a summarizer or language model for intelligent summaries

Ready to automate your web intelligence gathering? Let Firecrawl do the scraping, and let this workflow do the rest.
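The POST request body combines your target URLs, an extraction prompt, and a JSON schema. A minimal sketch, assuming Firecrawl's asynchronous extract endpoint; the URL, prompt, and schema fields below are placeholders to adapt to your own pages:

```json
{
  "urls": ["https://example.com/deals"],
  "prompt": "Extract every product on the page with its name, price, and availability.",
  "schema": {
    "type": "object",
    "properties": {
      "products": {
        "type": "array",
        "items": {
          "type": "object",
          "properties": {
            "name": { "type": "string" },
            "price": { "type": "string" },
            "available": { "type": "boolean" }
          }
        }
      }
    }
  }
}
```

Firecrawl responds with a request ID, which the GET node then uses to poll until the extraction is complete.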
by Sebastien
How to use

- Get a .csv file with your contacts (you can download this from any contact manager app)
- Set API keys for the Google Drive API and Notion (you need to create a "connection" in Notion)
- Create a database for your contacts in Notion
- Choose which properties to extract from the .csv and pass them into the Notion database. Right now, it transfers four pieces of information: full name, email, phone, and company.
by dev
Every 10 minutes, this workflow looks at the published news in your Tiny Tiny RSS public feed and makes a toot on your Mastodon instance. You'll need:

- Your Mastodon instance URL
- Your Mastodon access token
- Your Tiny Tiny RSS public published feed URL
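Under the hood, the toot is a single call to Mastodon's documented statuses endpoint. A minimal sketch of what the HTTP Request node sends, where INSTANCE_URL, ACCESS_TOKEN, and the status text are placeholders for your own values:

```javascript
// POST /api/v1/statuses is Mastodon's documented endpoint for creating a toot.
// INSTANCE_URL and ACCESS_TOKEN are placeholders for your instance and token.
const response = await fetch(`${INSTANCE_URL}/api/v1/statuses`, {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${ACCESS_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ status: 'New article: https://example.com/post' }),
});
console.log(await response.json()); // the created status object
```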
by sthosstudio
This workflow automatically tracks changes on specific websites, typically in e-commerce where you want to get information about price changes.

Prerequisites

Basic knowledge of HTML and JavaScript

Nodes

Execute Command nodes create a file named kopacky.json in the /data/ folder (make sure that n8n has the permissions to make changes to the folder in your setup) and clean data.

Cron node triggers the workflow at regular intervals (the default is 15 minutes), depending on how often you want to crawl the URLs of your watchers.

Function Item node (Change me) adds the URL watchers. You can add as many URLs (watchers) as you want by changing the JavaScript code in the node (see the sketch at the end of this description). There are four properties for each watcher:

|Property|Meaning|
|-|-|
|slug|Unique identifier for the watcher.|
|link|URL of the website where you want to track changes.|
|selector|CSS selector of the HTML tag where your price is placed. You can use the browser's web tools to get a specific selector.|
|currency|Currency code in which your price is set.|

Function Item node (Init item) saves all required data from each watcher to the kopacky.json file.

HTTP Request node fetches data from the website.

HTML Extract node extracts the required information from the webpage.

Send Email nodes (NotifyBetterPrice) send you an email when there is an issue with getting the price, and when a better price is available (this could happen if the website is down, the product you are tracking is not available anymore, or the owner of the website changed the selector or HTML).

IF nodes filter the incoming data and route the workflow.

Move Binary Data nodes convert the JSON file to binary data.

Write Binary File nodes write the product prices to the file.

NOTE: This is the first (beta) version of this workflow, so it could have some issues. For example, there is an issue with getting the content of websites whose owners block calls from unknown foreign services - this is typical protection against crawlers.
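Here is a minimal sketch of what the watcher definition in the Change me node can look like, written as a generic Function-style body; the slug, URL, and selector values are placeholders:

```javascript
// One watcher per item; slug/link/selector/currency are the four properties
// described in the table above. All values here are placeholders.
const watchers = [
  {
    slug: 'predator-boots',
    link: 'https://example-shop.com/predator-boots',
    selector: '#product-price .amount',
    currency: 'EUR',
  },
  // Add more watchers here
];

return watchers.map((watcher) => ({ json: watcher }));
```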
by Harshil Agrawal
This workflow demonstrates the use of static data in n8n. The workflow is built on the concept of polling.

Cron node: The Cron node triggers the workflow every minute. You can configure the time based on your use case.

HTTP Request node: This node makes an HTTP request to an API that returns the position of the ISS.

Set node: In the Set node we set the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude, we set these in the node. If you need other information, you can set it in this node.

Function node: The Function node checks whether the incoming data is the same as the data returned in the previous execution. If the data is different, the Function node returns the new data; otherwise, it returns the message 'No New Items'. The data is also stored as static data with the workflow (see the sketch at the end of this description).

Based on your use case, you can build the workflow further. For example, you can use it to send updates to Mattermost or Slack.
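getWorkflowStaticData is n8n's documented API for persisting small values between executions. A minimal sketch of the comparison step, written for the classic Function node; latitude, longitude, and timestamp are the fields kept by the Set node above:

```javascript
// Compare the current ISS position with the one stored during the last run.
const staticData = getWorkflowStaticData('global');
const current = items[0].json;

const unchanged =
  staticData.lastPosition &&
  staticData.lastPosition.latitude === current.latitude &&
  staticData.lastPosition.longitude === current.longitude;

if (unchanged) {
  return [{ json: { message: 'No New Items' } }];
}

// Remember the new position for the next poll and pass the item on
staticData.lastPosition = current;
return items;
```

Note that n8n only persists static data when the workflow runs from an active trigger; manual test executions do not save it.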
by Jonathan
Task: Make sure that data is in the right format before injecting it into a database, spreadsheet, CRM, etc.

Why: Spreadsheets and databases require the incoming data to have the same fields as the headers of the destination table. You can decide which fields you would like to send to the database and rename them by using the Set node (see the sketch below).

Main use cases:

- Change field names to match a database or spreadsheet table structure
- Keep only the fields that are needed in the destination table
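For illustration, the same selection and renaming can be done in a Code node; firstName/email on the input side and first_name/email_address on the output side are placeholder field names:

```javascript
// Equivalent of the Set node: keep only the fields the destination table
// expects and rename them to match its headers (placeholder names).
return $input.all().map((item) => ({
  json: {
    first_name: item.json.firstName,
    email_address: item.json.email,
  },
}));
```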
by Eduard
This workflow demonstrates how a CSV file can be automatically imported into an existing MySQL database. Before running the workflow, please make sure you have a file on the server at /home/node/.n8n/concerts-2023.csv with the following content:

Date,Band,ConcertName,Country,City,Location,LocationAddress,
2023-05-28,Ozzy Osbourne,No More Tours 2 - Special Guest: Judas Priest,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-05-08,Elton John,Farewell Yellow Brick Road Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-05-26,Hans Zimmer Live,Europe Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-07-07,Depeche Mode,Memento Mori World Tour 2023,Germany,Berlin,Olympiastadion Berlin,"Olympischer Platz 3, 14053 Berlin-Charlottenburg",

The detailed process is explained in the tutorial: https://blog.n8n.io/import-csv-into-mysql
by n8n Team
This workflow automatically downloads a CSV from the web and parses it into a format that n8n can access. It then ensures that the data from the CSV is matched to the names of the columns in the database, and inserts this data as new rows in Snowflake.

Prerequisites:

- A CSV with data
- A Snowflake account and credentials
- A Snowflake database to upload your CSV to

Nodes:

- An HTTP Request node to download the CSV file
- A Spreadsheet File node to access the data from the CSV
- A Set node to ensure the data from the CSV is mapped to the column names of the Snowflake database
- A Snowflake node to insert these new rows into the database
by Eduard
This workflow demonstrates how easy it is to export an SQL query result to CSV automatically! Before running the workflow, please make sure you have access to a local or remote MSSQL server with a sample AdventureWorks database. The detailed process is explained in the tutorial: https://blog.n8n.io/sql-export-to-csv/
by Tom
This workflow is the opposite of this one. It transforms multiple items, each with one binary object named data, into a single item with multiple binary objects. This can be useful when creating a single .zip archive, for example. It uses the updated Code node instead of the older Function node.
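A minimal sketch of the Code node body ("Run Once for All Items"); the data_0, data_1, ... property names are one possible naming convention, not required:

```javascript
// Merge every incoming item's `data` binary into a single item that carries
// one binary property per source item (data_0, data_1, ...).
const merged = { json: {}, binary: {} };

for (const [index, item] of $input.all().entries()) {
  merged.binary[`data_${index}`] = item.binary.data;
}

return [merged];
```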
by n8n Team
This workflow template shows how to load JSON data into a workflow and push that data into an app or convert it into a spreadsheet file. Specifically, this workflow shows how to make a generic API request that returns JSON. It then shows how to load that data into a Google Sheets spreadsheet, or convert it to the .csv file format. However, you can use the general pattern to load data into any app or convert to any spreadsheet file format (such as .xlsx).