by Airtop
About this AI Agent This workflow is designed to automate web interactions by simulating a human user, using a combination of the Agent node and AI tools powered by Airtop. How does this workflow work? Form Submission Trigger: The workflow starts with a form submission trigger node named "On form submission". This node collects user instructions for the web AI agent, including a prompt and an optional Airtop profile name for sites requiring authentication. AI Agent: The core of the workflow is the "AI Agent" node, which uses a smart web agent to manage a remote web browser. It is designed to fulfill user requests by interacting with the browser through various tools. Browser Session Management Start Browser: The "Start browser" node initiates a new browser session and window. It is essential for obtaining the sessionId and windowId required for subsequent operations. Session and Window Management: The workflow includes nodes for creating and managing browser sessions and windows, such as "Session" and "Window". Web Interaction Tools: Load URL: This node loads a specified URL into the browser window. Query: The "Query" node allows the agent to ask questions and extract information from the current web page. Click: This node simulates clicking on elements within the web page. Type: The "Type" node types text into specified elements on the page. Session Termination: The "End session" node is used to terminate the browser session once the tasks are completed. Output Handling Structured Output Parser: This node processes the agent's results into a structured format. Output: The final results are set and prepared for output. Slack Integration: Although currently disabled, there is a "Slack" node intended to send messages to a Slack channel, potentially for notifications or live view URLs. Setting up your agent Airtop API Credentials: Users must have valid Airtop API credentials to interact with the web browser tools. 
This includes nodes like "Click", "Query", "Load URL", "End session", "Type", "Session", and "Window". Slack API Credentials (Optional): If users want to enable Slack notifications, they need to configure Slack OAuth2 credentials. The Slack node is currently disabled but can be used to send messages to a Slack channel. Anthropic API Credentials: The "Claude 3.5 Haiku" node requires Anthropic API credentials. Users need to have access to this API to utilize the language model features.
by ConvertAPI
Who is this for? For developers and organizations that need to convert XLSX files to PDF. What problem is this workflow solving? The file format conversion problem. What this workflow does Downloads the XLSX file from the web. Converts the XLSX file to PDF. Stores the PDF file in the local file system. How to customize this workflow to your needs Open the HTTP Request node. Adjust the URL parameter (all endpoints can be found here). Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret. Optionally, additional Body Parameters can be added for the converter.
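As a rough illustration of what the HTTP Request node sends, here is a minimal sketch in Python. The endpoint path and the `Secret` query parameter follow ConvertAPI's documented URL pattern, but the exact parameter names should be verified against your account's documentation; the function only assembles the request and does not send it.

```python
# Sketch of the HTTP Request node's configuration, expressed as a dict.
# Assumption: ConvertAPI's "convert/xlsx/to/pdf" endpoint with query-auth
# via a "Secret" parameter; double-check names against the ConvertAPI docs.

def build_convertapi_request(source_url: str, secret: str) -> dict:
    """Assemble URL, auth, and body for an XLSX -> PDF conversion."""
    return {
        "url": "https://v2.convertapi.com/convert/xlsx/to/pdf",
        "params": {"Secret": secret},  # the Query Auth secret
        "json": {
            "Parameters": [
                # the source file is passed by URL, as in the workflow
                {"Name": "File", "FileValue": {"Url": source_url}},
            ]
        },
    }

req = build_convertapi_request("https://example.com/report.xlsx", "my-secret")
```

In the workflow itself these values live in the HTTP Request node's URL, Query Auth, and Body Parameters fields rather than in code.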
by AppStoneLab Technologies LLP
Automated AI Research Assistant: From Query to Polished Report with Jina & Gemini Turn a single research question into a comprehensive, multi-source report with proper citations. This workflow automates the entire research process by leveraging the web-crawling power of Jina AI and the advanced reasoning capabilities of Google's Gemini models. Simply input your query, and this AI-powered assembly line will search the web, scrape relevant sources, summarize the content, draft a structured research paper, and finally, evaluate and polish the report for accuracy and formatting. ✨ Key Features 🔎 **Dynamic Web Search**: Kicks off by searching the web with Jina AI based on your initial query. 📚 **Multi-Source Content Scraping**: Automatically reads and extracts content from the top 10 search results. 🧠 **AI-Powered Summarization**: Uses a Gemini agent to intelligently summarize each webpage, retaining the core information. ✍️ **Automated Report Generation**: A specialized "Generator Agent" synthesizes the summarized data into a structured research paper, complete with an executive summary, introduction, discussion, and conclusion. ✅ **Citation & Quality Verification**: A final "Evaluator Agent" meticulously checks the generated report for citation accuracy, logical flow, and markdown formatting, delivering a polished final document. 📈 **Rate-Limit Ready**: Includes a configurable Wait node to ensure stable execution when dealing with multiple API calls. 📝 What This Workflow Does This workflow is designed to be your personal research assistant. It addresses the time-consuming process of gathering, reading, and synthesizing information from multiple online sources. Instead of spending hours manually searching, reading, and citing, you can delegate the entire task to this workflow and receive a well-structured and cited report as the final output. It's perfect for students, researchers, content creators, and analysts who need to quickly compile information on any given topic. 
⚙️ How It Works (Step-by-Step) Initiate with a Query: The workflow starts when you send your research question or topic to the Chat Trigger node. Search the Web: The user's query is passed to the Jina AI node, which performs a web search and returns the top 10 most relevant URLs. Scrape, Summarize, Repeat: The workflow then loops through each URL: Read Content: The Jina AI node scrapes the full text content from the URL. Summarize: A Summarizer Agent powered by Google Gemini reads the scraped content and the original user query, then generates a concise summary. Wait: A one-second pause helps to avoid hitting API rate limits before processing the next URL. Aggregate the Knowledge: Once the loop is complete, a Code node gathers all 10 individual summaries into a single, neatly structured list. Draft the Research Report: This aggregated data is fed to the Generator Agent. Following a detailed prompt, this Gemini-powered agent writes a full research report, structuring it with headings and adding inline citations for every piece of information it uses. Evaluate and Finalize: The generated draft is passed to the final Evaluator Chain. This agent acts as a quality control supervisor. It verifies that all claims are correctly cited, refines the content for clarity and academic tone, and polishes the markdown formatting to produce the final, ready-to-use report. 🚀 How to Use This Workflow Credentials: Click on Use template, then configure your credentials for the following nodes: Jina AI: You will need a Jina AI API key for the Search web and Read URL content nodes. Get your key from here: JinaAI API Key Google Gemini: You will need a Google Gemini API key for the Summarizer Model, Generator Model, and Evaluator Model nodes. Get your key from here: Gemini API Key Activate Workflow: Make sure the workflow is active in your n8n instance. Start Research: Send a chat message with your research topic to the webhook URL provided in the When chat message received node. 
Get Your Report: Check the output of the final node, Evaluator Chain, to find your completed and polished research report. Nodes Used Chat Trigger Jina AI Code (Python) Split in Batches (Looping) Wait AI Agent Basic LLM Chain Google Gemini Chat Model
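The "Aggregate the Knowledge" step above could look roughly like the following in the workflow's Code (Python) node. This is a sketch, not the template's actual node code, and the field names (`url`, `summary`) are assumptions about the item shape.

```python
# Hypothetical version of the aggregation Code node: collect the per-URL
# summaries into one numbered source list the Generator Agent can cite
# as [1], [2], ... Field names are illustrative assumptions.

def aggregate_summaries(items: list[dict]) -> dict:
    """Merge individual summaries into a single structured payload."""
    sources = []
    for i, item in enumerate(items, start=1):
        sources.append({
            "id": i,                          # citation number
            "url": item["url"],               # where the summary came from
            "summary": item["summary"].strip(),
        })
    return {"source_count": len(sources), "sources": sources}

result = aggregate_summaries([
    {"url": "https://example.com/a", "summary": " Key finding A. "},
    {"url": "https://example.com/b", "summary": "Key finding B."},
])
```

Numbering the sources here is what lets the Generator Agent attach a stable inline citation to every claim it writes.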
by Raz Hadas
Stay ahead of the market with this powerful, automated workflow that performs real-time sentiment analysis on stock market news. By leveraging the advanced capabilities of Google Gemini, this solution provides you with actionable insights to make informed investment decisions. This workflow is designed for investors, traders, and financial analysts who want to automate the process of monitoring news and gauging market sentiment for specific stocks. It seamlessly integrates with Google Sheets for input and output, making it easy to track a portfolio of stocks. Key Features & Benefits Automated Daily Analysis: The workflow is triggered daily, providing you with fresh sentiment analysis just in time for the market open. Dynamic Stock Tracking: Easily manage your list of tracked stocks from a simple Google Sheet. AI-Powered Insights: Utilizes Google Gemini's sophisticated language model to analyze news content for its potential impact on stock prices, including a sentiment score and a detailed rationale. Comprehensive News Aggregation: Fetches the latest news articles from EODHD for each of your specified stock tickers. Error Handling & Validation: Includes built-in checks for invalid stock tickers and formats the AI output for reliable data logging. Centralized Reporting: Automatically logs the sentiment score, rationale, and date into a Google Sheet for easy tracking and historical analysis. How It Works This workflow follows a systematic process to deliver automated sentiment analysis: Scheduled Trigger: The workflow begins each day at a specified time. Fetch Stock Tickers: It reads a list of stock tickers from your designated Google Sheet. Loop and Fetch News: For each ticker, it retrieves the latest news articles using the EODHD API. AI Sentiment Analysis: The collected news articles are then passed to a Google Gemini-powered AI agent. 
The agent is prompted to act as a stock sentiment analyzer, evaluating the news and generating: A sentiment score from -1 (strong negative) to 1 (strong positive). A detailed rationale explaining the basis for the score. Data Formatting & Validation: The AI's output is parsed and validated to ensure it is in the correct JSON format. Log to Google Sheets: The final sentiment score and rationale are appended to your Google Sheet, alongside the corresponding stock ticker and the current date. Nodes Used Schedule Trigger Google Sheets SplitInBatches HttpRequest (EODHD) If Code (JavaScript) AI Agent (LangChain) Google Gemini Chat Model This workflow is a valuable tool for anyone looking to harness the power of AI for financial market analysis. Deploy this automated solution to save time, gain a competitive edge, and make more data-driven trading decisions.
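The "Data Formatting & Validation" step described above could be sketched as follows. This is an illustrative stand-in for the workflow's Code (JavaScript) node, written here in Python; the exact field names (`score`, `rationale`) are assumptions about the agent's output schema.

```python
import json

# Hypothetical validator for the AI agent's reply: enforce a JSON object
# with a score in [-1, 1] and a non-empty rationale before logging it.
def validate_sentiment(raw: str) -> dict:
    data = json.loads(raw)                     # raises on malformed JSON
    score = float(data["score"])
    if not -1.0 <= score <= 1.0:
        raise ValueError(f"score out of range: {score}")
    rationale = str(data["rationale"]).strip()
    if not rationale:
        raise ValueError("empty rationale")
    return {"score": score, "rationale": rationale}

row = validate_sentiment(
    '{"score": 0.6, "rationale": "Earnings beat expectations."}'
)
```

Rejecting out-of-range scores and empty rationales before the Google Sheets append keeps the historical log clean even when the model occasionally returns malformed output.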
by Harshil Agrawal
This workflow allows you to receive updates about the position of the ISS and add it to a table in TimescaleDB. Cron node: The Cron node triggers the workflow every minute. You can configure the time based on your use-case. HTTP Request node: This node makes an HTTP Request to an API that returns the position of the ISS. Based on your use-case you may want to fetch data from a different URL. Enter the URL in the URL field. Set node: In the Set node we set the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude we set this in the node. If you need other information, you can set them in this node. TimescaleDB node: This node stores the information in a table named iss. You can use a different table as well.
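The Set node's job can be sketched in a few lines of Python. The response shape below (string coordinates under an `iss_position` key) mirrors the common open-notify style of ISS APIs, but it is an assumption; adjust the keys to whatever API you point the HTTP Request node at.

```python
# Hypothetical equivalent of the Set node: keep only the three fields
# the TimescaleDB insert needs, coercing coordinates to floats.
def pick_iss_fields(payload: dict) -> dict:
    return {
        "timestamp": payload["timestamp"],
        "latitude": float(payload["iss_position"]["latitude"]),
        "longitude": float(payload["iss_position"]["longitude"]),
    }

row = pick_iss_fields({
    "timestamp": 1700000000,
    "iss_position": {"latitude": "47.61", "longitude": "-122.33"},
})
```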
by ConvertAPI
Who is this for? For developers and organizations that need to convert a web page to PDF. What problem is this workflow solving? The web-page-to-PDF conversion problem. What this workflow does Converts a web page to PDF. Stores the PDF file in the local file system. How to customize this workflow to your needs Open the HTTP Request node. Adjust the URL parameter (all endpoints can be found here). Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret. Change the parameter url to the web page you want to convert to PDF. Optionally, additional Body Parameters can be added for the converter.
by Miquel Colomer
Do you want to avoid communication problems when launching phone calls? This workflow verifies landline and mobile phone numbers using the uProc Get Parsed and validated phone tool with worldwide coverage. You need to add your credentials (your real Email and API Key, found in the Integration section) to n8n. Node "Create Phone Item" can be replaced by any other supported service with phone values, like databases (MySQL, Postgres), or Typeform. The "uProc" node returns the following fields for every parsed and validated phone number: country_prefix: contains the international country phone prefix number. country_code: contains the 2-digit ISO country code of the phone number. local_number: contains the phone number without international prefix. formatted: contains a formatted version of the phone number, according to the country detected. valid: detects if the phone number has a valid format and prefix. type: the phone number type (mobile, landline, or something else). The "If" node checks if the phone number is valid. You can use the result to mark invalid phone numbers in your database or discard them from future telemarketing campaigns.
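The branching the "If" node performs on the `valid` field can be sketched like this. The result dicts below are illustrative examples of the fields listed above, not actual uProc responses.

```python
# Hypothetical equivalent of the If node: partition uProc results by
# the "valid" flag so invalid numbers can be marked or discarded.
def partition_phones(results: list[dict]) -> tuple[list[dict], list[dict]]:
    valid = [r for r in results if r.get("valid")]
    invalid = [r for r in results if not r.get("valid")]
    return valid, invalid

ok, bad = partition_phones([
    {"formatted": "+1 415-555-0100", "valid": True, "type": "mobile"},
    {"formatted": "12345", "valid": False, "type": ""},
])
```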
by Jenny
Vector Database as a Big Data Analysis Tool for AI Agents Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images, and hence to many production use cases. Uploading (image) datasets to Qdrant Set up meta-variables for anomaly detection in Qdrant Anomaly detection tool KNN classifier tool For anomaly detection 1. This is the first pipeline to upload an image dataset to Qdrant. The second pipeline is to set up cluster (class) centres & cluster (class) threshold scores needed for anomaly detection. The third is the anomaly detection tool, which takes any image as input and uses all preparatory work done with Qdrant to detect if it's an anomaly to the uploaded dataset. For KNN (k nearest neighbours) classification 1. This is the first pipeline to upload an image dataset to Qdrant. The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant. To recreate both You'll have to upload the crops and lands datasets from Kaggle to your own Google Cloud Storage bucket, and re-create APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API & Google Cloud Storage. [This workflow] Batch Uploading Images Dataset to Qdrant This template imports dataset images from Google Cloud Storage, creates Voyage AI embeddings for them in batches, and uploads them to Qdrant, also in batches. In this particular template, we work with the crops dataset. However, it's analogous to uploading the lands dataset, and in general, it's adaptable to any dataset consisting of image URLs (as the following pipelines are). First, check for an existing Qdrant collection to use; otherwise, create it here. 
Additionally, when creating the collection, we'll create a payload index, which is required for a particular type of Qdrant request we will use later. Next, import all (dataset) images from Google Cloud Storage but keep only non-tomato-related ones (for anomaly detection testing). Create (per batch) embeddings for all imported images using the Voyage AI multimodal embeddings API. Finally, upload the resulting embeddings and image descriptors to Qdrant via batch upload.
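The batching used for both the embeddings calls and the Qdrant upload boils down to a simple chunking step, which might look like this (a sketch; batch size and URL pattern are placeholders, not values from the template):

```python
# Chunk image URLs so embeddings and Qdrant upserts are requested a
# batch at a time instead of one giant call or thousands of tiny ones.
def batch(items: list, size: int) -> list[list]:
    return [items[i:i + size] for i in range(0, len(items), size)]

# Hypothetical bucket layout for illustration only.
urls = [f"https://storage.googleapis.com/my-bucket/img_{i}.jpg" for i in range(10)]
batches = batch(urls, 4)
```

Each batch would then be sent to the Voyage AI embeddings endpoint, and the resulting vectors upserted into Qdrant together with their image URLs as payload.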
by darrell_tw
This workflow automates the process of fetching agricultural transaction data from the Taiwan Agricultural Products Open Data Platform and storing it in a Google Sheets document for further analysis. Key Features Manual Trigger: Allows manual execution of the workflow to control when data is fetched. HTTP Request: Sends a request to the Open Data Platform's API to retrieve detailed transaction data, including: Pricing (Upper, Middle, Lower, Average) Transaction quantities Crop and market details Split Out Node: Processes each record individually, ensuring accurate handling of every data entry. Google Sheets Integration: Appends the data into a structured Google Sheets document for easy access and analysis. Node Configurations 1. Manual Trigger **Purpose**: Start the workflow manually. **Configuration**: No setup needed. 2. HTTP Request **Purpose**: Fetch agricultural data. **Configuration**: URL: https://data.moa.gov.tw/api/v1/SheepQuotation Query Parameters: Start_time: 2024/12/01 End_time: 2024/12/31 MarketName: 台北二 api_key: <your_api_key> Headers: accept: application/json 3. Split Out **Purpose**: Split the API response data array into individual items. **Configuration**: Field to Split Out: Data 4. Google Sheets **Purpose**: Append the data to Google Sheets. **Configuration**: Operation: Append Document ID: <your_document_id> Sheet Name: Sheet1 Mapped Fields: TransDate, TcType, CropCode, CropName, MarketCode, MarketName Upper_Price, Middle_Price, Lower_Price, Avg_Price, Trans_Quantity
Tip: use the Curl Import feature to configure the HTTP Request node quickly, e.g. curl -X GET "https://data.moa.gov.tw/api/v1/AgriProductsTransType/?Start_time=114.01.01&End_time=114.01.01&MarketName=%E5%8F%B0%E5%8C%97%E4%BA%8C" -H "accept: application/json" See the Agricultural Open Data Platform documentation for details.
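What the Split Out node does to the API response can be sketched in a couple of lines. The sample records below are illustrative, not real quotation data.

```python
# Hypothetical equivalent of the Split Out node: turn the response's
# "Data" array into one item per transaction record.
def split_out(response: dict, field: str = "Data") -> list[dict]:
    return [dict(record) for record in response.get(field, [])]

items = split_out({
    "Data": [
        {"TransDate": "113.12.01", "MarketName": "台北二", "Avg_Price": 35.0},
        {"TransDate": "113.12.02", "MarketName": "台北二", "Avg_Price": 36.5},
    ]
})
```

Each resulting item is then mapped field-by-field onto a Google Sheets row by the append operation.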
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. This workflow automatically tracks individual sales rep performance—calls, emails, meetings, quota attainment—and surfaces coaching insights. Free your managers from manual report building and focus on improvement. Overview On a daily schedule, the workflow queries CRM and telephony/email logs, aggregating activity metrics per rep. OpenAI analyzes patterns, flags underperformance or standout achievements, and suggests tailored coaching tips. Results are delivered as a nicely formatted Slack message and stored in Airtable. Tools Used **n8n** – Automation backbone **CRM + Telephony APIs** – Provide activity data **OpenAI** – Generates insights and coaching tips **Slack** – Sends manager digest **Airtable** – Maintains historical performance records How to Install Import the Workflow into n8n. Connect Data Sources: Add CRM, VoIP, and email API keys. Set Up OpenAI: Enter your API key. Authorize Slack & Airtable. Customize Metrics: Modify the aggregation node to focus on your KPIs. Use Cases **Sales Coaching**: Provide reps with daily feedback. **Performance Management**: Quickly identify top and low performers. **Incentive Programs**: Track achievements for rewards. **Revenue Operations**: Unify activity data into one source. Connect with Me **Website**: https://www.nofluff.online **YouTube**: https://www.youtube.com/@YaronBeen/videos **LinkedIn**: https://www.linkedin.com/in/yaronbeen/ **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission) #n8n #automation #salesperformance #openai #salescoaching #n8nworkflow #nocode #revenueops
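One metric the aggregation node might compute is quota attainment; a minimal sketch (the formula is standard, but the field names and rounding are assumptions, not the template's actual logic):

```python
# Hypothetical quota-attainment calculation for the per-rep aggregation.
def quota_attainment(closed_won: float, quota: float) -> float:
    """Percent of quota attained, rounded to one decimal place."""
    if quota <= 0:
        raise ValueError("quota must be positive")
    return round(100.0 * closed_won / quota, 1)

pct = quota_attainment(42_000, 50_000)
```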
by Sascha
Having a seamless flow of customer data between your online store and your marketing platform is essential. By keeping your systems synchronized, you can ensure that your marketing campaigns are accurately targeted and effective. The integration between Shopify, a leading e-commerce platform, and Mautic, an open-source marketing automation system, is not available out-of-the-box. However, with an n8n workflow you can bridge this gap. This template will help you: enhance accuracy in marketing lists by ensuring that subscription changes in Shopify are instantly updated in Mautic. improve compliance with data protection laws by respecting users' subscription preferences across platforms achieve integration without the need for additional plugins or software, minimizing complexity and potential points of failure. This template will demonstrate the following concepts in n8n: working with Shopify in n8n control flow with the IF node use Webhooks validate Webhooks with the Crypto node use the GraphQL node to call the Shopify Admin API The template consists of two parts: Sync Email Subscriptions from Shopify to Mautic Sync Email Subscriptions from Mautic to Shopify How to get started? Create a custom app in Shopify and get the credentials needed to connect n8n to Shopify. This is needed for the Shopify Trigger. Create Shopify Access Token API credentials in n8n for the Shopify trigger node. Create Header Auth credentials: Use X-Shopify-Access-Token as the name and the Access Token from the Shopify App you created as the value. The Header Auth is necessary for the GraphQL nodes. Enable the Mautic API under Configuration/API Settings. After the settings are saved you will have an additional entry in your settings menu to create API credentials for n8n. Create Mautic credentials in n8n. Please make sure to read the notes in the template. For a detailed explanation please check the corresponding video: https://youtu.be/x63rrh_yJzI
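Shopify signs each webhook by sending a base64-encoded HMAC-SHA256 of the raw request body in the X-Shopify-Hmac-Sha256 header; the Crypto node in this template recomputes and compares that digest. The same check, sketched in Python for reference (the body and secret below are made up):

```python
import base64
import hashlib
import hmac

# Recompute the X-Shopify-Hmac-Sha256 digest over the raw body and
# compare it to the header value, like the workflow's Crypto + IF nodes.
def verify_shopify_webhook(body: bytes, header_hmac: str, secret: str) -> bool:
    digest = hmac.new(secret.encode(), body, hashlib.sha256).digest()
    expected = base64.b64encode(digest).decode()
    return hmac.compare_digest(expected, header_hmac)

body = b'{"email":"jane@example.com","accepts_marketing":true}'
secret = "example-shared-secret"  # your app's webhook signing secret
good_header = base64.b64encode(
    hmac.new(secret.encode(), body, hashlib.sha256).digest()
).decode()
ok = verify_shopify_webhook(body, good_header, secret)
```

Using a constant-time comparison (`hmac.compare_digest`) rather than `==` avoids leaking information through timing differences.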
by David Ashby
What it is- I wanted to create a simple, easy-to-use, MCP server for your Discord bot(s). How to set up- Literally all you do is select your bot auth (or create a new Discord Bot auth if you haven't entered your key in n8n before) and that's IT! How to use it- You can now ask your bot to do things via any MCP client, including from within n8n workflows! Note: If you need an example, you can check out my simple quickstart Discord MCP Server that uses 4o to send messages to channels on your server and users who are members of the server the bot is in. Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator... all 100% free? Join the community