Chat with Gemini AI through Local CLI via SSH

This workflow allows you to integrate the Google Gemini CLI into your n8n AI Agents. It is designed for self-hosted n8n instances and enables you to chat with the Gemini CLI running on your local machine or server via SSH.

This is particularly useful if you want to use Gemini's free tier through Google's CLI tools, or if you need the AI to interact with local files on the host server.

How it works

- **AI Agent**: The main workflow uses a LangChain AI Agent with a custom tool.
- **Custom Tool**: When the agent needs to answer, it calls a sub-workflow ("Gemini CLI Worker").
- **SSH Execution**: The sub-workflow connects to your host machine via SSH, executes the gemini command with your prompt, and returns the CLI's standard output to the chat (see the command sketch below).
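For reference, the command the Execute a command node runs over SSH looks roughly like the sketch below. This is a minimal example, not the template's literal node configuration: the `-p`/`--prompt` flag and the `{{ $json.query }}` expression are assumptions about your installed Gemini CLI version and about how the sub-workflow exposes the incoming prompt, so adjust both to match your setup.

```bash
# Hypothetical command for the "Execute a command" node (runs on the SSH host).
# Assumes the Gemini CLI accepts a one-shot prompt via -p and that the
# sub-workflow exposes the agent's question as the n8n expression {{ $json.query }}.
gemini -p "{{ $json.query }}"

# If the prompt may contain quotes, piping it via stdin is usually safer:
echo "{{ $json.query }}" | gemini
```

Whatever the CLI prints to standard output is what the Execute a command node captures and the custom tool returns to the agent.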

Set up steps

1. Prerequisites: You must have Node.js (v20+) and the Gemini CLI installed on your host machine (an install sketch follows this list).
2. Split the workflows: Copy the bottom section (the sub-workflow), paste it into a new workflow, name it "Gemini CLI Worker", and save it. Note the ID of this new workflow.
3. Configure the main workflow: Open the Call 'Gemini CLI' node and, in the "Workflow ID" field, select the "Gemini CLI Worker" workflow you just saved.
4. Configure SSH: Open the Execute a command node in the sub-workflow and configure your SSH credentials (host, username, password or key) so n8n can connect to the host where the Gemini CLI is installed.
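If the CLI is not yet installed on the host, the sketch below shows a typical setup. The npm package name @google/gemini-cli is an assumption (the template does not state it), so verify it against Google's current Gemini CLI documentation before running anything.

```bash
# Confirm the host meets the Node.js v20+ prerequisite.
node --version

# Install the Gemini CLI globally. The package name is an assumption; check
# Google's documentation for the current one.
npm install -g @google/gemini-cli

# Verify the `gemini` command used by the sub-workflow is now on the PATH.
# The first interactive run typically walks you through Google authentication.
gemini --version
```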

Author: Alexandru Florea
Created: 12/19/2025
Updated: 1/18/2026
