Gemini CLI MCP Server

by epicsagas

About

Gemini CLI MCP Server bridges the Model Context Protocol (MCP) with Google's Gemini CLI, enabling AI agents such as Cursor and Claude Desktop to invoke Gemini commands directly. The server wraps locally installed `gemini-cli` functionality and exposes it as standardized MCP tools.

Key features:

  • Ask questions and interact with Gemini AI models through conversational interfaces
  • Git workflow automation, including commit generation and pull request management
  • Dual implementations available in Python (FastAPI) and Node.js
  • Multiple transport modes: stdio, HTTP, and Docker containerization
  • Distributed via PyPI (`pip install gemini-cli-mcp`) and npm (`npm install -g gemini-cli-mcp-server`)
  • Requires Gemini authentication via `gemini login` or the `GEMINI_API_KEY` environment variable

README

gemini-cli-mcp

> Quickstart for End Users:
>
> Install via PyPI (Python):
>
>     pip install gemini-cli-mcp
>
> Install via npm (Node.js):
>
>     npm install -g gemini-cli-mcp-server
>
> MCP Client Configuration:
> - Set the command in your MCP client (e.g., Cursor, Claude Desktop) to the absolute path of the installed `gemini-cli-mcp` executable.
> - Do not point to a local script or source file unless you are developing or debugging.
>
> For advanced usage, development, or troubleshooting, see the implementation-specific README files in `server_py/` (Python) or `server_node/` (Node.js).

> Authentication Requirement:
>
> Before using this server, you must either:
> - Log in to gemini-cli (e.g., by running `gemini login`) to maintain an active login session, or
> - Set your Gemini API key as the `GEMINI_API_KEY` environment variable.
>
> Without authentication, the server will not be able to invoke gemini-cli commands successfully.
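Before registering the server with an agent, it can be useful to verify that one of the two authentication paths is in place. The following is a small sanity-check sketch, not part of this project; note that finding a `gemini` binary on `PATH` only confirms the CLI is installed, not that `gemini login` has been run.

```python
import os
import shutil

def gemini_auth_available() -> bool:
    """Best-effort check for the two documented auth paths:
    a GEMINI_API_KEY in the environment, or a gemini binary on PATH
    (assumed to hold an active login session)."""
    if os.environ.get("GEMINI_API_KEY"):
        return True
    # shutil.which only proves the CLI exists, not that a login session is active.
    return shutil.which("gemini") is not None
```

If this returns `False`, run `gemini login` or export `GEMINI_API_KEY` before starting your MCP client.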

gemini-cli-mcp is a server that bridges the Model Context Protocol (MCP) with the locally installed gemini-cli. It allows modern AI agents, such as Cursor and Claude Desktop, to use gemini-cli's powerful features as Tools.

This server enables invoking key gemini-cli functionalities (including `ask` and ~~`agent`, `commit`, and `pr`~~) directly from your AI agent.

1. Project Overview

This project aims to provide a seamless integration between AI agents and gemini-cli, supporting multiple languages and environments to maximize developer experience.

Goals

  • MCP Compliance: Fully adhere to the MCP specification for stable integration.
  • Tool Abstraction: Expose core gemini-cli commands as MCP Tools.
  • Multi-language & Multi-environment Support: Provide implementations in Python and Node.js, supporting stdio, http, and Docker.
  • Effortless Deployment: Distribute via pip (PyPI) and npm.
Architecture

    flowchart LR
        A["AI Agent(Cursor, etc.)"]
        B["gemini-cli-mcp(Python or Node.js)"]
        C["gemini-cli"]
        A -- "MCP (stdio/http)" --> B
        B -- "Shell (Subprocess)" --> C
        C -- "Shell (Subprocess)" --> B
        B -- "MCP (stdio/http)" --> A
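Over the stdio transport, the agent-to-server leg of the diagram above is a JSON-RPC 2.0 message. The shape below follows the MCP specification's `tools/call` request; the tool name matches this server's `gemini_ask`, while the `id` and `arguments` values are illustrative placeholders.

```python
import json

# Illustrative MCP `tools/call` request an agent would send over stdio
# (JSON-RPC 2.0); the question text is an example placeholder.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "gemini_ask",
        "arguments": {"question": "Summarize this repository"},
    },
}
print(json.dumps(request))
```

The server replies with a matching JSON-RPC response whose result carries gemini-cli's output back to the agent.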
    

2. Implementations

This project provides separate, language-specific implementations. Please refer to the README.md file within each implementation directory for detailed setup and usage instructions.

  • Python: A server built with FastAPI.
  • Node.js: A server built with Node.js.

3. MCP Tool Reference

The server exposes gemini-cli commands as MCP tools. The core logic involves wrapping gemini-cli commands based on the tool called.

Available Tools

| Tool Name | Description | Main Params |
| --- | --- | --- |
| gemini_ask | Ask a question in Ask mode. | question (string) |
| gemini_yolo | Run a prompt in Agent mode with auto-execution. | prompt (string) |
| ~~gemini_git_commit~~ | ~~Generate a conventional commit message and perform git commit.~~ | ~~branch_name (string, optional)~~ |
| ~~gemini_git_pr~~ | ~~Automatically commit, push, and create a PR.~~ | ~~commit_message, branch_name, pr_title (all optional strings)~~ |
| ~~gemini_git_diff~~ | ~~Summarize code changes using Gemini AI.~~ | ~~diff_args (string, optional)~~ |

Command Translation Example

  • gemini_ask → gemini ask --model {model} --all_files --sandbox --prompt "{question}"
  • gemini_yolo → gemini agent --model {model} --all_files --sandbox --yolo --prompt "{prompt}"
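The translation above can be sketched as a small argument-vector builder. This is an illustration of the wrapping idea, not the project's actual code, and `build_argv` is a hypothetical helper name.

```python
def build_argv(tool: str, args: dict, model: str = "gemini-2.5-flash") -> list[str]:
    """Map an MCP tool call onto a gemini-cli argument vector,
    following the command translations shown above."""
    if tool == "gemini_ask":
        return ["gemini", "ask", "--model", model, "--all_files",
                "--sandbox", "--prompt", args["question"]]
    if tool == "gemini_yolo":
        return ["gemini", "agent", "--model", model, "--all_files",
                "--sandbox", "--yolo", "--prompt", args["prompt"]]
    raise ValueError(f"unknown tool: {tool}")
```

A server would hand this list to something like `subprocess.run(argv, capture_output=True)`; passing the arguments as a list rather than a shell string avoids quoting problems when the prompt contains spaces or special characters.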
4. MCP Client Configuration & Usage

There is no need to start the server manually:

  • The MCP client will launch the process and communicate via stdio.
  • Just register the following configuration.

#### Cursor, Windsurf Example

    // cursor: $HOME/.cursor/mcp.json
    // windsurf: $HOME/.codeium/windsurf/mcp_config.json
    {
      "mcpServers": {
        "gemini-cli-mcp": {
          "type": "stdio",
          "command": "gemini-cli-mcp", // gemini-cli-mcp-node for node
          "env": {
            "GEMINI_MODEL": "gemini-2.5-flash",
            "PROJECT_ROOT": "/path/to/project_root"
          }
        }
      }
    }
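If you prefer to script this registration, a small helper along the following lines can merge the entry into an existing config file. This is a convenience sketch, not part of the package; `register_server` is a hypothetical name.

```python
import json
from pathlib import Path

def register_server(config_path: Path, project_root: str) -> None:
    """Merge the gemini-cli-mcp entry into an MCP client config file,
    preserving any servers already registered there."""
    config = json.loads(config_path.read_text()) if config_path.exists() else {}
    config.setdefault("mcpServers", {})["gemini-cli-mcp"] = {
        "type": "stdio",
        "command": "gemini-cli-mcp",  # use gemini-cli-mcp-node for the Node.js build
        "env": {
            "GEMINI_MODEL": "gemini-2.5-flash",
            "PROJECT_ROOT": project_root,
        },
    }
    config_path.write_text(json.dumps(config, indent=2))
```

Note that the helper emits strict JSON; the hand-written examples here use `//` comments, which some clients tolerate but the `json` module does not.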
    

#### Claude Desktop Example

```json
// Settings > Developer > Edit Config > claude_desktop_config.json
// find command location with `which gemini-cli-mcp`
// MUST provide a Gemini API key to use with Claude Desktop
{
  "mcpServers": {
    "gemini-cli-mcp": {
      "type": "stdio",
      "command": "gemini-cli-mcp",
      "env": {
        "GEMINI_API_KEY": "your-api-key",
        "GEMINI_MODEL": "gemini-2.5-flash",
        "PROJECT_ROOT": "/path/to/project_root"
      }
    }
  }
}
```
