Price Per Token

MCP Bridge MCP Server

by patruff


About

MCP-LLM Bridge is a TypeScript middleware that connects local LLMs running via Ollama to the Model Context Protocol (MCP) ecosystem. It translates between LLM outputs and MCP's JSON-RPC protocol, enabling open-source models to use external tools and services just like cloud-based assistants. Key features:

  • Connects Ollama-compatible models to multiple MCP servers, including filesystem access, Brave web search, GitHub operations, memory persistence, Flux image generation, and Google Drive/Gmail integration
  • Provides dynamic tool routing and structured output validation for reliable execution
  • Offers multi-server MCP orchestration through a unified local endpoint
  • Supports configuration-driven setup with credential management for each service
  • Includes robust process management, automatic tool detection from prompts, and detailed logging

README

MCP-LLM Bridge

A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants.

Overview

This project bridges local Large Language Models with MCP servers that provide various capabilities like:

  • Filesystem operations
  • Brave web search
  • GitHub interactions
  • Google Drive & Gmail integration
  • Memory/storage
  • Image generation with Flux

The bridge translates between the LLM's outputs and MCP's JSON-RPC protocol, allowing any Ollama-compatible model to use these tools just like Claude does.
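Conceptually, that translation step turns an LLM tool call into an MCP JSON-RPC `tools/call` request. A minimal sketch is below; the `LlmToolCall` and `toJsonRpcRequest` names are illustrative placeholders, not the bridge's actual API.

```typescript
// Sketch: translating an LLM tool call into an MCP JSON-RPC request.
interface LlmToolCall {
  name: string;
  arguments: Record<string, unknown>;
}

interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params: { name: string; arguments: Record<string, unknown> };
}

function toJsonRpcRequest(call: LlmToolCall, id: number): JsonRpcRequest {
  return {
    jsonrpc: "2.0",          // fixed by the JSON-RPC 2.0 spec
    id,                      // request id, used to match the response
    method: "tools/call",    // MCP method for invoking a tool
    params: { name: call.name, arguments: call.arguments },
  };
}
```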

    Current Setup

  • LLM: Using Qwen 2.5 7B (qwen2.5-coder:7b-instruct) through Ollama
  • MCPs:
    - Filesystem operations (@modelcontextprotocol/server-filesystem)
    - Brave Search (@modelcontextprotocol/server-brave-search)
    - GitHub (@modelcontextprotocol/server-github)
    - Memory (@modelcontextprotocol/server-memory)
    - Flux image generation (@patruff/server-flux)
    - Gmail & Drive (@patruff/server-gmail-drive)

    Architecture

  • Bridge: Core component that manages tool registration and execution
  • LLM Client: Handles Ollama interactions and formats tool calls
  • MCP Client: Manages MCP server connections and JSON-RPC communication
  • Tool Router: Routes requests to appropriate MCP based on tool type
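The routing idea can be sketched as a simple name-to-server map; the `ToolRouter` class and identifiers below are illustrative, not the bridge's actual implementation.

```typescript
// Sketch: route each registered tool name to the MCP server that provides it.
type McpServerName = string;

class ToolRouter {
  private routes = new Map<string, McpServerName>();

  // Called during startup, once per tool advertised by each MCP server.
  register(toolName: string, server: McpServerName): void {
    this.routes.set(toolName, server);
  }

  // Called per tool invocation; unknown tools are an error, not a silent no-op.
  route(toolName: string): McpServerName {
    const server = this.routes.get(toolName);
    if (!server) throw new Error(`No MCP server registered for tool: ${toolName}`);
    return server;
  }
}
```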

    Key Features

  • Multi-MCP support with dynamic tool routing
  • Structured output validation for tool calls
  • Automatic tool detection from user prompts
  • Robust process management for Ollama
  • Detailed logging and error handling

    Setup

    1. Install Ollama and required model:

    ollama pull qwen2.5-coder:7b-instruct
    

    2. Install MCP servers:

    npm install -g @modelcontextprotocol/server-filesystem
    npm install -g @modelcontextprotocol/server-brave-search
    npm install -g @modelcontextprotocol/server-github
    npm install -g @modelcontextprotocol/server-memory
    npm install -g @patruff/server-flux
    npm install -g @patruff/server-gmail-drive
    

    3. Configure credentials:

       - Set BRAVE_API_KEY for Brave Search
       - Set GITHUB_PERSONAL_ACCESS_TOKEN for GitHub
       - Set REPLICATE_API_TOKEN for Flux
       - Run the Gmail/Drive MCP auth flow: node path/to/gmail-drive/index.js auth
         (for example: node C:\Users\patru\AppData\Roaming\npm\node_modules\@patruff\server-gmail-drive\dist\index.js auth)
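On Linux/macOS the API credentials can be exported as environment variables before starting the bridge (Windows users would use `set` or `$env:` instead). The values below are placeholders, not real tokens:

```shell
# Placeholder values -- substitute your own credentials.
export BRAVE_API_KEY="your-brave-api-key"
export GITHUB_PERSONAL_ACCESS_TOKEN="your-github-token"
export REPLICATE_API_TOKEN="your-replicate-token"
```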

    Configuration

    The bridge is configured through bridge_config.json:

  • MCP server definitions
  • LLM settings (model, temperature, etc.)
  • Tool permissions and paths

    Example:

    {
      "mcpServers": {
        "filesystem": {
          "command": "node",
          "args": ["path/to/server-filesystem/dist/index.js"],
          "allowedDirectory": "workspace/path"
        },
        // ... other MCP configurations
      },
      "llm": {
        "model": "qwen2.5-coder:7b-instruct",
        "baseUrl": "http://localhost:11434"
      }
    }
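A minimal sketch of parsing and sanity-checking such a config follows; the `BridgeConfig` shape is inferred from the example above, not the bridge's actual schema.

```typescript
// Sketch: parse bridge_config.json content and fail fast on missing sections.
interface McpServerEntry {
  command: string;
  args: string[];
  allowedDirectory?: string; // only some servers (e.g. filesystem) need this
}

interface BridgeConfig {
  mcpServers: Record<string, McpServerEntry>;
  llm: { model: string; baseUrl: string };
}

function parseConfig(json: string): BridgeConfig {
  const config = JSON.parse(json) as BridgeConfig;
  // The bridge cannot run without server definitions and an LLM model name.
  if (!config.mcpServers || !config.llm || !config.llm.model) {
    throw new Error("bridge_config.json must define mcpServers and llm.model");
  }
  return config;
}
```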
    

    Usage

    1. Start the bridge:

    npm run start
    

    2. Available commands:

       - list-tools: Show available tools
       - Regular text: Send prompts to the LLM
       - quit: Exit the program

    Example interactions:

    > Search the web for "latest TypeScript features"
    [Uses Brave Search MCP to find results]

    > Create a new folder called "project-docs"
    [Uses Filesystem MCP to create directory]

    > Send an email to user@example.com
    [Uses Gmail MCP to compose and send email]

    Technical Details

    Tool Detection

    The bridge includes smart tool detection based on user input:
  • Email operations: Detected by email addresses and keywords
  • Drive operations: Detected by file/folder keywords
  • Search operations: Contextually routed to appropriate search tool
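This kind of keyword/pattern detection can be sketched as below; the regex, keywords, and tool names are illustrative, not the bridge's actual rules.

```typescript
// Sketch: pick a tool based on simple patterns in the user's prompt.
const EMAIL_RE = /\b[\w.+-]+@[\w-]+\.[\w.]+\b/; // rough email-address pattern

function detectTool(prompt: string): string | null {
  const p = prompt.toLowerCase();
  if (EMAIL_RE.test(prompt) || p.includes("email")) return "gmail";
  if (p.includes("folder") || p.includes("file") || p.includes("drive")) return "drive";
  if (p.includes("search")) return "web-search";
  return null; // no tool matched: fall through to plain LLM chat
}
```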

    Response Processing

    Responses are processed through multiple stages:

    1. The LLM generates structured tool calls
    2. The bridge validates the calls and routes them to the appropriate MCP
    3. The MCP executes the operation and returns the result
    4. The bridge formats the response for the user
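The stages above can be sketched as an async pipeline; the types and the `generateToolCall`/`execute` parameters are illustrative placeholders, not the bridge's actual interfaces.

```typescript
// Sketch of the response-processing stages as one async pipeline.
interface ToolCall { name: string; arguments: Record<string, unknown>; }
interface ToolResult { content: string; }

async function handlePrompt(
  generateToolCall: (prompt: string) => Promise<ToolCall>, // stage 1: LLM output
  execute: (call: ToolCall) => Promise<ToolResult>,        // stage 3: MCP execution
  prompt: string,
): Promise<string> {
  const call = await generateToolCall(prompt);
  if (!call.name) throw new Error("invalid tool call");    // stage 2: validation
  const result = await execute(call);                      // (routing happens inside execute)
  return `[${call.name}] ${result.content}`;               // stage 4: format for the user
}
```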

    Extended Capabilities

    This bridge effectively brings Claude's tool capabilities to local models:

  • Filesystem manipulation
  • Web search and research
  • Email and document management
  • Code and GitHub interactions
  • Image generation
  • Persistent memory

    All while running completely locally with open-source models.

    Future Improvements

  • Add support for more MCPs
  • Implement parallel tool execution
  • Add streaming responses
  • Enhance error recovery
  • Add conversation memory
  • Support more Ollama models

    Related Projects

    This bridge integrates with the broader Claude ecosystem:

  • Model Context Protocol (MCP)
  • Claude Desktop Configuration
  • Ollama Project
  • Various MCP server implementations

    The result is a powerful local AI assistant that can match many of Claude's capabilities while running entirely on your own hardware.

    Related MCP Servers

    AI Research Assistant

    by hamid-vakilzadeh

    AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key. Key features include:

  • Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
  • Title matching with confidence scoring for finding specific papers
  • Batch operations supporting up to 500 papers per request
  • Citation analysis and network exploration for understanding research relationships
  • Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
  • Rate limits of 100 requests per 5 minutes, with options to request higher limits through Semantic Scholar

    Web & Search
    Linkup

    by LinkupPlatform

    Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks. Key features of Linkup:

  • Real-time web search using natural language queries to find current information, news, and data
  • Page fetching to extract and read content from any webpage URL
  • Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
  • Source-backed results with citations and context from relevant, trustworthy websites
  • JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages

    Web & Search
    Math-MCP

    by EthanHenrickson

    Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning. Key features of Math-MCP:

  • Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
  • Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
  • Rounding utilities: floor, ceiling, and nearest-integer rounding
  • Trigonometric functions: sine, cosine, tangent, and their inverses, with degrees/radians conversion support

    Developer Tools