MindBridge

by pinkpixel-dev

About

MindBridge is an AI model router and orchestration hub that unifies access to multiple LLM providers through a single interface. It eliminates vendor lock-in by letting you route requests between OpenAI, Anthropic, Google, DeepSeek, OpenRouter, Ollama (local models), and other OpenAI-compatible APIs, selecting the right model for each task automatically or on demand.

Key features:

  • Multi-provider support: Connect to OpenAI, Claude, Gemini, DeepSeek, OpenRouter, Groq, and local models via Ollama
  • Smart routing: Auto-direct requests to reasoning-optimized models like Claude or DeepSeek Reasoner for complex tasks, or faster/cheaper models for simple queries
  • getSecondOpinion: Query multiple models simultaneously and compare responses side-by-side for consensus or diverse perspectives
  • OpenAI-compatible API layer: Drop-in replacement for OpenAI endpoints that works with Azure, Together.ai, and other compatible services
  • Provider auto-detection: Automatic setup and discovery once API keys are configured via environment variables or JSON config
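As a rough illustration, an MCP client might invoke the getSecondOpinion tool with a request shaped like the sketch below. The argument names (prompt, providers) are illustrative assumptions, not the server's documented schema; consult the tool's actual schema as reported by the server.

```json
{
  "method": "tools/call",
  "params": {
    "name": "getSecondOpinion",
    "arguments": {
      "prompt": "What's the worst-case time complexity of quicksort?",
      "providers": ["openai", "anthropic", "deepseek"]
    }
  }
}
```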

README

MindBridge MCP Server ⚡ The AI Router for Big Brain Moves

MindBridge is your AI command hub — a Model Context Protocol (MCP) server built to unify, organize, and supercharge your LLM workflows.

Forget vendor lock-in. Forget juggling a dozen APIs. MindBridge connects your apps to *any* model, from OpenAI and Anthropic to Ollama and DeepSeek — and lets them talk to each other like a team of expert consultants.

Need raw speed? Grab a cheap model. Need complex reasoning? Route it to a specialist. Want a second opinion? MindBridge has that built in.

This isn't just model aggregation. It's model orchestration.

---

Core Features 🔥

| What it does | Why you should use it |
|--------------|-----------------------|
| Multi-LLM Support | Instantly switch between OpenAI, Anthropic, Google, DeepSeek, OpenRouter, Ollama (local models), and OpenAI-compatible APIs. |
| Reasoning Engine Aware | Smart routing to models built for deep reasoning like Claude, GPT-4o, DeepSeek Reasoner, etc. |
| getSecondOpinion Tool | Ask multiple models the same question to compare responses side-by-side. |
| OpenAI-Compatible API Layer | Drop MindBridge into any tool expecting OpenAI endpoints (Azure, Together.ai, Groq, etc.). |
| Auto-Detects Providers | Just add your keys. MindBridge handles setup & discovery automagically. |
| Flexible as Hell | Configure everything via env vars, MCP config, or JSON — it's your call. |
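The "reasoning engine aware" routing in the table can be sketched in a few lines. This is an illustrative toy, not MindBridge's actual implementation; the function name, pool structure, and model names are all assumptions for the sake of the example.

```python
# Toy sketch of smart routing: send hard tasks to a reasoning-optimized
# model and everything else to a fast/cheap one. Model names are examples
# only, not a list MindBridge ships with.

REASONING_MODELS = ["deepseek-reasoner", "claude-3-5-sonnet-20241022"]
FAST_MODELS = ["gpt-4o-mini", "llama3"]

def route(prompt: str, needs_reasoning: bool = False) -> str:
    """Return the first model from the pool that fits the task."""
    pool = REASONING_MODELS if needs_reasoning else FAST_MODELS
    return pool[0]

print(route("Summarize this paragraph."))                          # fast model
print(route("Prove this invariant holds.", needs_reasoning=True))  # reasoning model
```

A real router would also weigh cost, latency, and provider availability before picking from the pool.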

---

Why MindBridge?

> *"Every LLM is good at something. MindBridge makes them work together."*

Perfect for:

  • Agent builders
  • Multi-model workflows
  • AI orchestration engines
  • Reasoning-heavy tasks
  • Building smarter AI dev environments
  • LLM-powered backends
  • Anyone tired of vendor walled gardens

---

Installation 🛠️

Option 1: Install from npm (Recommended)

```bash
# Install globally
npm install -g @pinkpixel/mindbridge

# Or use with npx
npx @pinkpixel/mindbridge
```

Installing via Smithery

To install mindbridge-mcp for Claude Desktop automatically via Smithery:

```bash
npx -y @smithery/cli install @pinkpixel-dev/mindbridge-mcp --client claude
```

Option 2: Install from source

1. Clone the repository:

   ```bash
   git clone https://github.com/pinkpixel-dev/mindbridge.git
   cd mindbridge
   ```

2. Install dependencies:

   ```bash
   chmod +x install.sh
   ./install.sh
   ```

3. Configure environment variables:

   ```bash
   cp .env.example .env
   ```

   Edit .env and add your API keys for the providers you want to use.

Configuration ⚙️

Environment Variables

The server supports the following environment variables:

  • OPENAI_API_KEY: Your OpenAI API key
  • ANTHROPIC_API_KEY: Your Anthropic API key
  • DEEPSEEK_API_KEY: Your DeepSeek API key
  • GOOGLE_API_KEY: Your Google AI API key
  • OPENROUTER_API_KEY: Your OpenRouter API key
  • OLLAMA_BASE_URL: Ollama instance URL (default: http://localhost:11434)
  • OPENAI_COMPATIBLE_API_KEY: (Optional) API key for OpenAI-compatible services
  • OPENAI_COMPATIBLE_API_BASE_URL: Base URL for OpenAI-compatible services
  • OPENAI_COMPATIBLE_API_MODELS: Comma-separated list of available models
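A minimal .env might look like the fragment below. The values are placeholders, and you only need to set the keys for providers you actually use:

```bash
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
OLLAMA_BASE_URL=http://localhost:11434
```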
MCP Configuration

For use with MCP-compatible IDEs like Cursor or Windsurf, you can use the following configuration in your mcp.json file:

```json
{
  "mcpServers": {
    "mindbridge": {
      "command": "npx",
      "args": ["-y", "@pinkpixel/mindbridge"],
      "env": {
        "OPENAI_API_KEY": "OPENAI_API_KEY_HERE",
        "ANTHROPIC_API_KEY": "ANTHROPIC_API_KEY_HERE",
        "GOOGLE_API_KEY": "GOOGLE_API_KEY_HERE",
        "DEEPSEEK_API_KEY": "DEEPSEEK_API_KEY_HERE",
        "OPENROUTER_API_KEY": "OPENROUTER_API_KEY_HERE"
      },
      "provider_config": {
        "openai": { "default_model": "gpt-4o" },
        "anthropic": { "default_model": "claude-3-5-sonnet-20241022" },
        "google": { "default_model": "gemini-2.0-flash" },
        "deepseek": { "default_model": "deepseek-chat" },
        "openrouter": { "default_model": "openai/gpt-4o" },
        "ollama": {
          "base_url": "http://localhost:11434",
          "default_model": "llama3"
        },
        "openai_compatible": {
          "api_key": "API_KEY_HERE_OR_REMOVE_IF_NOT_NEEDED",
          "base_url": "FULL_API_URL_HERE",
          "available_models": ["MODEL1", "MODEL2"],
          "default_model": "MODEL1"
        }
      },
      "default_params": {
```
