HexDocs MCP

by bradleygolden

About

HexDocs MCP enables semantic search of Hex package documentation directly from your editor. It combines an Elixir binary for downloading and embedding documentation with a TypeScript MCP server that integrates with AI coding assistants.

Key features:

- Semantic search across Hex package documentation using embeddings
- Automatic downloading and processing of Elixir package docs
- Integration with MCP-compatible clients including Cursor, Claude Desktop, and Continue
- Local embedding generation via Ollama using models like mxbai-embed-large
- Supports the Hex package manager ecosystem for Elixir

README

HexDocs MCP

HexDocs MCP is a project that provides semantic search capabilities for Hex package documentation, designed specifically for AI applications. It consists of two main components:

1. An Elixir binary that downloads, processes, and generates embeddings from Hex package documentation
2. A TypeScript server implementing the Model Context Protocol (MCP) that calls the Elixir binary to fetch and search documentation

> [!CAUTION]
> This documentation reflects the current development state on the main branch.
> For documentation on the latest stable release, please see the latest release page and the latest release branch.

Installation

MCP Client Configuration

The TypeScript MCP server implements the Model Context Protocol (MCP) and is designed to be used by MCP-compatible clients such as Cursor, Claude Desktop App, Continue, and others. The server provides tools for semantic search of Hex documentation. For a complete list of MCP-compatible clients, see the MCP Clients documentation.

Add this to your client's MCP json config:

{
  "mcpServers": {
    "hexdocs-mcp": {
      "command": "npx",
      "args": [
        "-y",
        "hexdocs-mcp@0.5.0"
      ]
    }
  }
}

This command automatically downloads the Elixir binaries used to fetch and search documentation. While the server handles downloading the binaries, you still need Elixir and Mix installed on your system for the HexDocs fetching functionality to work properly.
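If you want to confirm those prerequisites before first use, the standard version checks are enough (a quick sanity sketch, not part of the server itself):

    # Verify the local toolchain the server relies on
    elixir --version   # Elixir 1.16+ and Erlang/OTP 26+ (see Requirements below)
    mix --version      # Mix ships with the Elixir installation
    node --version     # Node.js 22+ is needed to run the MCP server via npx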

Smithery

Alternatively, you can use Smithery to automatically add the MCP server to your client config.

For example, for Cursor, you can use the following command:

npx -y @smithery/cli@latest install @bradleygolden/hexdocs-mcp --client cursor

Elixir Package

Alternatively, you can add the hexdocs_mcp package to your project if you don't want to use the MCP server.

{:hexdocs_mcp, "~> 0.5.0", only: :dev, runtime: false}

If you use floki or any other dependency that is marked as available only in another environment, update it to also be available in the :dev environment.

For example floki is commonly used in :test:

{:floki, ">= 0.30.0", only: :test}

But you can update it to be available in the :dev environment:

{:floki, ">= 0.30.0", only: [:dev, :test]}
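= 0.30.0", only: [:dev, :test]}">
Putting the two snippets together, the relevant part of a mix.exs deps list would look roughly like the following sketch (your other dependencies stay as they are):

    # mix.exs (sketch): hexdocs_mcp is a dev-only tool, so runtime: false
    defp deps do
      [
        {:hexdocs_mcp, "~> 0.5.0", only: :dev, runtime: false},
        # floki widened from only: :test so the dev-only tooling can use it
        {:floki, ">= 0.30.0", only: [:dev, :test]}
      ]
    end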

Requirements

- Ollama - Required for generating embeddings
  - Run ollama pull mxbai-embed-large to download the recommended embedding model (see the verification sketch after this list)
  - Ensure Ollama is running before using the embedding features
- Elixir 1.16+ and Erlang/OTP 26+
  - Installed automatically in CI environments
  - Required locally for development
- Mix - The Elixir build tool (comes with Elixir installation)
- Node.js 22 or later (for the MCP server)
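A minimal verification sketch for the Ollama requirement (standard Ollama CLI commands; run once before using the embedding features):

    # Download the recommended embedding model, then confirm Ollama is running
    # and the model is available locally
    ollama pull mxbai-embed-large
    ollama list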
Breaking Change: Model Migration (v0.6.0+)

⚠️ IMPORTANT: Version 0.6.0 introduces a breaking change with the default embedding model.

What changed:

- Default model changed from nomic-embed-text (384 dimensions) to mxbai-embed-large (1024 dimensions)
- Existing embeddings are incompatible and will be cleared during upgrade

To upgrade:

1. Pull the new model:

       ollama pull mxbai-embed-large

2. Your existing embeddings will be automatically cleared when you first run any command

3. Regenerate embeddings for your packages:

       mix hex.docs.mcp fetch_docs phoenix

Why this change: mxbai-embed-large provides significantly better semantic search quality and consistent dimensions across all platforms (Windows/macOS/Linux).

Configuration

Environment Variables

The following environment variables can be used to configure the tool:

| Variable | Description | Default |
|----------|-------------|---------|
| HEXDOCS_MCP_PATH | Path where data will be stored | ~/.hexdocs_mcp |
| HEXDOCS_MCP_MIX_PROJECT_PATHS | Comma-separated list of paths to mix.exs files | (none) |

Examples:

    # Set custom storage location
    export HEXDOCS_MCP_PATH=/path/to/custom/directory

    # Configure common project paths to avoid specifying --project flag each time
    export HEXDOCS_MCP_MIX_PROJECT_PATHS="/path/to/project1/mix.exs,/path/to/project2/mix.exs"

MCP Server Configuration

You can also configure environment variables in the MCP configuration for the server:

{
  "mcpServers": {
    "hexdocs-mcp": {
      "command": "...",
      "args": [
        "..."
      ],
      "env": {
        "HEXDOCS_MCP_PATH": "/path/to/custom/directory",
        "HEXDOCS_MCP_MIX_PROJECT_PATHS": "/path/to/project1/mix.exs,/path/to/project2/mix.exs"
      }
    }
  }
}

Usage

AI Tooling

The MCP server can be used by any MCP-compatible AI tool.
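A typical workflow sketch (the fetch command is the same Mix task shown in the upgrade notes; the exact --project invocation is a guess based on the Configuration section above):

    # Pre-fetch and embed a package's docs so the MCP server can search them
    mix hex.docs.mcp fetch_docs phoenix

    # Guessing at the --project form mentioned in Configuration: point the task
    # at a specific mix.exs instead of setting HEXDOCS_MCP_MIX_PROJECT_PATHS
    mix hex.docs.mcp fetch_docs phoenix --project /path/to/project1/mix.exs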
