
LangGraph Docs MCP Server

by langchain-ai


About

LangGraph Docs MCP Server provides transparent, user-controlled access to llms.txt documentation indexes for LangGraph, LangChain, and other supported libraries. Instead of relying on opaque built-in tools within IDEs, developers can audit every tool call and the exact context retrieved. Key capabilities:

  • A fetch_docs tool that reads URLs from user-defined llms.txt files
  • Pre-configured support for LangGraph (Python and JS) and LangChain (Python and JS) documentation
  • Security controls with strict domain access restrictions based on configured llms.txt sources
  • Full auditability of documentation retrieval for Cursor, Windsurf, Claude Code/Desktop, and other MCP hosts

README

MCP LLMS-TXT Documentation Server

Overview

llms.txt is a website index for LLMs, providing background information, guidance, and links to detailed markdown files. IDEs like Cursor and Windsurf or apps like Claude Code/Desktop can use llms.txt to retrieve context for tasks. However, these apps use different built-in tools to read and process files like llms.txt. The retrieval process can be opaque, and there is not always a way to audit the tool calls or the context returned.

MCP offers a way for developers to have *full control* over the tools used by these applications. Here, we create an open-source MCP server to provide MCP host applications (e.g., Cursor, Windsurf, Claude Code/Desktop) with (1) a user-defined list of llms.txt files and (2) a simple fetch_docs tool to read URLs within any of the provided llms.txt files. This allows the user to audit each tool call as well as the context returned.
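To illustrate what a tool like fetch_docs works with, here is a minimal sketch (not mcpdoc's actual implementation; the function names are invented for this example) that fetches an llms.txt index and extracts the markdown links it lists:

```python
import re
import urllib.request

def parse_llms_txt(text: str) -> list[str]:
    """Extract doc URLs from an llms.txt index (markdown [title](url) links)."""
    return re.findall(r"\]\((https?://[^)\s]+)\)", text)

def load_llms_txt(url: str) -> list[str]:
    """Fetch an llms.txt file and return the documentation URLs it lists."""
    with urllib.request.urlopen(url) as resp:
        return parse_llms_txt(resp.read().decode())
```

A host application can then present those URLs to the model, which requests specific pages on demand rather than receiving them through an opaque built-in retriever.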

llms-txt

You can find llms.txt files for langgraph and langchain here:

| Library | llms.txt |
|------------------|-----------------------------------------------------|
| LangGraph Python | https://langchain-ai.github.io/langgraph/llms.txt |
| LangGraph JS | https://langchain-ai.github.io/langgraphjs/llms.txt |
| LangChain Python | https://python.langchain.com/llms.txt |
| LangChain JS | https://js.langchain.com/llms.txt |

Quickstart

#### Install uv

  • curl -LsSf https://astral.sh/uv/install.sh | sh
  • See the official uv docs for other ways to install uv.
    

    #### Choose an llms.txt file to use.

  • For example, here's the LangGraph llms.txt file.
  > **Note: Security and Domain Access Control**
  >
  > For security reasons, mcpdoc implements strict domain access controls:
  >
  > 1. **Remote llms.txt files**: When you specify a remote llms.txt URL (e.g., https://langchain-ai.github.io/langgraph/llms.txt), mcpdoc automatically adds only that specific domain (langchain-ai.github.io) to the allowed domains list. This means the tool can only fetch documentation from URLs on that domain.
  >
  > 2. **Local llms.txt files**: When using a local file, NO domains are automatically added to the allowed list. You MUST explicitly specify which domains to allow using the --allowed-domains parameter.
  >
  > 3. **Adding additional domains**: To allow fetching from domains beyond those automatically included:
  >    - Use --allowed-domains domain1.com domain2.com to add specific domains
  >    - Use --allowed-domains '*' to allow all domains (use with caution)
  >
  > This security measure prevents unauthorized access to domains not explicitly approved by the user, ensuring that documentation can only be retrieved from trusted sources.
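The allow-list rules above can be sketched as follows. This is an illustrative approximation, not mcpdoc's actual code, and the function names are invented:

```python
from urllib.parse import urlparse

def build_allowed_domains(llms_txt_urls, extra_domains=()):
    """Collect hostnames of remote llms.txt sources, plus any user-supplied extras.
    Local file paths contribute nothing, matching rule 2 above."""
    allowed = {urlparse(u).hostname for u in llms_txt_urls if u.startswith("http")}
    allowed.update(extra_domains)
    return allowed

def is_fetch_allowed(url, allowed):
    """Permit a fetch only if the URL's host is allow-listed, or '*' was given."""
    return "*" in allowed or urlparse(url).hostname in allowed
```

Under this scheme, configuring the remote LangGraph llms.txt permits fetches from langchain-ai.github.io but rejects any other host unless it is added explicitly.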

    #### (Optional) Test the MCP server locally with your llms.txt file(s) of choice:

    uvx --from mcpdoc mcpdoc \
        --urls "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt" "LangChain:https://python.langchain.com/llms.txt" \
        --transport sse \
        --port 8082 \
        --host localhost
    

  • This should run at: http://localhost:8082
  • Run MCP inspector and connect to the running server:
  • npx @modelcontextprotocol/inspector
    

  • Here, you can test the tool calls.
    #### Connect to Cursor

  • Open Cursor Settings and MCP tab.
  • This will open the ~/.cursor/mcp.json file.
  • Paste the following into the file (we use the langgraph-docs-mcp name and link to the LangGraph llms.txt).
  • {
      "mcpServers": {
        "langgraph-docs-mcp": {
          "command": "uvx",
          "args": [
            "--from",
            "mcpdoc",
            "mcpdoc",
            "--urls",
            "LangGraph:https://langchain-ai.github.io/langgraph/llms.txt LangChain:https://python.langchain.com/llms.txt",
            "--transport",
            "stdio"
          ]
        }
      }
    }
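Before restarting Cursor, it can help to confirm the file parses as valid JSON, since a stray comma will silently break the config. A small sketch (the helper name is made up; the default path assumes Cursor's standard ~/.cursor/mcp.json location):

```shell
# check_mcp_config: validate an MCP config file as JSON.
# Defaults to Cursor's ~/.cursor/mcp.json; pass another path to override.
check_mcp_config() {
  python3 -m json.tool "${1:-$HOME/.cursor/mcp.json}" > /dev/null && echo "valid JSON"
}
```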
    

  • Confirm that the server is running in your Cursor Settings/MCP tab.