
MCP Server for WinDbg Crash Analysis

by svnscha


About

MCP Server for WinDbg Crash Analysis enables AI assistants to analyze Windows crash dumps and perform live debugging through integration with WinDbg and CDB (Console Debugger).

Key features:

  • Windows crash dump examination and analysis using natural-language queries
  • Live and remote debugging session connectivity via CDB/WinDbg
  • Directory analysis for processing multiple dumps to identify patterns
  • Direct execution of debugger commands such as stack traces, memory inspection, and access-violation analysis
  • Support for custom symbol paths and configurable command timeouts
  • Multiple transport protocols, including stdio (for VS Code, Claude Desktop) and streamable-http
  • Compatible with Debugging Tools for Windows from the Windows SDK or the Microsoft Store WinDbg
  • Requires Python 3.10+ and Windows with the debugging tools installed

README

MCP Server for WinDbg Crash Analysis

A Model Context Protocol server that bridges AI models with WinDbg for crash dump analysis and remote debugging.

Overview

This MCP server integrates with CDB to enable AI models to analyze Windows crash dumps and connect to remote debugging sessions using WinDbg/CDB.

What is this?

An AI-powered tool that bridges LLMs with WinDbg for crash dump analysis and live debugging. Execute debugger commands through natural language queries like *"Show me the call stack and explain this access violation"*.

What This is Not

Not a magical auto-fix solution. It's a Python wrapper around CDB that leverages LLM knowledge to assist with debugging.
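To make "a Python wrapper around CDB" concrete, here is a minimal sketch of how such a wrapper might assemble a cdb.exe invocation. The helper name and structure are assumptions for illustration, not the actual mcp-windbg internals; the `-z` (open dump) and `-y` (symbol path) flags are standard CDB options.

```python
# Hypothetical sketch of a thin CDB wrapper: build the cdb.exe command line
# for opening a crash dump. Illustrative only, not mcp-windbg's implementation.

def build_cdb_command(cdb_path, dump_path, symbols_path=None):
    """Assemble a cdb.exe invocation: -z opens a crash dump,
    -y sets the symbol search path."""
    cmd = [cdb_path, "-z", dump_path]
    if symbols_path:
        cmd += ["-y", symbols_path]
    return cmd
```

The resulting list would then typically be run with `subprocess.Popen`, feeding debugger commands to stdin and reading results back from stdout.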

Usage Modes

  • Crash Dump Analysis: Examine Windows crash dumps
  • Live Debugging: Connect to remote debugging targets
  • Directory Analysis: Process multiple dumps for patterns

Quick Start

    Prerequisites

  • Windows with Debugging Tools for Windows or WinDbg from the Microsoft Store
  • Python 3.10 or higher
  • Any MCP-compatible client (GitHub Copilot, Claude Desktop, Cline, Cursor, Windsurf, etc.)
  • An MCP server configuration in your chosen client

    > [!TIP]
    > In enterprise environments, MCP server usage might be restricted by organizational policies. Check with your IT team about AI tool usage and ensure you have the necessary permissions before proceeding.

    Installation

    pip install mcp-windbg
    

    Transport Options

    The MCP server supports multiple transport protocols:

    | Transport | Description | Use Case |
    |-----------|-------------|----------|
    | stdio (default) | Standard input/output | Local MCP clients like VS Code, Claude Desktop |
    | streamable-http | Streamable HTTP | Modern HTTP clients with bidirectional streaming |

    Starting with Different Transports

    Standard I/O (default):

    mcp-windbg
    

    or explicitly:

    mcp-windbg --transport stdio

    Streamable HTTP:

    mcp-windbg --transport streamable-http --host 127.0.0.1 --port 8000
    
    Endpoint: http://127.0.0.1:8000/mcp

    Command Line Options

    --transport {stdio,streamable-http}  Transport protocol (default: stdio)
    --host HOST                              HTTP server host (default: 127.0.0.1)
    --port PORT                              HTTP server port (default: 8000)
    --cdb-path PATH                          Custom path to cdb.exe
    --symbols-path PATH                      Custom symbols path
    --timeout SECONDS                        Command timeout (default: 30)
    --verbose                                Enable verbose output
    
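    The options above can be mirrored with a small `argparse` sketch. This is an assumed shape for illustration, showing how the listed defaults fit together; mcp-windbg's own parser may differ in detail.

    ```python
    import argparse

    # Sketch of an argument parser matching the options listed above
    # (illustrative only; not mcp-windbg's actual parser).
    def build_parser():
        p = argparse.ArgumentParser(prog="mcp-windbg")
        p.add_argument("--transport", choices=["stdio", "streamable-http"],
                       default="stdio", help="Transport protocol")
        p.add_argument("--host", default="127.0.0.1", help="HTTP server host")
        p.add_argument("--port", type=int, default=8000, help="HTTP server port")
        p.add_argument("--cdb-path", help="Custom path to cdb.exe")
        p.add_argument("--symbols-path", help="Custom symbols path")
        p.add_argument("--timeout", type=int, default=30,
                       help="Command timeout in seconds")
        p.add_argument("--verbose", action="store_true",
                       help="Enable verbose output")
        return p

    # With no arguments, the defaults apply: stdio transport on 127.0.0.1:8000.
    args = build_parser().parse_args([])
    ```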

    Configuration for Visual Studio Code

    To make MCP servers available in all your workspaces, use the global user configuration:

    1. Press F1, type > and select MCP: Open User Configuration.
    2. Paste the following JSON snippet into your user configuration:

    {
        "servers": {
            "mcp_windbg": {
                "type": "stdio",
                "command": "python",
                "args": ["-m", "mcp_windbg"],
                "env": {
                    "_NT_SYMBOL_PATH": "SRV*C:\\Symbols*https://msdl.microsoft.com/download/symbols"
                }
            }
        }
    }
    

    This enables MCP Windbg in any workspace, without needing a local .vscode/mcp.json file.
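    The `_NT_SYMBOL_PATH` value above follows the standard `SRV*<local cache>*<upstream server>` syntax: dumps are resolved against the Microsoft public symbol server and cached locally. A small sketch of how such an entry decomposes (the helper is illustrative, not part of mcp-windbg):

    ```python
    def split_symbol_server_entry(entry):
        """Split an _NT_SYMBOL_PATH element of the form
        'SRV*<local cache>*<upstream URL>' into its parts.
        Illustrative helper only."""
        kind, cache, server = entry.split("*", 2)
        assert kind.upper() == "SRV"
        return cache, server

    cache, server = split_symbol_server_entry(
        "SRV*C:\\Symbols*https://msdl.microsoft.com/download/symbols"
    )
    # cache  -> the local download cache directory
    # server -> the Microsoft public symbol server URL
    ```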

    HTTP Transport Configuration

    For scenarios where you need to run the MCP server separately (e.g., remote access, shared server, or debugging the server itself), you can use the HTTP transport:

    1. Start the server manually:

    python -m mcp_windbg --transport streamable-http --host 127.0.0.1 --port 8000
    

    2. Configure VS Code to connect via HTTP:

    {
        "servers": {
            "mcp_windbg_http": {
                "type": "http",
                "url": "http://localhost:8000/mcp"
            }
        }
    }
    

    > Workspace-specific and alternative configuration: See Installation documentation for details on configuring Claude Desktop, Cline, and other clients, or for workspace-only setup.

    Once configured, restart your MCP client and start debugging:

    Analyze the crash dump at C:\dumps\app.dmp
    

    MCP Compatibility

    This server implements the Model Context Protocol (MCP), making it compatible with any MCP-enabled client.

    The beauty of MCP is that you write the server once, and it works everywhere. Choose your favorite AI assistant!

    Tools

    | Tool | Purpose | Use Case |
    |------|---------|----------|
    | list_windbg_dumps | List crash dump files | Discovery and batch analysis |
    | open_windbg_dump | Analyze crash
