About
hyper-mcp is a WebAssembly-based MCP server that enables secure, extensible plugin execution for AI applications. It connects to Claude Desktop, Cursor IDE, and other MCP-compatible clients while allowing developers to extend functionality with plugins written in any language that compiles to WebAssembly.

Key features of hyper-mcp:

- Multi-language plugin development: build plugins in Rust, Go, C, or any language targeting WASM
- OCI registry distribution: publish and load plugins from standard container registries like Docker Hub and GitHub Container Registry, complete with Sigstore signing and verification
- Sandboxed execution: memory-safe runtime with fine-grained permissions controlling network, filesystem, and resource access
- Full protocol support: compatible with the stdio, SSE, and streamable-http MCP transports
- Lightweight deployment: runs efficiently in resource-constrained environments, from serverless functions to IoT devices and mobile platforms
- Cross-platform: native support for Linux, macOS, and Windows
README
[crates.io](https://crates.io/crates/hyper-mcp) [license](#license) [issues](https://github.com/tuananh/hyper-mcp/issues)
hyper-mcp
⚠️ NOTICE: PROJECT TRANSFERRED
This project has been transferred to https://github.com/joseph-wortmann/hyper-mcp
Please refer to the new repository for the latest updates and contributions.
A fast, secure MCP server that extends its capabilities through WebAssembly plugins.
What is it?
hyper-mcp makes it easy to add AI capabilities to your applications. It works with Claude Desktop, Cursor IDE, and other MCP-compatible apps. Write plugins in your favorite language, distribute them through container registries, and run them anywhere - from cloud to edge.
Features
- Write plugins in any language that compiles to WebAssembly (Rust, Go, C, and more)
- Distribute and load plugins through OCI-compliant registries
- Sandboxed, memory-safe execution with fine-grained permissions
- Supports the stdio, SSE, and streamable-http transports
- Lightweight enough for serverless, IoT, and edge deployments

Security
Built with a security-first mindset:

- Each plugin runs inside a sandboxed WebAssembly runtime
- Fine-grained permissions control network, filesystem, and resource access (e.g. allowed_hosts, memory_limit)
- Plugin images distributed through OCI registries can be signed and verified with Sigstore
Getting Started
1. Create your config file:
- Linux: $HOME/.config/hyper-mcp/config.json
- Windows: {FOLDERID_RoamingAppData}\hyper-mcp\config.json. E.g. C:\Users\Alice\AppData\Roaming\hyper-mcp\config.json
- macOS: $HOME/Library/Application Support/hyper-mcp/config.json
{
"plugins": {
"time": {
"url": "oci://ghcr.io/tuananh/time-plugin:latest"
},
"qr_code": {
"url": "oci://ghcr.io/tuananh/qrcode-plugin:latest"
},
"hash": {
"url": "oci://ghcr.io/tuananh/hash-plugin:latest"
},
"myip": {
"url": "oci://ghcr.io/tuananh/myip-plugin:latest",
"runtime_config": {
"allowed_hosts": ["1.1.1.1"]
}
},
"fetch": {
"url": "oci://ghcr.io/tuananh/fetch-plugin:latest",
"runtime_config": {
"allowed_hosts": ["*"],
"memory_limit": "100 MB"
}
}
}
}
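Plugins are not limited to OCI registries. As a sketch, a locally built plugin could be referenced with a file:// URL instead (the plugin name and path here are hypothetical):

```json
{
  "plugins": {
    "my_plugin": {
      "url": "file:///home/alice/plugins/my_plugin.wasm"
    }
  }
}
```

This is handy while developing a plugin, before publishing it to a registry.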
> 📖 For detailed configuration options including authentication setup, runtime configuration, and advanced features, see RUNTIME_CONFIG.md
Supported URL schemes:
- oci:// - for OCI-compliant registries (like Docker Hub, GitHub Container Registry, etc.)
- file:// - for local files
- http:// or https:// - for remote files
- s3:// - for Amazon S3 objects (requires that you have your AWS credentials set up in the environment)

2. Start the server:
$ hyper-mcp
By default, hyper-mcp uses the stdio transport. To use SSE, pass the flag --transport sse; for streamable HTTP, pass --transport streamable-http. Logging verbosity is controlled with the RUST_LOG environment variable, e.g. RUST_LOG=info. To skip signature verification of plugin images, set the insecure_skip_signature flag or the environment variable HYPER_MCP_INSECURE_SKIP_SIGNATURE to true.

Using with Cursor IDE
You can configure hyper-mcp either globally for all projects or specifically for individual projects.
1. For project-scope configuration, create .cursor/mcp.json in your project root:
{
"mcpServers": {
"hyper-mcp": {
"command": "/path/to/hyper-mcp"
}
}
}
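Command-line flags such as --transport can be passed through the standard args field of Cursor's MCP server config — a minimal sketch, here selecting the default stdio transport explicitly:

```json
{
  "mcpServers": {
    "hyper-mcp": {
      "command": "/path/to/hyper-mcp",
      "args": ["--transport", "stdio"]
    }
  }
}
```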
2. Set up hyper-mcp in Cursor's settings:
3. Start using tools through chat:
Available Plugins
We maintain several example plugins to get you started:
V1 Plugins
These plugins use the v1 plugin interface. While still supported, new plugins should use the v2 interface.
Related MCP Servers
AI Research Assistant
hamid-vakilzadeh
AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key. Key features include:

- Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
- Title matching with confidence scoring for finding specific papers
- Batch operations supporting up to 500 papers per request
- Citation analysis and network exploration for understanding research relationships
- Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
- Rate limits of 100 requests per 5 minutes, with options to request higher limits through Semantic Scholar
Linkup
LinkupPlatform
Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks. Key features of Linkup:

- Real-time web search using natural language queries to find current information, news, and data
- Page fetching to extract and read content from any webpage URL
- Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
- Source-backed results with citations and context from relevant, trustworthy websites
- JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages
Math-MCP
EthanHenrickson
Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning. Key features of Math-MCP:

- Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
- Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
- Rounding utilities: floor, ceiling, and nearest integer rounding
- Trigonometric functions: sine, cosine, tangent, and their inverses, with degrees and radians conversion support