About
OpenWebSearch is a free multi-engine web search MCP server that requires no API keys. It aggregates search results from multiple sources including Bing, Baidu, DuckDuckGo, Brave, and Exa, plus specialized developer platforms like GitHub, Juejin, and CSDN.

Key features of OpenWebSearch:
- Multi-engine web search without API keys or authentication
- Support for major search engines: Bing, Baidu, DuckDuckGo, Brave, and Exa
- Specialized search for developer content on GitHub, Juejin (Chinese tech community), and CSDN
- Structured search results with titles, URLs, and descriptions
- Full content fetching for CSDN articles and GitHub README files
- HTTP proxy configuration for accessing restricted resources
- Configurable default search engine and result count
- CORS support for web-based integrations
Tools (5)
search: Search the web using multiple engines (e.g., Baidu, Bing, DuckDuckGo, CSDN, Exa, Brave, Juejin (掘金)) with no API key required (see the example request after this list)
fetchLinuxDoArticle: Fetch full article content from a linux.do post URL
fetchCsdnArticle: Fetch full article content from a CSDN post URL
fetchGithubReadme: Fetch README content from a GitHub repository URL
fetchJuejinArticle: Fetch full article content from a Juejin (掘金) post URL
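An MCP client invokes these tools through the protocol's standard `tools/call` request. The sketch below shows roughly what a call to the `search` tool could look like over JSON-RPC; the argument names (`query`, `engines`, `limit`) are illustrative assumptions and may not match the server's actual input schema, which a client can discover via `tools/list`.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {
      "query": "model context protocol",
      "engines": ["bing", "duckduckgo"],
      "limit": 5
    }
  }
}
```

According to the feature list above, results come back as structured entries with titles, URLs, and descriptions.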
README
Open-WebSearch MCP Server
[ModelScope](https://www.modelscope.cn/mcp/servers/Aasee1/open-webSearch) | [Archestra](https://archestra.ai/mcp-catalog/aas-ee__open-websearch) | [Smithery](https://smithery.ai/server/@Aas-ee/open-websearch)
🇨🇳 中文 | 🇺🇸 English
A Model Context Protocol (MCP) server that aggregates results from multiple search engines, providing free web search without API keys.
Features
TODO
Installation Guide
NPX Quick Start (Recommended)
The fastest way to get started:
```bash
# Basic usage
npx open-websearch@latest

# With environment variables (Linux/macOS)
DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true npx open-websearch@latest

# Windows PowerShell
$env:DEFAULT_SEARCH_ENGINE="duckduckgo"; $env:ENABLE_CORS="true"; npx open-websearch@latest

# Windows CMD
set MODE=stdio && set DEFAULT_SEARCH_ENGINE=duckduckgo && npx open-websearch@latest

# Cross-platform (requires cross-env; used for local development)
npm install -g open-websearch
npx cross-env DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true open-websearch
```
Environment Variables:
| Variable | Default | Options | Description |
|----------|-------------------------|---------|-------------|
| ENABLE_CORS | false | true, false | Enable CORS |
| CORS_ORIGIN | * | Any valid origin | CORS origin configuration |
| DEFAULT_SEARCH_ENGINE | bing | bing, duckduckgo, exa, brave, baidu, csdn, juejin | Default search engine |
| USE_PROXY | false | true, false | Enable HTTP proxy |
| PROXY_URL | http://127.0.0.1:7890 | Any valid URL | Proxy server URL |
| MODE | both | both, http, stdio | Server mode: both HTTP+STDIO, HTTP only, or STDIO only |
| PORT | 3000 | 1-65535 | Server port |
| ALLOWED_SEARCH_ENGINES | empty (all available) | Comma-separated engine names | Limit which search engines can be used; if the default engine is not in this list, the first allowed engine becomes the default (see the example after this table) |
| MCP_TOOL_SEARCH_NAME | search | Valid MCP tool name | Custom name for the search tool |
| MCP_TOOL_FETCH_LINUXDO_NAME | fetchLinuxDoArticle | Valid MCP tool name | Custom name for the Linux.do article fetch tool |
| MCP_TOOL_FETCH_CSDN_NAME | fetchCsdnArticle | Valid MCP tool name | Custom name for the CSDN article fetch tool |
| MCP_TOOL_FETCH_GITHUB_NAME | fetchGithubReadme | Valid MCP tool name | Custom name for the GitHub README fetch tool |
| MCP_TOOL_FETCH_JUEJIN_NAME | fetchJuejinArticle | Valid MCP tool name | Custom name for the Juejin article fetch tool |
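As a concrete illustration of the `ALLOWED_SEARCH_ENGINES` and `MCP_TOOL_*` behavior described in the table, the following is a sketch with values chosen for illustration rather than taken from the project docs:

```bash
# Restrict search to DuckDuckGo and Brave and rename the search tool.
# Because bing (the normal default) is not in the allowed list,
# duckduckgo (the first allowed engine) becomes the default engine.
ALLOWED_SEARCH_ENGINES=duckduckgo,brave \
MCP_TOOL_SEARCH_NAME=webSearch \
npx open-websearch@latest
```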
Common configurations:
```bash
# Enable proxy for restricted regions
USE_PROXY=true PROXY_URL=http://127.0.0.1:7890 npx open-websearch@latest

# Full configuration
DEFAULT_SEARCH_ENGINE=duckduckgo ENABLE_CORS=true USE_PROXY=true PROXY_URL=http://127.0.0.1:7890 PORT=8080 npx open-websearch@latest
```
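When the server runs in `http` or `both` mode it exposes an MCP endpoint over Streamable HTTP on the configured port (`http://localhost:3000/mcp` in the client configuration shown below). As a rough smoke test, assuming the standard MCP Streamable HTTP transport rather than any documented health-check for this server, you can POST an `initialize` request:

```bash
# Send an MCP initialize request to the local endpoint and print the response.
curl -s -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl-check","version":"0.0.0"}}}'
```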
Local Installation
1. Clone or download this repository.
2. Install dependencies:
```bash
npm install
```
3. Build the server:
```bash
npm run build
```
4. Add the server to your MCP configuration:

Cherry Studio:
```json
{
  "mcpServers": {
    "web-search": {
      "name": "Web Search MCP",
      "type": "streamableHttp",
      "description": "Multi-engine web search with article fetching",
      "isActive": true,
      "baseUrl": "http://localhost:3000/mcp"
    }
  }
}
```
VSCode (Claude Dev Extension):
```json
{ "mcpServers": { "web-search": { "transport": { "type"
```
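For clients that launch MCP servers over STDIO instead of connecting over HTTP (Claude Desktop and similar), a configuration along these lines should work. This is a sketch assuming the common `command`/`args`/`env` layout; exact key names vary by client, and the environment variables come from the table above.

```json
{
  "mcpServers": {
    "web-search": {
      "command": "npx",
      "args": ["open-websearch@latest"],
      "env": {
        "MODE": "stdio",
        "DEFAULT_SEARCH_ENGINE": "duckduckgo"
      }
    }
  }
}
```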