About
Intelligent Architecture Recommendation Engine is an automated infrastructure planning tool that generates tailored system architecture recommendations based on business parameters. The tool analyzes key metrics such as QPS, concurrent user load, database requirements, and AI model specifications to produce complete deployment blueprints.

Key features:
- Parameter-driven architecture generation using inputs like QPS, concurrent users, daily active users, business type (web/AI), database type (relational/NoSQL/analytics), and AI model size (small/medium/large)
- Automated resource allocation recommendations covering CPU, memory, network configuration, and GPU inference clusters
- Middleware selection guidance, including Redis cache sizing, eviction strategies, and message queue configurations
- Architecture pattern recommendations, such as microservices vs. monolith and distributed system design
- Deployment strategy suggestions with cloud provider recommendations
- Exportable deliverables, including Markdown reports and Mermaid architecture diagrams
README
mcp-system-infra
🚀 Intelligent Architecture Recommendation Engine: Tailored for Your System
In today's rapidly evolving digital landscape, how can you quickly and efficiently build a scalable and reliable technical infrastructure? The Intelligent Architecture Recommendation Engine is here to solve that challenge.
Based on key parameters such as queries per second (QPS), concurrent users, daily active users, business type, database choice, and AI model size, this tool automatically generates a complete deployment blueprint: resource allocation, middleware configuration, architecture patterns, a deployment strategy, and an exportable report with an architecture diagram.
---
✨ Key Benefits
✅ Fully Parameter-Driven, Business-Oriented
Simply provide the following parameters:
- `--qps`: Peak request throughput
- `--concurrentUsers`: Number of concurrent connections
- `--uad`: Daily Active Users (UAD)
- `--type`: Business type (web / ai)
- `--db`: Database type (relational / nosql / analytics)
- `--model`: AI model size (small / medium / large)

The system will automatically assess and recommend resource allocation (CPU, memory, network, GPU), middleware configuration (Redis cache and message queues), an architecture pattern, and a deployment strategy.
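The recommendation step can be pictured as a mapping from these parameters to sizing decisions. Below is a minimal illustrative sketch of that idea; the thresholds, node counts, and field names are invented for illustration and are not the tool's actual rules.

```javascript
// Illustrative parameter-driven sizing. All thresholds and outputs here
// are hypothetical, not the engine's real recommendation logic.
function recommendArchitecture({ qps, concurrentUsers, type, db, model }) {
  const rec = {};

  // Scale service nodes with peak QPS (assume ~1000 QPS per node).
  rec.serviceNodes = Math.max(2, Math.ceil(qps / 1000));

  // Size the Redis cache roughly with concurrent connections.
  rec.redisCacheGB = Math.max(1, Math.ceil(concurrentUsers / 10000) * 2);
  rec.redisEviction = "allkeys-lru";

  // AI workloads get a GPU inference cluster sized by model class.
  if (type === "ai") {
    rec.gpuNodes = { small: 1, medium: 2, large: 4 }[model] ?? 1;
  }

  // Pick an architecture pattern from overall load.
  rec.pattern =
    qps > 5000 || concurrentUsers > 50000 ? "microservices" : "monolith";

  rec.database = db;
  return rec;
}

const rec = recommendArchitecture({
  qps: 8000,
  concurrentUsers: 20000,
  type: "ai",
  db: "relational",
  model: "medium",
});
console.log(rec.serviceNodes, rec.gpuNodes, rec.pattern);
```

The point of the sketch is only that every recommendation is a deterministic function of the six input parameters, which is what makes the engine fully parameter-driven.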
---
🗺️ Architecture Recommendation Diagram
The system automatically generates a Mermaid diagram to clearly represent component relationships:
~~~mermaid
flowchart TD
    User[User Request] --> Nginx[Nginx Load Balancer]
    Nginx --> Service[Main Business Service Node]
    Service --> DB[Database]
    Service --> Redis[Redis Cache]
    Service --> MQ[Message Queue]
    Service --> GPU[AI Inference GPU Node]
    MQ --> Consumer[Asynchronous Consumer]
~~~
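A diagram like this is just a plain string, so it can be assembled from an edge list. The sketch below mirrors the diagram above; the `toMermaid` helper is illustrative, not the tool's actual exporter.

```javascript
// Build a Mermaid flowchart string from an edge list of
// [fromId, fromLabel, toId, toLabel] tuples. Illustrative helper only.
function toMermaid(edges) {
  const lines = edges.map(
    ([from, fromLabel, to, toLabel]) =>
      `    ${from}[${fromLabel}] --> ${to}[${toLabel}]`
  );
  return ["flowchart TD", ...lines].join("\n");
}

const diagram = toMermaid([
  ["User", "User Request", "Nginx", "Nginx Load Balancer"],
  ["Nginx", "Nginx Load Balancer", "Service", "Main Business Service Node"],
  ["Service", "Main Business Service Node", "DB", "Database"],
  ["Service", "Main Business Service Node", "Redis", "Redis Cache"],
  ["Service", "Main Business Service Node", "MQ", "Message Queue"],
  ["Service", "Main Business Service Node", "GPU", "AI Inference GPU Node"],
  ["MQ", "Message Queue", "Consumer", "Asynchronous Consumer"],
]);
console.log(diagram);
```

Because the output is plain Mermaid text, it can be pasted directly into a Markdown report, which is how the exportable deliverables stay easy to version and review.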
▶️ Quick Start
CLI
~~~bash
npx -y mcp-system-infra
~~~

MCP Server Configuration
~~~json
{
  "mcpServers": {
    "mcp-system-infra": {
      "command": "npx",
      "args": ["-y", "mcp-system-infra"]
    }
  }
}
~~~
MCP Example:
Please help design a web-based system architecture report for a given set of specifications (for example, peak QPS, concurrent users, daily active users, database type, and AI model size).
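A prompt of this kind might spell out its parameters along these lines (all values below are illustrative examples, not defaults of the tool):

```json
{
  "qps": 3000,
  "concurrentUsers": 10000,
  "uad": 200000,
  "type": "web",
  "db": "relational",
  "model": "small"
}
```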
---
💭 Murmurs
This project is for educational and internal use only. Contributions and feedback are welcome. For feature customization, web deployment, or enterprise integration, please contact the project maintainer.
Contact
Business Contact Email: deeppathai@outlook.com
---
🧠 MCP Access Addresses
- Use mcp-system-infra directly within the ModelScope platform.
- Access the mcp-system-infra service via Smithery.