
Context Portal MCP Server

by GreatScottyMac


About

Context Portal (ConPort) is a Model Context Protocol (MCP) server that functions as a project memory bank, building a structured knowledge graph to power Retrieval Augmented Generation (RAG) for AI assistants in development environments.

Key features of Context Portal:

  • Project-specific knowledge graph construction capturing decisions, progress, architecture patterns, and their relationships.
  • Vector embeddings and semantic search capabilities for intelligent context retrieval.
  • SQLite-backed storage with one database per workspace for reliable, queryable context management.
  • Multi-workspace support via workspace_id, compatible with various MCP-enabled IDEs and tools.
  • STDIO-based deployment for tight integration with AI coding assistants such as Roo Code, Cline, Windsurf, and Cursor.

README

Context Portal MCP (ConPort)

(It's a memory bank!)


A database-backed Model Context Protocol (MCP) server for managing structured project context, designed to be used by AI assistants and developer tools within IDEs and other interfaces.

What is Context Portal MCP server (ConPort)?

Context Portal (ConPort) is your project's memory bank. It's a tool that helps AI assistants understand your specific software project better by storing important information like decisions, tasks, and architectural patterns in a structured way. Think of it as building a project-specific knowledge base that the AI can easily access and use to give you more accurate and helpful responses.

What it does:

  • Keeps track of project decisions, progress, and system designs.
  • Stores custom project data (like glossaries or specs).
  • Helps AI find relevant project information quickly (like a smart search).
  • Enables AI to use project context for better responses (RAG).
  • Manages, searches, and updates context more efficiently than simple text-file-based memory banks.
    ConPort provides a robust and structured way for AI assistants to store, retrieve, and manage various types of project context. It effectively builds a project-specific knowledge graph, capturing entities like decisions, progress, and architecture, along with their relationships. This structured knowledge base, enhanced by vector embeddings for semantic search, then serves as a powerful backend for Retrieval Augmented Generation (RAG), enabling AI assistants to access precise, up-to-date information for more context-aware and accurate responses.

    It replaces older file-based context management systems by offering a more reliable and queryable database backend (one SQLite database per workspace). ConPort is designed to be a generic context backend, compatible with various IDEs and client interfaces that support MCP.
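To see why a database backend beats flat text files, consider a minimal sketch of a per-workspace SQLite context store. The schema and function below are purely illustrative and hypothetical, not ConPort's actual implementation:

```python
import sqlite3
import tempfile

# Hypothetical sketch: one SQLite database per workspace, holding structured
# context entries (here, just "decisions") that stay queryable over time.
def open_context_db(workspace_path: str) -> sqlite3.Connection:
    conn = sqlite3.connect(f"{workspace_path}/context.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS decisions ("
        "id INTEGER PRIMARY KEY, summary TEXT NOT NULL, rationale TEXT)"
    )
    return conn

workspace = tempfile.mkdtemp()  # stands in for the real workspace folder
conn = open_context_db(workspace)
conn.execute(
    "INSERT INTO decisions (summary, rationale) VALUES (?, ?)",
    ("Use SQLite per workspace", "Reliable and queryable"),
)
rows = conn.execute("SELECT summary FROM decisions").fetchall()
```

Because each workspace gets its own database file, context never leaks between projects, and ordinary SQL can answer questions that a flat text-file memory bank cannot.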

    Key features include:

  • Structured context storage using SQLite (one DB per workspace, automatically created).
  • MCP server (context_portal_mcp) built with Python/FastAPI.
  • A comprehensive suite of defined MCP tools for interaction (see "Available ConPort Tools" below).
  • Multi-workspace support via workspace_id.
  • Primary deployment mode: STDIO for tight IDE integration.
  • Enables building a dynamic project knowledge graph with explicit relationships between context items.
  • Includes vector data storage and semantic search capabilities to power advanced RAG.
  • Serves as an ideal backend for Retrieval Augmented Generation (RAG), providing AI with precise, queryable project memory.
  • Provides structured context that AI assistants can leverage for prompt caching with compatible LLM providers.
  • Manages database schema evolution using Alembic migrations, ensuring seamless updates and data integrity.
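In STDIO mode the server speaks newline-delimited JSON-RPC 2.0, as defined by the MCP STDIO transport. The sketch below shows the shape of the first messages a client sends over the server's stdin; the protocol version and client info values are illustrative, not ConPort-specific:

```python
import json

def mcp_request(req_id, method, params=None):
    """Frame one MCP request as a newline-delimited JSON-RPC 2.0 message."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# The client opens the session, then asks which tools the server exposes
# (ConPort would answer with its suite of context-management tools).
init = mcp_request(1, "initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"},
})
list_tools = mcp_request(2, "tools/list")
```

In practice the IDE's MCP client handles this handshake for you; the point is that any MCP-capable client can drive the server the same way.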
    Prerequisites

    Before you begin, ensure you have the following installed:

  • Python: Version 3.8 or higher is recommended. Download Python and ensure Python is added to your system's PATH during installation (especially on Windows).
  • uv: (Highly Recommended) A fast Python environment and package manager. Using uv significantly simplifies virtual environment creation and dependency installation. Install uv before proceeding.

    Installation and Configuration (Recommended)

    The recommended way to install and run ConPort is by using uvx to execute the package directly from PyPI. This method avoids the need to manually create and manage virtual environments.

    uvx Configuration (Recommended for most IDEs)

    In your MCP client settings (e.g., mcp_settings.json), use the following configuration:

    {
      "mcpServers": {
        "conport": {
          "command": "uvx",
          "args": [
            "--from",
            "context-portal-mcp",
            "conport-mcp",
            "--mode",
            "stdio",
            "--workspace_id",
            "${workspaceFolder}",
            "--log-file",
            "./logs/conport.log",
            "--log-level",
            "INFO"
          ]
        }
      }
    }
    

  • command: uvx handles the environment for you.
  • args: Contains the arguments to run the ConPort server.
  • ${workspaceFolder}: This IDE variable is used to automatically provide the absolute path of the current project workspace.
  • --log-file: Optional: Path to a file where server logs will be written. If not provided, logs are directed to stderr (console). Useful for persistent logging and debugging server behavior.
  • --log-level: Optional: Sets the minimum logging level for server log messages (e.g., DEBUG, INFO, WARNING, ERROR).
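For troubleshooting outside the IDE, the same invocation the IDE issues can be run directly in a terminal. Substitute your project's absolute path for the placeholder, since ${workspaceFolder} is only expanded by the IDE:

```shell
# Launch the ConPort server manually; it will wait for MCP JSON-RPC on stdin.
uvx --from context-portal-mcp conport-mcp \
  --mode stdio \
  --workspace_id "/absolute/path/to/your/project" \
  --log-file ./logs/conport.log \
  --log-level DEBUG
```

Raising the log level to DEBUG here makes the log file a useful first stop when the server fails to start from your IDE configuration.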