Natural Context Provider

by portel-dev

About

Natural Context Provider (NCP) aggregates multiple MCP servers into a unified interface, allowing AI assistants to discover and execute tools using natural language queries rather than browsing through dozens of individual tools. It reduces complexity by exposing just 2-3 simple tools to the AI while managing 50+ underlying capabilities behind the scenes.

Key features:

  • Natural language search across all connected MCP servers to find the right tool instantly
  • Code mode execution for multi-step TypeScript workflows that combine multiple tools
  • Skills system providing domain expertise for canvas design, PDF manipulation, and document generation
  • Photons for creating custom TypeScript MCPs without publishing to npm
  • Task scheduling and response caching
  • Project-level configuration for automatic MCP definitions per project
  • Significant token optimization (97% reduction) and faster response times by eliminating tool-selection overhead

README


NCP - Natural Context Provider

> 1 MCP to rule them all

Your MCPs, supercharged. Find any tool instantly, execute with code mode, run on schedule, discover skills, load Photons, ready for any client. Smart loading saves tokens and energy.

💍 What is NCP?

Instead of your AI juggling 50+ tools scattered across different MCPs, NCP gives it a single, unified interface with code mode execution, scheduling, skills discovery, and custom Photons.

Your AI sees just 2-3 simple tools:

  • find - Search for any tool, skill, or Photon: "I need to read a file" → finds the right tool automatically
  • code - Execute TypeScript directly: await github.create_issue({...}) (code mode, enabled by default)
  • run - Execute tools individually (when code mode is disabled)
Behind the scenes, NCP manages all 50+ tools, skills, and Photons: routing requests, discovering the right capability, executing code, scheduling tasks, managing health, and caching responses.

Why this matters:

  • Your AI stops analyzing "which tool do I use?" and starts doing actual work
  • Code mode lets AI write multi-step TypeScript workflows combining tools, skills, and scheduling
  • Skills provide domain expertise: canvas design, PDF manipulation, document generation, more
  • Photons enable custom TypeScript MCPs without npm publishing
  • 97% fewer tokens burned on tool confusion (2,500 vs 103,000 for 80 tools)
  • 5x faster responses (sub-second tool selection vs 5-8 seconds)
  • Your AI becomes focused. Not desperate.
  • 🚀 NEW: Project-level configuration - each project can define its own MCPs automatically
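As a rough sketch of what a code-mode workflow could look like: the `fs` and `github` tool namespaces and their argument shapes below are assumptions stubbed out so the snippet runs standalone, not NCP's actual injected API.

```typescript
// Hypothetical sketch of a code-mode workflow. The `fs` and `github`
// namespaces are stubs standing in for tools NCP would inject; their
// names and argument shapes are assumptions, not NCP's actual API.

type Issue = { number: number; title: string };

const fs = {
  // Stub: a real MCP filesystem tool would read from disk.
  async read_file(args: { path: string }): Promise<string> {
    return "TODO: fix the flaky integration test";
  },
};

const github = {
  // Stub: a real MCP GitHub tool would call the GitHub API.
  async create_issue(args: { title: string; body: string }): Promise<Issue> {
    return { number: 42, title: args.title };
  },
};

// One code-mode call chaining two tools: read a note, file it as an issue.
async function fileTodoAsIssue(path: string): Promise<Issue> {
  const note = await fs.read_file({ path });
  return github.create_issue({ title: "Tracked TODO", body: note });
}

fileTodoAsIssue("notes.txt").then((issue) => {
  console.log(`Created issue #${issue.number}: ${issue.title}`);
});
```

The point of the pattern is that intermediate results stay in code rather than round-tripping through the model, which is where the multi-step savings come from.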

> What's MCP? The Model Context Protocol by Anthropic lets AI assistants connect to external tools and data sources. Think of MCPs as "plugins" that give your AI superpowers like file access, web search, databases, and more.

---

📑 Quick Navigation

  • The Problem - Why too many tools break your AI
  • The Solution - How NCP transforms your experience
  • Getting Started - Installation & quick start
  • Try It Out - See the CLI in action
  • Supercharged Features - How NCP empowers your MCPs
  • Setup by Client - Claude Desktop, Cursor, VS Code, etc.
  • Popular MCPs - Community favorites to add
  • Advanced Features - Project config, scheduling, remote MCPs
  • Troubleshooting - Common issues & solutions
  • How It Works - Technical deep dive
  • Contributing - Help us improve NCP
---

😤 The MCP Paradox: From Assistant to Desperate

You gave your AI assistant 50 tools to be more capable. Instead, you got desperation:

  • Paralyzed by choice ("Should I use read_file or get_file_content?")
  • Exhausted before starting ("I've spent my context limit analyzing which tool to use")
  • Costs explode (50+ tool schemas burn tokens before any real work happens)
  • Asks instead of acts (used to be decisive, now constantly asks for clarification)
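The cost claim is simple arithmetic, using the figures this README itself cites (2,500 tokens for NCP's interface vs 103,000 tokens for ~80 full tool schemas):

```typescript
// Token overhead: exposing ~80 tool schemas up front vs NCP's small
// unified interface. Both figures are taken from this README's claim.
const fullSchemas = 103_000; // tokens to expose ~80 tool schemas directly
const ncpInterface = 2_500;  // tokens for NCP's 2-3 tool interface
const reduction = 1 - ncpInterface / fullSchemas;
console.log(`${(reduction * 100).toFixed(1)}% fewer tokens`); // ~97.6%
```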
---

🧸 Why Too Many Tools Break the System

Think about it like this:

A child with one toy → Treasures it, masters it, creates endless games with it
A child with 50 toys → Can't hold them all, gets overwhelmed, stops playing entirely

Your AI is that child. MCPs are the toys. More isn't always better.

The most creative people thrive with constraints, not infinite options. A poet given "write about anything" faces writer's block. Given "write a haiku about rain"? Instant inspiration.

Your AI is the same. Give it one perfect tool → Instant action. Give it 50 tools → Cognitive overload. NCP provides just-in-time tool discovery, so your AI gets one perfect tool at the moment it needs it.
