Price Per Token

Qdrant MCP Server

by qdrant

About

Qdrant MCP Server is the official Model Context Protocol implementation for Qdrant, enabling LLM applications to store and retrieve semantic memories through a vector search engine. It transforms unstructured information into vector embeddings for intelligent, meaning-based search and recall.

Key features:

- Store information as vector embeddings with optional JSON metadata
- Semantic similarity search to retrieve contextually relevant memories based on meaning
- Support for both Qdrant Cloud (via URL and API key) and local Qdrant deployments (via local path)
- Configurable embedding providers for automatic text vectorization
- Collection management with support for default and custom collection names
- Persistent semantic memory layer that maintains context across conversations

README

mcp-server-qdrant: A Qdrant MCP server

[smithery.ai/protocol/mcp-server-qdrant](https://smithery.ai/protocol/mcp-server-qdrant)

> The Model Context Protocol (MCP) is an open protocol that enables
> seamless integration between LLM applications and external data sources and tools. Whether you're building an
> AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to
> connect LLMs with the context they need.

This repository is an example of how to create an MCP server for Qdrant, a vector search engine.

Overview

An official Model Context Protocol server for keeping and retrieving memories in the Qdrant vector search engine. It acts as a semantic memory layer on top of the Qdrant database.

Components

Tools

1. qdrant-store - Store some information in the Qdrant database
   - Input:
     - information (string): Information to store
     - metadata (JSON): Optional metadata to store
     - collection_name (string): Name of the collection to store the information in. Required if no default collection name is configured; not available when a default collection name is set.
   - Returns: Confirmation message
2. qdrant-find - Retrieve relevant information from the Qdrant database
   - Input:
     - query (string): Query to use for searching
     - collection_name (string): Name of the collection to search in. Required if no default collection name is configured; not available when a default collection name is set.
   - Returns: Information stored in the Qdrant database, as separate messages
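From an MCP client's perspective, both tools are invoked through the protocol's standard tools/call request. The sketch below builds such payloads in Python; the collection name, request IDs, and example strings are made up for illustration, and the helper is not part of this server's code:

```python
import json

def tool_call(request_id, name, arguments):
    """Build a JSON-RPC 2.0 `tools/call` request as used by MCP clients."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Store a memory with optional metadata (collection_name is only needed
# when no default collection is configured on the server).
store = tool_call(1, "qdrant-store", {
    "information": "Our API rate limit is 100 requests per minute.",
    "metadata": {"topic": "api-limits"},
    "collection_name": "team-notes",
})

# Retrieve semantically similar memories with a natural-language query.
find = tool_call(2, "qdrant-find", {
    "query": "How many requests can a client make?",
    "collection_name": "team-notes",
})

print(json.dumps(store, indent=2))
```

Note that qdrant-find matches on meaning, so the query does not need to share any keywords with the stored text.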

Environment Variables

The configuration of the server is done using environment variables:

| Name | Description | Default Value |
|------|-------------|---------------|
| QDRANT_URL | URL of the Qdrant server | None |
| QDRANT_API_KEY | API key for the Qdrant server | None |
| COLLECTION_NAME | Name of the default collection to use | None |
| QDRANT_LOCAL_PATH | Path to the local Qdrant database (alternative to QDRANT_URL) | None |
| EMBEDDING_PROVIDER | Embedding provider to use (currently only "fastembed" is supported) | fastembed |
| EMBEDDING_MODEL | Name of the embedding model to use | sentence-transformers/all-MiniLM-L6-v2 |
| TOOL_STORE_DESCRIPTION | Custom description for the store tool | See default in settings.py |
| TOOL_FIND_DESCRIPTION | Custom description for the find tool | See default in settings.py |

Note: You cannot provide both QDRANT_URL and QDRANT_LOCAL_PATH at the same time.
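The mutual-exclusivity rule above can be sketched as a small validation helper. This is illustrative only, not the server's actual startup code; the function name and return shape are assumptions:

```python
def resolve_qdrant_location(env):
    """Resolve where Qdrant lives from environment-style settings,
    enforcing that QDRANT_URL and QDRANT_LOCAL_PATH are mutually exclusive."""
    url = env.get("QDRANT_URL")
    local_path = env.get("QDRANT_LOCAL_PATH")
    if url and local_path:
        raise ValueError("Provide either QDRANT_URL or QDRANT_LOCAL_PATH, not both.")
    if url:
        # Remote deployment: an API key may accompany the URL (Qdrant Cloud).
        return ("remote", url, env.get("QDRANT_API_KEY"))
    if local_path:
        # Local on-disk deployment: no API key applies.
        return ("local", local_path, None)
    raise ValueError("One of QDRANT_URL or QDRANT_LOCAL_PATH is required.")

mode, location, api_key = resolve_qdrant_location({
    "QDRANT_URL": "https://example.cloud.qdrant.io:6333",
    "QDRANT_API_KEY": "secret-key",
})
```

In practice the server reads these values from the process environment; passing a plain dict here just keeps the sketch self-contained.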

> [!IMPORTANT]
> Command-line arguments are no longer supported. Please use environment variables for all configuration.

FastMCP Environment Variables

Since mcp-server-qdrant is based on FastMCP, it also supports all the FastMCP environment variables. The most important ones are listed below:

| Environment Variable | Description | Default Value |
|----------------------|-------------|---------------|
| FASTMCP_DEBUG | Enable debug mode | false |
| FASTMCP_LOG_LEVEL | Set logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL) | INFO |
| FASTMCP_HOST | Host address to bind the server to | 127.0.0.1 |
| FASTMCP_PORT | Port to run the server on | 8000 |
| FASTMCP_WARN_ON_DUPLICATE_RESOURCES | Show warnings for duplicate resources | true |
| FASTMCP_WARN_ON_DUPLICATE_TOOLS | Show warnings for duplicate tools | true |
| FASTMCP_WARN_ON_DUPLICATE_PROMPTS | Show warnings for duplicate prompts | true |

Related MCP Servers

AI Research Assistant

hamid-vakilzadeh

AI Research Assistant provides comprehensive access to millions of academic papers through the Semantic Scholar and arXiv databases. This MCP server enables AI coding assistants to perform intelligent literature searches, citation network analysis, and paper content extraction without requiring an API key. Key features include:

- Advanced paper search with multi-filter support by year ranges, citation thresholds, field of study, and publication type
- Title matching with confidence scoring for finding specific papers
- Batch operations supporting up to 500 papers per request
- Citation analysis and network exploration for understanding research relationships
- Full-text PDF extraction from arXiv and Wiley open-access content (Wiley TDM token required for institutional access)
- Rate limits of 100 requests per 5 minutes with options to request higher limits through Semantic Scholar

Web & Search
Linkup

LinkupPlatform

Linkup is a real-time web search and content extraction service that enables AI assistants to search the web and retrieve information from trusted sources. It provides source-backed answers with citations, making it ideal for fact-checking, news gathering, and research tasks. Key features of Linkup:

- Real-time web search using natural language queries to find current information, news, and data
- Page fetching to extract and read content from any webpage URL
- Search depth modes: Standard for direct-answer queries and Deep for complex research across multiple sources
- Source-backed results with citations and context from relevant, trustworthy websites
- JavaScript rendering support for accessing dynamic content on JavaScript-heavy pages

Web & Search
Math-MCP

EthanHenrickson

Math-MCP is a computation server that enables Large Language Models (LLMs) to perform accurate numerical calculations through the Model Context Protocol. It provides precise mathematical operations via a simple API to overcome LLM limitations in arithmetic and statistical reasoning. Key features of Math-MCP:

- Basic arithmetic operations: addition, subtraction, multiplication, division, modulo, and bulk summation
- Statistical analysis functions: mean, median, mode, minimum, and maximum calculations
- Rounding utilities: floor, ceiling, and nearest integer rounding
- Trigonometric functions: sine, cosine, tangent, and their inverses with degrees and radians conversion support

Developer Tools