Price Per Token

Mem0 MCP Server

by coleam00


About

Mem0 MCP Server provides persistent long-term memory capabilities for AI agents, enabling them to store, retrieve, and search memories across sessions using semantic search.

Key features of Mem0 MCP Server:

- **save_memory**: Store any information in long-term memory with semantic indexing for intelligent retrieval
- **get_all_memories**: Retrieve all stored memories to provide comprehensive context to AI agents
- **search_memories**: Find relevant memories using semantic search and embeddings
- Vector storage backed by PostgreSQL or Supabase for scalable memory persistence
- Support for multiple LLM providers, including OpenAI, OpenRouter, and Ollama
- Integration with embedding models (e.g., text-embedding-3-small) for semantic memory indexing

README

MCP-Mem0: Long-Term Memory for AI Agents

A template implementation of the Model Context Protocol (MCP) server integrated with Mem0 for providing AI agents with persistent memory capabilities.

Use this as a reference point to build your MCP servers yourself, or give this as an example to an AI coding assistant and tell it to follow this example for structure and code correctness!

Overview

This project demonstrates how to build an MCP server that enables AI agents to store, retrieve, and search memories using semantic search. It serves as a practical template for creating your own MCP servers, with the Mem0 integration as the working example.

The implementation follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.

Features

The server provides three essential memory management tools:

1. save_memory: Store any information in long-term memory with semantic indexing
2. get_all_memories: Retrieve all stored memories for comprehensive context
3. search_memories: Find relevant memories using semantic search
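These tools rest on semantic search: each memory is stored alongside a vector embedding, and queries are matched by vector similarity rather than exact keywords. A minimal sketch of the idea, using hand-made toy vectors (not the actual Mem0 implementation, which uses a real embedding model and PostgreSQL vector storage):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Toy "embeddings": in Mem0 these would come from a model
# such as text-embedding-3-small.
memories = {
    "User prefers Python for scripting": [0.9, 0.1, 0.0],
    "User's favorite color is blue":     [0.0, 0.2, 0.9],
}

def search_memories(query_vector, top_k=1):
    # Rank stored memories by similarity to the query embedding
    # and return the top_k closest memory texts.
    ranked = sorted(memories.items(),
                    key=lambda kv: cosine_similarity(query_vector, kv[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]

# A query vector close to the "Python" memory retrieves that memory first.
print(search_memories([1.0, 0.0, 0.1]))
```

In the real server, the query is first embedded by the configured embedding model, and the similarity search runs inside the database over the stored vectors.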

Prerequisites

  • Python 3.12+
  • Supabase or any PostgreSQL database (for vector storage of memories)
  • API keys for your chosen LLM provider (OpenAI, OpenRouter, or Ollama)
  • Docker if running the MCP server as a container (recommended)

Installation

    Using uv

    1. Install uv if you don't have it:

       pip install uv
       

    2. Clone this repository:

       git clone https://github.com/coleam00/mcp-mem0.git
       cd mcp-mem0
       

    3. Install dependencies:

       uv pip install -e .
       

    4. Create a .env file based on .env.example:

       cp .env.example .env
       

    5. Configure your environment variables in the .env file (see Configuration section)

    Using Docker (Recommended)

    1. Build the Docker image:

       docker build -t mcp/mem0 --build-arg PORT=8050 .
       

    2. Create a .env file based on .env.example and configure your environment variables

    Configuration

    The following environment variables can be configured in your .env file:

    | Variable | Description | Example |
    |----------|-------------|---------|
    | TRANSPORT | Transport protocol (sse or stdio) | sse |
    | HOST | Host to bind to when using SSE transport | 0.0.0.0 |
    | PORT | Port to listen on when using SSE transport | 8050 |
    | LLM_PROVIDER | LLM provider (openai, openrouter, or ollama) | openai |
    | LLM_BASE_URL | Base URL for the LLM API | https://api.openai.com/v1 |
    | LLM_API_KEY | API key for the LLM provider | sk-... |
    | LLM_CHOICE | LLM model to use | gpt-4o-mini |
    | EMBEDDING_MODEL_CHOICE | Embedding model to use | text-embedding-3-small |
    | DATABASE_URL | PostgreSQL connection string | postgresql://user:pass@host:port/db |
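Putting the table together, a typical .env for the default OpenAI setup might look like the following (the values are the examples from the table above; substitute your own API key and database connection string):

```shell
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
LLM_PROVIDER=openai
LLM_BASE_URL=https://api.openai.com/v1
LLM_API_KEY=sk-...
LLM_CHOICE=gpt-4o-mini
EMBEDDING_MODEL_CHOICE=text-embedding-3-small
DATABASE_URL=postgresql://user:pass@host:port/db
```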

    Running the Server

    Using uv

    #### SSE Transport

    # Set TRANSPORT=sse in .env then:
    uv run src/main.py
    

    The MCP server runs as an API endpoint that you can connect to using the configuration shown below.

    #### Stdio Transport

    With stdio, the MCP client itself spins up the MCP server, so there is nothing to run at this point.

    Using Docker

    #### SSE Transport

    docker run --env-file .env -p 8050:8050 mcp/mem0
    

    The MCP server runs as an API endpoint within the container that you can connect to using the configuration shown below.

    #### Stdio Transport

    With stdio, the MCP client itself spins up the MCP server container, so there is nothing to run at this point.

    Integration with MCP Clients

    SSE Configuration

    Once you have the server running with SSE transport, you can connect to it using this configuration:

    {
      "mcpServers": {
        "mem0": {
          "transport": "sse",
          "url": "http://localhost:8050/sse"
        }
      }
    }
    

    > Note for Windsurf users: Use serverUrl instead of url in your configuration:

    > {
    >   "mcpServers": {
    >     "mem0": {
    >       "transport": "sse",
    >       "serverUrl": "http://localhost:8050/sse"
    >     }
    >   }
    > }
    > 

    > Note for n8n users: Use host.docker.internal instead of localhost, since n8n has to reach outside of its own container to the host machine.
    >
    > So the full URL in the MCP node would be: http://host.docker.internal:8050/sse

    Make sure to update the port if you are using a value other than the default 8050.
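For example, to use port 8055 instead (an illustrative value), rebuild and run with the new port, set PORT=8055 in your .env, and adjust the client URL to match:

```shell
# Rebuild the image with the new port baked in
docker build -t mcp/mem0 --build-arg PORT=8055 .

# Run with the host port mapped to the new container port
docker run --env-file .env -p 8055:8055 mcp/mem0

# The MCP client URL then becomes: http://localhost:8055/sse
```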

    Python with Stdio Configuration

    Add this server to your MCP configuration for Claude Desktop, Windsurf, or any other MCP client:

    ```json
    {
      "mcpServers": {
        "mem0": {
          "command": "your/path/to/mcp-mem0/.venv/Scripts/python.exe",
          "args": ["your/path/to/mcp-mem0/src/main.py"],
          "env": {
            "TRANSPORT": "stdio",
            "LLM_PROVIDER": "openai",
            "LLM_BASE_URL": "https://api.openai.com/v1",
            "LLM_API_KEY": "sk-...",
            "LLM_CHOICE": "gpt-4o-mini",
            "EMBEDDING_MODEL_CHOICE": "text-embedding-3-small",
            "DATABASE_URL": "postgresql://user:pass@host:port/db"
          }
        }
      }
    }
    ```
