
Mem0 MCP Server

by mem0ai


About

Mem0 is a long-term memory service for AI agents that enables persistent storage and retrieval of information across conversations, users, and agent runs. It provides semantic memory capabilities that help LLMs remember context and user preferences over time. Key capabilities of Mem0:

- Add and store memories from conversation history or explicit messages for specific users and agents
- Semantic search across memories with filtering and pagination support
- Retrieve, update, and delete individual memories by ID
- Bulk memory management, including deleting all memories within a scope
- Entity management for organizing memories by user, agent, app, or run
- Cloud-hosted MCP server available at mcp.mem0.ai for easy integration with Claude, Cursor, and other MCP-compatible clients

README

> [!CAUTION]
> ## This project has been archived
>
> mem0-mcp-server is no longer actively maintained and this repository is now a public archive.
>
> Thank you to the 640+ stargazers, 140+ forkers, and every contributor who helped shape this project. Your support and feedback meant the world to us.
>
> Looking for Mem0 MCP? We now offer an official cloud-hosted MCP server. Check out the docs to get started.
>
> Quick install across all major clients:

> ```bash
> npx mcp-add \
>   --name mem0-mcp \
>   --type http \
>   --url "https://mcp.mem0.ai/mcp" \
>   --clients "claude,claude code,cursor,windsurf,vscode,opencode"
> ```

Mem0 MCP Server

[PyPI](https://pypi.org/project/mem0-mcp-server/) · [License](LICENSE) · [Smithery](https://smithery.ai/server/@mem0ai/mem0-memory-mcp)

mem0-mcp-server wraps the official Mem0 Memory API as a Model Context Protocol (MCP) server so any MCP-compatible client (Claude Desktop, Cursor, custom agents) can add, search, update, and delete long-term memories.

Tools

The server exposes the following tools to your LLM:

| Tool | Description |
| --- | --- |
| `add_memory` | Save text or conversation history (or explicit message objects) for a user/agent. |
| `search_memories` | Semantic search across existing memories (filters + limit supported). |
| `get_memories` | List memories with structured filters and pagination. |
| `get_memory` | Retrieve one memory by its `memory_id`. |
| `update_memory` | Overwrite a memory's text once the user confirms the `memory_id`. |
| `delete_memory` | Delete a single memory by `memory_id`. |
| `delete_all_memories` | Bulk delete all memories in the confirmed scope (user/agent/app/run). |
| `delete_entities` | Delete a user/agent/app/run entity (and its memories). |
| `list_entities` | Enumerate users/agents/apps/runs stored in Mem0. |

All responses are JSON strings returned directly from the Mem0 API.
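Since every tool returns a raw JSON string, a client typically decodes it before use. A minimal sketch of that step, assuming a hypothetical response shape in which results arrive either as a bare list or wrapped in a `"results"` key (the exact schema is defined by the Mem0 API, not by this sketch):

```python
import json

def parse_memories(raw: str) -> list[dict]:
    """Decode the JSON string returned by a Mem0 tool call.

    Assumed shape: either a bare list of memory objects, or an object
    wrapping them in a "results" key. Check the Mem0 API docs for the
    authoritative schema of each endpoint.
    """
    data = json.loads(raw)
    if isinstance(data, list):
        return data
    return data.get("results", [])

# Hypothetical search_memories response:
raw = '{"results": [{"id": "mem-123", "memory": "User is allergic to peanuts"}]}'
memories = parse_memories(raw)
print(memories[0]["memory"])  # User is allergic to peanuts
```

Handling both shapes keeps the client tolerant of endpoints that return lists directly (e.g. bulk listings) versus wrapped result objects.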

Usage Options

There are three ways to use the Mem0 MCP Server:

1. Python Package - Install and run locally using `uvx` with any MCP client
2. Docker - Containerized deployment that creates an `/mcp` HTTP endpoint
3. Smithery - Remote hosted service for managed deployments

Quick Start

Installation

```bash
uv pip install mem0-mcp-server
```

Or with pip:

```bash
pip install mem0-mcp-server
```

Client Configuration

Add this configuration to your MCP client:

```json
{
  "mcpServers": {
    "mem0": {
      "command": "uvx",
      "args": ["mem0-mcp-server"],
      "env": {
        "MEM0_API_KEY": "m0-...",
        "MEM0_DEFAULT_USER_ID": "your-handle"
      }
    }
  }
}
```
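A malformed server entry is a common source of silent connection failures. This sketch loads a config and checks the fields a stdio-based MCP client expects; `check_server_entry` is a hypothetical helper, not part of any client:

```python
import json

config = """
{
  "mcpServers": {
    "mem0": {
      "command": "uvx",
      "args": ["mem0-mcp-server"],
      "env": {
        "MEM0_API_KEY": "m0-...",
        "MEM0_DEFAULT_USER_ID": "your-handle"
      }
    }
  }
}
"""

def check_server_entry(cfg: dict, name: str) -> bool:
    """Return True if the named server has a launch command and an args list."""
    entry = cfg.get("mcpServers", {}).get(name)
    return bool(entry) and "command" in entry and isinstance(entry.get("args", []), list)

cfg = json.loads(config)
print(check_server_entry(cfg, "mem0"))  # True
```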

Test with the Python Agent


To test the server immediately, use the included Pydantic AI agent:

```bash
# Install the package
pip install mem0-mcp-server

# Or with uv
uv pip install mem0-mcp-server

# Set your API keys
export MEM0_API_KEY="m0-..."
export OPENAI_API_KEY="sk-openai-..."

# Clone and test with the agent
git clone https://github.com/mem0ai/mem0-mcp.git
cd mem0-mcp
python example/pydantic_ai_repl.py
```

Using different server configurations:

```bash
# Use with Docker container
export MEM0_MCP_CONFIG_PATH=example/docker-config.json
export MEM0_MCP_CONFIG_SERVER=mem0-docker
python example/pydantic_ai_repl.py

# Use with Smithery remote server
export MEM0_MCP_CONFIG_PATH=example/config-smithery.json
export MEM0_MCP_CONFIG_SERVER=mem0-memory-mcp
python example/pydantic_ai_repl.py
```
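The two variables above select which config file and which server entry the agent connects to. A sketch of what that selection logic might look like inside such an agent (the defaults here are hypothetical; only the variable names come from the examples above):

```python
import os

def resolve_server(default_path="example/config.json", default_name="mem0"):
    """Pick the MCP config file and server entry for the agent.

    MEM0_MCP_CONFIG_PATH and MEM0_MCP_CONFIG_SERVER override the
    defaults, matching the environment variables used in the examples.
    """
    path = os.environ.get("MEM0_MCP_CONFIG_PATH", default_path)
    name = os.environ.get("MEM0_MCP_CONFIG_SERVER", default_name)
    return path, name

os.environ["MEM0_MCP_CONFIG_PATH"] = "example/docker-config.json"
os.environ["MEM0_MCP_CONFIG_SERVER"] = "mem0-docker"
print(resolve_server())  # ('example/docker-config.json', 'mem0-docker')
```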

What You Can Do

The Mem0 MCP server enables powerful memory capabilities for your AI applications:

  • "Remember that I'm allergic to peanuts and shellfish" - Add new health information to memory
  • "Store these trial parameters: 200 participants, double-blind, placebo-controlled study" - Save research data
  • "What do you know about my dietary preferences?" - Search and retrieve all food-related memories
  • "Update my project status: the mobile app is now 80% complete" - Modify existing memory with new info
  • "Delete all memories from 2023, I need a fresh start" - Bulk remove outdated memories
  • "Show me everything I've saved about the Phoenix project" - List all memories for a specific topic
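Behind each of these requests, the LLM builds a JSON argument payload for the matching tool. A sketch of what an `add_memory` payload might look like, assuming hypothetical field names (`text`, `user_id`, `agent_id`) that mirror the scoping entities described above; the server's actual tool schema is authoritative:

```python
import json
from typing import Optional

def build_add_memory_args(text: str, user_id: str, agent_id: Optional[str] = None) -> str:
    """Serialize hypothetical add_memory tool arguments as JSON."""
    args = {"text": text, "user_id": user_id}
    if agent_id:
        args["agent_id"] = agent_id  # optional narrower scope
    return json.dumps(args)

payload = build_add_memory_args("I'm allergic to peanuts and shellfish", "your-handle")
print(payload)
```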
Configuration

Environment Variables

The server is configured through environment variables; the client configuration above sets MEM0_API_KEY (your Mem0 API key) and MEM0_DEFAULT_USER_ID (the default user scope for memories).