About
Mem0 Memories is a memory storage and retrieval service that helps AI agents maintain long-term context across user sessions. Built on the Mem0 platform, it stores user preferences, facts, and interaction history to enable personalized, context-aware responses.

Key features:
- Store memories with user-specific context and unique user IDs
- Search through memories with relevance scoring to surface the most relevant information
- Multi-user support for managing separate memory contexts per individual
- Simple API for adding and querying memories programmatically
- Compatible with MCP clients including Cursor and VS Code
- Deployable on Smithery for cloud-based access without local installation
README
Mem0 Memory MCP Server
A Model Context Protocol (MCP) server that provides memory storage and retrieval capabilities using Mem0. It lets AI agents store and search through memories, helping them maintain context and make informed decisions based on past interactions.
Features
- Store memories with user-specific context
- Search through stored memories with relevance scoring
- Multi-user support via unique user IDs
- Simple API for adding and querying memories programmatically

Installation
Using npx (Recommended)
```shell
npx -y @mem0/mcp
```
Local Development
```shell
# Install dependencies
npm install

# Run development server with Smithery playground
npm run dev

# Build the project
npm run build
```
Configuration
The server requires a Mem0 API key to function. You can obtain one from the Mem0 Dashboard.
For Smithery Deployment
When deploying on Smithery, users will be prompted to provide their API key through the configuration interface.
For Local Development
Create a `.env` file in the root directory:

```shell
MEM0_API_KEY=your-api-key-here
```
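For local scripts it can help to fail fast when the key is missing. A minimal sketch, assuming a Node-style `process.env` (the variable name comes from this README; everything else is illustrative):

```typescript
// Read the Mem0 API key from the environment; fall back to an empty string.
const apiKey: string = process.env.MEM0_API_KEY ?? "";

// Treat the key as configured only if it is non-empty.
const isConfigured: boolean = apiKey.length > 0;

if (!isConfigured) {
  console.warn("MEM0_API_KEY is not set; get a key from the Mem0 Dashboard.");
}
```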
Available Tools
1. Add Memory (add-memory)
Store new memories with user-specific context.
Parameters:
- content (string, required): The content to store in memory
- userId (string, required): User ID for memory storage

2. Search Memories (search-memories)
Search through stored memories to retrieve relevant information.
Parameters:
- query (string, required): The search query
- userId (string, required): User ID for memory storage

Usage with MCP Clients
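As a concrete illustration, an MCP client invokes these tools with JSON-RPC `tools/call` requests. The payloads below are a sketch: the tool and parameter names come from this README, while the example values and request ids are hypothetical.

```typescript
// Hypothetical tools/call payload for storing a memory.
const addMemoryRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "add-memory",
    arguments: {
      content: "Prefers TypeScript and dark mode", // content: what to remember
      userId: "user-123",                          // userId: whose memory it is
    },
  },
};

// Hypothetical tools/call payload for searching memories.
const searchMemoriesRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "search-memories",
    arguments: {
      query: "editor preferences", // query: free-text search
      userId: "user-123",          // userId: scope the search to one user
    },
  },
};
```

Both tools take the same `userId`, so memories written by `add-memory` are retrievable later by `search-memories` for that user only.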
Cursor
1. Open Cursor Settings
2. Go to Features > MCP Servers
3. Add the server configuration:
```json
{
  "mcpServers": {
    "mem0-mcp": {
      "command": "npx",
      "args": ["-y", "@mem0/mcp"],
      "env": {
        "MEM0_API_KEY": "YOUR-API-KEY-HERE"
      }
    }
  }
}
```
VS Code
Add to your User Settings (JSON):
```json
{
  "mcp": {
    "inputs": [
      {
        "type": "promptString",
        "id": "apiKey",
        "description": "Mem0 API Key",
        "password": true
      }
    ],
    "servers": {
      "mem0-memory": {
        "command": "npx",
        "args": ["-y", "@mem0/mcp"],
        "env": {
          "MEM0_API_KEY": "${input:apiKey}"
        }
      }
    }
  }
}
```
Deployment on Smithery
This server is configured for deployment on Smithery:
1. Push your code to GitHub
2. Connect your repository to Smithery
3. Deploy from the Deployments tab
The server will be available via Smithery's Streamable HTTP transport, allowing users to connect without installing dependencies locally.
Development
Project Structure
```
mem0-mcp/
├── src/
│   └── index.ts      # Main server implementation
├── package.json      # Dependencies and scripts
├── tsconfig.json     # TypeScript configuration
├── tsup.config.ts    # Build configuration
├── smithery.yaml     # Smithery deployment config
└── README.md
```
Scripts
- `npm run dev` - Start development server with Smithery playground
- `npm run build` - Build the project using Smithery CLI
- `npm run build:local` - Build using tsup directly
- `npm run dev:local` - Watch mode with tsup
- `npm start` - Run the built server

Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT