About
MCP Knowledge Graph is a local memory store that enables AI models to persistently retain information across conversations using a knowledge graph structure of entities, relations, and observations.

Key features of MCP Knowledge Graph:
- Master database for primary memory storage, with optional named databases (work, personal, health, etc.) for topic organization.
- Project-local storage through `.aim` directories with automatic project detection.
- Global storage option with configurable memory paths for cross-project memory access.
- Safety system with `_aim` file markers to prevent accidental overwrites of unrelated JSONL files.
- Multi-context support with separate memory files for different conversation contexts.
- Compatible with Claude Code, Claude Desktop, and any MCP-compatible AI platform.
README
MCP Knowledge Graph
Persistent memory for AI models through a local knowledge graph.
Store and retrieve information across conversations using entities, relations, and observations. Works with Claude Code/Desktop and any MCP-compatible AI platform.
Why ".aim" and "aim_" prefixes?
AIM stands for AI Memory - the core concept of this system. The three AIM elements provide clear organization and safety:
- `.aim` directories: keep AI memory files organized and easily identifiable
- `aim_` tool prefixes: group related memory functions together in multi-tool setups
- `_aim` safety markers: each memory file starts with `{"type":"_aim","source":"mcp-knowledge-graph"}` to prevent accidental overwrites of unrelated JSONL files

This consistent AIM naming makes it obvious which directories, tools, and files belong to the AI memory system.
CRITICAL: Understanding .aim dir vs _aim file marker
Two different things with similar names:
- `.aim` = project-local directory name (MUST be named exactly `.aim` for project detection to work)
- `_aim` = file safety marker (appears inside JSONL files: `{"type":"_aim","source":"mcp-knowledge-graph"}`)

For project-local storage, create `.aim` in your project root:

- my-project/.aim/memory.jsonl

For global storage (`--memory-path`), any directory name works:

- ~/yourusername/.aim/, ~/memories/, ~/Dropbox/ai-memory/, ~/Documents/ai-data/

Storage Logic
File Location Priority:
1. Project with .aim - Uses .aim/memory.jsonl (project-local)
2. No project/no .aim - Uses configured global directory
3. Contexts - Adds suffix: memory-work.jsonl, memory-personal.jsonl
Safety System:
Every memory file begins with the marker line `{"type":"_aim","source":"mcp-knowledge-graph"}`.

Master Database Concept
The master database is your primary memory store, used by default when no specific database is requested. It's always named `default` in listings and stored as `memory.jsonl`.
Optional named databases (work, personal, health) can be created for organizing specific topics.

Key Features

- Master database for primary memory storage, plus optional named databases per topic
- Project-local storage through `.aim` directories with automatic project detection
- Global storage with configurable memory paths for cross-project access
- `_aim` safety markers that protect unrelated JSONL files from overwrites
- Works with Claude Code, Claude Desktop, and any MCP-compatible AI platform

Quick Start
Global Memory (Recommended)
Add to your claude_desktop_config.json or .claude.json. Two common approaches:
Option 1: Default .aim directory (simple)
```json
{
  "mcpServers": {
    "Aim-Memory-Bank": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-knowledge-graph",
        "--memory-path",
        "/Users/yourusername/.aim"
      ]
    }
  }
}
```
Option 2: Dropbox/cloud sync (portable)
For accessing memories across multiple machines, use a synced folder. This is how the author of this MCP server keeps his own memories:
```json
{
  "mcpServers": {
    "Aim-Memory-Bank": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-knowledge-graph",
        "--memory-path",
        "/Users/yourusername/Dropbox/ai-memory"
      ]
    }
  }
}
```
This creates memory files in your specified directory:
- memory.jsonl - master database (default for all operations)
- memory-work.jsonl - work database
- memory-personal.jsonl - personal database

Project-Local Memory
In any project, create a .aim directory:
```
mkdir .aim
```
Now memory tools automatically use .aim/memory.jsonl (project-local master database) instead of global storage when run from this project.
How AI Uses Databases
Once configured, AI models use the master database by default or can specify named databases with a context parameter. New databases are created automatically - no setup required:
```json
// Master Database (default - no context needed)
aim_memory_store({
  entities: [{
    name: "John_Doe",
    entityType: "person",
    observations: ["Met at conference"]
  }]
})

// Work database
aim_memory_store({
  context: "work",
  entities: [{
    name: "Q4_Project",
    entityType: "project",
    observations: ["Due December 20"]
  }]
})
```
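On disk, each stored entity becomes one JSON line in the target `.jsonl` file, after the `_aim` marker line. The record shape below is an assumption mirroring the tool-call fields above; the server's actual schema may differ:

```typescript
// Sketch: serialize entities to JSONL for a fresh memory file.
// Entity fields mirror the aim_memory_store call; this is illustrative,
// not the server's confirmed on-disk format.
interface Entity {
  name: string;
  entityType: string;
  observations: string[];
}

function toJsonl(entities: Entity[]): string {
  const marker = { type: "_aim", source: "mcp-knowledge-graph" };
  // First line is the safety marker; each entity becomes one JSON line.
  return [marker, ...entities].map((r) => JSON.stringify(r)).join("\n") + "\n";
}
```

One-record-per-line JSONL keeps appends cheap: adding an observation or entity never requires rewriting the whole file.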