# Memory MCP

<!-- markdownlint-disable MD033 -->


A SQLite-backed MCP server for persistent memory storage, full-text retrieval, and relationship graph traversal.

## Overview

Memory MCP provides a local, persistent memory layer for MCP-enabled assistants. It stores SHA-256-addressed memory items in SQLite with FTS5-powered full-text search, a directed relationship graph, BFS recall traversal, and token-budget-aware context retrieval — all accessible over stdio transport with no external dependencies.

## Key Features

- 13 MCP tools for CRUD, batch operations, FTS5 search, BFS graph recall, token-budget context retrieval, relationships, and stats.
- Full-text search over content and tags via SQLite FTS5 with importance and type filters.
- Graph recall with BFS traversal, bounded frontier, and MCP progress notifications per hop.
- Token-budget retrieval (`retrieve_context`) selects memories that fit a caller-specified token budget — no manual pagination needed.
- Strict Zod input validation with typed output envelopes and SHA-256 hash addressing.
- Resource support with `internal://instructions` (Markdown guide) and a `memory://memories/{hash}` URI template with hash auto-completion.
- stdio transport with clean shutdown handling (SIGINT, SIGTERM) and no HTTP endpoints.

## Requirements

- Node.js >=24.
- SQLite with FTS5 support (verified at startup).
- Any MCP client that supports stdio command servers.

## Quick Start

Use the npm package directly with npx — no installation required:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

> [!TIP]
> The server uses stdio transport only; no HTTP endpoint is exposed. Stdout must not be polluted by custom logging.

Or run with Docker:

```bash
docker run --rm -i ghcr.io/j0hanz/memory-mcp:latest
```

## Client Configuration

<details> <summary><b>Install in VS Code</b></summary>

Workspace file `.vscode/mcp.json`:

```json
{
  "servers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

CLI:

```bash
code --add-mcp '{"name":"memory-mcp","command":"npx","args":["-y","@j0hanz/memory-mcp@latest"]}'
```
</details> <details> <summary><b>Install in VS Code Insiders</b></summary>

CLI:

```bash
code-insiders --add-mcp '{"name":"memory-mcp","command":"npx","args":["-y","@j0hanz/memory-mcp@latest"]}'
```
</details> <details> <summary><b>Install in Cursor</b></summary>

`~/.cursor/mcp.json`:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```
</details> <details> <summary><b>Install in Claude Desktop / Claude Code</b></summary>

`claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```

CLI:

```bash
claude mcp add memory-mcp -- npx -y @j0hanz/memory-mcp@latest
```
</details> <details> <summary><b>Install in Windsurf</b></summary>

MCP config:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/memory-mcp@latest"]
    }
  }
}
```
</details> <details> <summary><b>Run with Docker</b></summary>

```bash
# Pull and run (stdio mode)
docker run --rm -i \
  -e MEMORY_DB_PATH=/data/memory.db \
  -v memory-data:/data \
  ghcr.io/j0hanz/memory-mcp:latest
```

MCP client config:

```json
{
  "mcpServers": {
    "memory-mcp": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "MEMORY_DB_PATH=/data/memory.db",
        "-v",
        "memory-data:/data",
        "ghcr.io/j0hanz/memory-mcp:latest"
      ]
    }
  }
}
```
</details>

## Documentation Maintenance

- Owner: maintainers updating MCP behavior in `src/` must update `README.md` and affected `mcp/` mirror pages in the same PR.
- Link/version policy: use pinned `https://modelcontextprotocol.io/specification/2025-11-25/...` links for protocol references; avoid `latest` and mixed legacy targets.
- Drift-check checklist:
  - Re-verify capability declarations in `src/server.ts`.
  - Reconcile tool/resource/prompt docs with `src/tools/index.ts`, `src/resources/index.ts`, and `src/prompts/index.ts`.
  - Confirm limitations/gotchas in `src/instructions.md` match runtime behavior.
- Validation commands: `npm run type-check`, `npm run test:fast`, `npm run build`.

## MCP Surface

### Tools Summary

| Tool | Category | Notes |
| --- | --- | --- |
| `store_memory` | Write | Idempotent by content + sorted tags hash |
| `store_memories` | Write | Batch (1–50), transaction-wrapped |
| `get_memory` | Read | Hash lookup |
| `update_memory` | Write | Returns `old_hash` + `new_hash` |
| `delete_memory` | Write | Cascades relationship deletion |
| `delete_memories` | Write | Batch (1–50), transaction-wrapped |
| `search_memories` | Read | FTS5 + importance/type filters + cursor |
| `create_relationship` | Write | Idempotent directed edge creation |
| `delete_relationship` | Write | Deletes exact directed edge |
| `get_relationships` | Read | Direction filter + linked memory fields |
| `recall` | Read | FTS5 seed + BFS traversal (depth 0–3) |
| `retrieve_context` | Read | Token-budget-aware context retrieval |
| `memory_stats` | Read | Store aggregates and type breakdown |

### store_memory

Store a new memory with content, tags, and optional type/importance. Idempotent — storing the same content+tags returns the existing hash with `created: false`.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `content` | string | Yes |  | Memory content (1–100000 chars) |
| `tags` | string[] | Yes |  | 1–100 tags, each max 50 chars, no whitespace |
| `memory_type` | enum | No | `general` | `general`, `fact`, `plan`, `decision`, `reflection`, `lesson`, `error`, `gradient` |
| `importance` | integer | No | 0 | Priority 0–10 |

Returns: `{ hash, created }`
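The idempotency above comes from content addressing: the hash covers the content plus the sorted tag list, so tag order does not change the address. A rough sketch of the idea (the canonicalization below is an assumption for illustration, not the server's exact preimage format):

```typescript
import { createHash } from "node:crypto";

// Hypothetical content-addressing scheme: SHA-256 over content plus sorted
// tags. The server's real preimage layout may differ; only the idea is shown.
function memoryHash(content: string, tags: string[]): string {
  const canonical = JSON.stringify({ content, tags: [...tags].sort() });
  return createHash("sha256").update(canonical).digest("hex"); // 64-char lowercase hex
}
```

Because tags are sorted before hashing, storing the same content with `["sqlite", "search"]` and `["search", "sqlite"]` would address the same memory and return `created: false` on the second call.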


### store_memories

Store multiple memories in one transaction (max 50 items).

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `items` | `Array<StoreMemoryItem>` | Yes | 1–50 items, each with `content`, `tags`, optional `memory_type`, optional `importance` |

Returns: `{ items, succeeded, failed }`


### get_memory

Retrieve one memory by its SHA-256 hash.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hash` | string | Yes | 64-char lowercase SHA-256 hex |

Returns: `Memory` or `{ ok: false, error }` on `E_NOT_FOUND`.


### update_memory

Update content and optionally tags for an existing memory. Returns both hashes.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `hash` | string | Yes |  | Existing memory hash |
| `content` | string | Yes |  | Replacement content |
| `tags` | string[] | No | Existing tags | Replacement tags |

Returns: `{ old_hash, new_hash }`


### delete_memory

Delete one memory by hash. Cascades to related relationship rows.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hash` | string | Yes | Memory hash |

Returns: `{ hash, deleted }`


### delete_memories

Delete multiple memories by hash in one transaction.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `hashes` | string[] | Yes | 1–50 memory hashes |

Returns: `{ items, succeeded, failed }`


### search_memories

Full-text search over memory content and tags using FTS5. Supports importance and type filters with cursor pagination.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Search text (1–1000 chars) |
| `limit` | integer | No | 20 | Results per page (1–100) |
| `cursor` | string | No |  | Pagination cursor from previous response |
| `min_importance` | integer | No |  | Only return memories with importance >= this value (0–10) |
| `max_importance` | integer | No |  | Only return memories with importance <= this value (0–10) |
| `memory_type` | enum | No |  | Filter by memory type |

Returns: `{ memories, total_returned, nextCursor? }`


### create_relationship

Create a directed relationship edge between two memories. Idempotent.

Suggested `relation_type` values: `related_to`, `causes`, `depends_on`, `parent_of`, `child_of`, `supersedes`, `contradicts`, `supports`, `references`.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `from_hash` | string | Yes | Source memory hash |
| `to_hash` | string | Yes | Target memory hash |
| `relation_type` | string | Yes | Edge label (1–50 chars, no whitespace, free-form) |

Returns: `{ created }`


### delete_relationship

Delete one directed relationship edge.

| Name | Type | Required | Description |
| --- | --- | --- | --- |
| `from_hash` | string | Yes | Source hash |
| `to_hash` | string | Yes | Target hash |
| `relation_type` | string | Yes | Relationship type |

Returns: `{ deleted }` or `{ ok: false, error }` on `E_NOT_FOUND`.


### get_relationships

Retrieve relationships for a memory, with an optional direction filter.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `hash` | string | Yes |  | Memory hash |
| `direction` | enum | No | `both` | `outgoing`, `incoming`, or `both` |

Returns: `{ relationships, count }`

Each relationship includes `from_hash`, `to_hash`, `relation_type`, `created_at`, `linked_hash`, `linked_content`, and `linked_tags`.


### recall

Search memories by full-text query, then traverse the relationship graph up to `depth` hops via BFS. Emits MCP progress notifications per hop.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Seed search query (1–1000 chars) |
| `depth` | integer | No | 1 | BFS hops (0–3) |
| `limit` | integer | No | 10 | Seed memory count (1–50) |
| `cursor` | string | No |  | Pagination cursor from previous response |
| `min_importance` | integer | No |  | Seed filter: only memories with importance >= value (0–10) |
| `max_importance` | integer | No |  | Seed filter: only memories with importance <= value (0–10) |
| `memory_type` | enum | No |  | Seed filter: only memories of this type |

Returns: `{ memories, graph, depth_reached, aborted?, nextCursor? }`

Each item in `graph` uses the shape:

```json
{ "from_hash": "...", "to_hash": "...", "relation_type": "..." }
```

> [!NOTE]
> `aborted: true` indicates the traversal hit a safety limit (`RECALL_MAX_FRONTIER_SIZE`, `RECALL_MAX_EDGE_ROWS`, or `RECALL_MAX_VISITED_NODES`). Partial results are still returned.
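A minimal sketch of the bounded BFS traversal that `recall` performs (hypothetical names and simplified limits; the real implementation lives in `src/` and also emits progress notifications per hop):

```typescript
type Edge = { from_hash: string; to_hash: string; relation_type: string };

// Illustrative bounded BFS over directed edges. `maxVisited` mirrors the
// RECALL_MAX_VISITED_NODES guard: when it trips, partial results are kept
// and `aborted` is set, matching the behavior documented for recall.
function boundedBfs(
  seeds: string[],
  edges: Edge[],
  depth: number,
  maxVisited = 5000
): { visited: string[]; aborted: boolean } {
  const visited = new Set(seeds);
  let frontier = new Set(seeds);
  let aborted = false;
  for (let hop = 0; hop < depth && frontier.size > 0 && !aborted; hop++) {
    const next = new Set<string>();
    for (const e of edges) {
      if (!frontier.has(e.from_hash) || visited.has(e.to_hash)) continue;
      if (visited.size >= maxVisited) { aborted = true; break; }
      visited.add(e.to_hash);
      next.add(e.to_hash);
    }
    frontier = next;
  }
  return { visited: [...visited], aborted };
}
```

With `depth: 1`, only direct neighbors of the seed memories are reached; each extra hop expands the frontier by one level of relationships.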


### retrieve_context

Search memories and return relevance-ranked results that fit within a caller-specified token budget. Eliminates manual pagination and token counting for context window management.

| Name | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| `query` | string | Yes |  | Search query (1–1000 chars) |
| `token_budget` | integer | No | 4000 | Maximum estimated tokens to return (100–200000) |
| `strategy` | enum | No | `relevance` | Sort order: `relevance` (FTS rank), `importance` (highest first), `recency` (newest first) |

Returns: `{ memories, estimated_tokens, truncated }`

> [!TIP]
> Token estimation is approximate (content length ÷ 4). `truncated: true` means the budget was reached before all candidates were included.
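The selection can be pictured as a greedy pass over ranked candidates using the length ÷ 4 heuristic noted above (a sketch under that assumption, not the server's exact algorithm):

```typescript
interface Candidate {
  hash: string;
  content: string;
}

// Greedy token-budget fill: roughly 4 characters per token, stopping as soon
// as the next candidate would exceed the budget and flagging truncation.
function fitBudget(ranked: Candidate[], tokenBudget: number) {
  const memories: Candidate[] = [];
  let estimated_tokens = 0;
  let truncated = false;
  for (const m of ranked) {
    const cost = Math.ceil(m.content.length / 4);
    if (estimated_tokens + cost > tokenBudget) {
      truncated = true;
      break;
    }
    memories.push(m);
    estimated_tokens += cost;
  }
  return { memories, estimated_tokens, truncated };
}
```

A 40-character memory costs about 10 estimated tokens under this heuristic, so a `token_budget` of 15 would admit one such memory and report `truncated: true`.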


### memory_stats

Return aggregate memory and relationship stats. Takes no input.

Returns:

```json
{
  "memories": {
    "total": 0,
    "oldest": null,
    "newest": null,
    "avg_importance": null
  },
  "relationships": { "total": 0 },
  "by_type": {}
}
```

## Resources

| URI | MIME | Description |
| --- | --- | --- |
| `internal://instructions` | text/markdown | Markdown usage guide for all tools and workflows |
| `memory://memories/{hash}` | application/json | Returns one memory as JSON; hash completion supported |

## Prompts

| Name | Arguments | Purpose |
| --- | --- | --- |
| `get-help` | none | Returns full usage instructions for all tools |

## Configuration

### Environment Variables

| Variable | Description | Default | Required |
| --- | --- | --- | --- |
| `MEMORY_DB_PATH` | SQLite database file path | `memory_db/memory.db` | No |
| `RECALL_MAX_FRONTIER_SIZE` | Max BFS frontier nodes per hop (100–50000) | 1000 | No |
| `RECALL_MAX_EDGE_ROWS` | Max relationship rows fetched per traversal (100–50000) | 5000 | No |
| `RECALL_MAX_VISITED_NODES` | Max visited nodes across entire traversal (100–50000) | 5000 | No |

> [!IMPORTANT]
> If `MEMORY_DB_PATH` is relative (including the default `memory_db/memory.db`), it resolves from the process working directory.
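In practice this means launching the server from two different directories creates two different databases. The resolution behaves like `path.resolve` from the working directory:

```typescript
import path from "node:path";

// A relative MEMORY_DB_PATH (including the default "memory_db/memory.db")
// resolves against the server process's working directory; an absolute
// path is used as-is.
const dbPath = process.env.MEMORY_DB_PATH ?? "memory_db/memory.db";
const resolved = path.resolve(process.cwd(), dbPath);
```

To pin the database location regardless of launch directory, set `MEMORY_DB_PATH` to an absolute path.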

> [!TIP]
> Add `memory_db/` to your `.gitignore` to keep the database out of version control — it contains local session data and should not be shared or committed.

## Limits and Constraints

| Item | Value |
| --- | --- |
| Content length | 1–100000 chars |
| Tag count | 1–100 per memory |
| Tag length | 1–50 chars, no whitespace |
| Hash format | 64-char lowercase hex SHA-256 |
| Search query length | 1–1000 chars |
| `search_memories.limit` | 1–100 (default 20) |
| `recall.depth` | 0–3 (default 1) |
| `recall.limit` | 1–50 (default 10) |
| `retrieve_context.token_budget` | 100–200000 (default 4000) |
| Batch size | 1–50 items (`store_memories`, `delete_memories`) |
| Recall frontier guard | `RECALL_MAX_FRONTIER_SIZE` (default 1000 per hop) |
| SQLite busy timeout | 5000 ms |

> [!NOTE]
> Cursor values are base64url-encoded tokens. Treat them as opaque and do not parse them.

## Security

- Transport is stdio-only (`StdioServerTransport`) — no HTTP endpoints.
- Fatal process errors are written to stderr; stdout must remain clean for the MCP protocol.
- All inputs are validated with strict Zod schemas and bounded field constraints before any database access.
- Hashes are validated against a lowercase 64-char SHA-256 hex regex.
- Search input is tokenized to alphanumeric terms before FTS `MATCH` execution (non-alphanumeric characters act as delimiters, preventing FTS injection).
- SQLite foreign keys are enabled; relationship rows cascade-delete when a memory is removed.
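The tokenization step above can be sketched as splitting the query on non-alphanumeric characters and quoting each surviving term, which neutralizes FTS5 operators in user input (an illustration of the approach, not the server's exact code):

```typescript
// Split on runs of non-alphanumeric characters and double-quote each term.
// Quoted terms are treated as plain strings by FTS5, so input such as
// NEAR(...), *, or stray quotes cannot alter the query structure.
function toFtsMatch(input: string): string {
  return input
    .split(/[^a-zA-Z0-9]+/)
    .filter(Boolean)
    .map((term) => `"${term}"`)
    .join(" ");
}
```

Even a term like `NEAR` survives only as a quoted literal, so it is matched as text rather than interpreted as an FTS5 operator.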

## Development

Install dependencies:

```bash
npm install
```

Core scripts:

| Script | Command | Purpose |
| --- | --- | --- |
| `build` | `npm run build` | Clean, compile, validate instructions, copy assets, chmod executable |
| `dev` | `npm run dev` | TypeScript watch mode |
| `dev:run` | `npm run dev:run` | Run built server with `.env` and file watch |
| `start` | `npm run start` | Start built server |
| `test` | `npm run test` | Full build + tests via task runner |
| `test:fast` | `npm run test:fast` | Run TS tests directly with Node test runner |
| `lint` | `npm run lint` | ESLint checks |
| `lint:fix` | `npm run lint:fix` | ESLint auto-fix |
| `type-check` | `npm run type-check` | Strict TypeScript checks |
| `format` | `npm run format` | Prettier format |
| `inspector` | `npm run inspector` | Build and open MCP Inspector against the stdio server |

Inspect with MCP Inspector:

```bash
npx @modelcontextprotocol/inspector node dist/index.js
```

## Build & Release

The GitHub Actions release workflow (`.github/workflows/release.yml`) handles versioning, validation, and publishing via a single `workflow_dispatch` trigger:

```text
workflow_dispatch (patch / minor / major / custom)
    │
    ▼
  release — bump package.json + server.json → lint → type-check → test → build → tag → GitHub Release
    │
    ├──► publish-npm ──► publish-mcp   (npm Trusted Publishing OIDC → MCP Registry)
    │
    └──► publish-docker                (GHCR, linux/amd64 + linux/arm64)
```

Trigger a release:

```bash
gh workflow run release.yml -f bump=patch
```

Or use the GitHub UI: Actions → Release → Run workflow.

> [!NOTE]
> npm publishing uses OIDC Trusted Publishing — no `NPM_TOKEN` secret required. MCP Registry uses GitHub OIDC. Docker uses the built-in `GITHUB_TOKEN`.

## Troubleshooting

| Symptom | Cause | Fix |
| --- | --- | --- |
| Startup fails with FTS5 error | Node.js build without FTS5 | Use Node.js 24+ with SQLite FTS5 support |
| `E_NOT_FOUND` on `get_memory` | Hash doesn't exist | Verify via `search_memories` first |
| `E_INVALID_CURSOR` | Stale or malformed cursor | Retry the request without the `cursor` parameter |
| MCP client can't connect | Custom stdout logging added | Ensure nothing writes to stdout in the server process |
| `aborted: true` in `recall` | Traversal hit a safety limit | Reduce `depth`, or tune `RECALL_MAX_*` env vars |
| Database locked errors | High concurrent write load | SQLite busy timeout is 5000 ms; reduce concurrent writes |

## License

MIT

<!-- markdownlint-enable MD033 -->
