io.github.varun29ankuS/shodh-memory

Coding & Debugging

by varun29ankus

Provides persistent AI memory with semantic search — store, retrieve, and recall contextual information across sessions.

Frees AI assistants from "amnesia": semantic search persistently stores and precisely recalls cross-session context, making it especially suited to coding workflows that need long-term memory.


README

<p align="center"> <img src="https://raw.githubusercontent.com/varun29ankuS/shodh-memory/main/assets/logo.png" width="120" alt="Shodh-Memory"> </p> <h1 align="center">Shodh-Memory</h1> <p align="center"><b>Persistent cognitive memory for AI agents and robots. Remembers what matters, forgets what doesn't, gets smarter with use.</b></p> <p align="center"> <a href="https://github.com/varun29ankuS/shodh-memory/actions"><img src="https://github.com/varun29ankuS/shodh-memory/workflows/CI/badge.svg" alt="build"></a> <a href="https://registry.modelcontextprotocol.io/v0/servers?search=shodh"><img src="https://img.shields.io/badge/MCP-Registry-green" alt="MCP Registry"></a> <a href="https://cursor.directory/plugins/shodh-memory-1"><img src="https://img.shields.io/badge/Cursor-Directory-black?logo=cursor" alt="Cursor Directory"></a> <a href="https://crates.io/crates/shodh-memory"><img src="https://img.shields.io/crates/v/shodh-memory.svg" alt="crates.io"></a> <a href="https://www.npmjs.com/package/@shodh/memory-mcp"><img src="https://img.shields.io/npm/v/@shodh/memory-mcp.svg?logo=npm" alt="npm"></a> <a href="https://pypi.org/project/shodh-memory/"><img src="https://img.shields.io/pypi/v/shodh-memory.svg" alt="PyPI"></a> <a href="https://hub.docker.com/r/varunshodh/shodh-memory"><img src="https://img.shields.io/docker/pulls/varunshodh/shodh-memory.svg?logo=docker" alt="Docker"></a> <a href="LICENSE"><img src="https://img.shields.io/badge/license-Apache%202.0-blue.svg" alt="License"></a> <a href="https://discord.gg/HrpzXqTtEp"><img src="https://img.shields.io/discord/1471830549818642432?logo=discord&label=Discord&color=5865F2" alt="Discord"></a> </p>
<p align="center"> <img src="https://raw.githubusercontent.com/varun29ankuS/shodh-memory/main/assets/Shodh_preview.gif" width="800" alt="Shodh-Memory Demo — Claude Code with persistent memory and TUI dashboard"> </p>

AI agents forget everything between sessions. Robots lose context between missions. They repeat mistakes, miss patterns, and treat every interaction like the first one.

Shodh-Memory fixes this. It's persistent memory that actually learns — memories you use often become easier to find, old irrelevant context fades automatically, and recalling one thing brings back related things. Works for chat agents (MCP/HTTP), robots (Zenoh/ROS2), and edge devices. No API keys. No cloud. No external databases. One binary.

Why Not Just Use mem0 / Cognee / Zep?

| | Shodh | mem0 | Cognee | Zep |
|---|---|---|---|---|
| LLM calls to store a memory | 0 | 2+ per add | 3+ per cognify | 2+ per episode |
| External services needed | None | OpenAI + vector DB | OpenAI + Neo4j + vector DB | OpenAI + Neo4j |
| Time to store a memory | 55ms | ~20 seconds | seconds | seconds |
| Learns from usage | Yes (Hebbian) | No | No | No |
| Forgets irrelevant data | Yes (decay) | No | No | Temporal only |
| Runs fully offline | Yes | No | No | No |
| Robotics / ROS2 native | Yes (Zenoh) | No | No | No |
| Binary size | ~17MB | pip install + API keys | pip install + API keys + Neo4j | Cloud only |

Every other memory system delegates intelligence to LLM API calls — that's why they're slow, expensive, and can't work offline. Shodh uses algorithmic intelligence: local embeddings, mathematical decay, learned associations. No LLM in the loop.

Get Started

Unified CLI

```bash
# Download from GitHub Releases (or brew tap varun29ankuS/shodh-memory && brew install shodh-memory)
shodh init       # First-time setup — creates config, generates API key, downloads AI model
shodh server     # Start the memory server on :3030
shodh tui        # Launch the TUI dashboard
shodh status     # Check server health
shodh doctor     # Diagnose issues
```

One binary, all functionality. No Docker, no API keys, no external dependencies.

Claude Code (one command)

```bash
claude mcp add shodh-memory -- npx -y @shodh/memory-mcp
```

That's it. The MCP server auto-downloads the backend binary and starts it. No Docker, no API keys, no configuration. Claude now has persistent memory across sessions.

<details> <summary>Or with Docker (for production / shared servers)</summary>
```bash
# 1. Start the server
docker run -d -p 3030:3030 -v shodh-data:/data varunshodh/shodh-memory

# 2. Add to Claude Code
claude mcp add shodh-memory -- npx -y @shodh/memory-mcp
```
</details> <details> <summary>Cursor / Claude Desktop config</summary>
```json
{
  "mcpServers": {
    "shodh-memory": {
      "command": "npx",
      "args": ["-y", "@shodh/memory-mcp"]
    }
  }
}
```

For local use, no API key is needed — one is generated automatically. For remote servers, add "env": { "SHODH_API_KEY": "your-key" }.

</details>

Python

```bash
pip install shodh-memory
```

```python
from shodh_memory import Memory

memory = Memory(storage_path="./my_data")
memory.remember("User prefers dark mode", memory_type="Decision")
results = memory.recall("user preferences", limit=5)
```

Rust

```toml
[dependencies]
shodh-memory = "0.1"
```

```rust
use shodh_memory::{MemorySystem, MemoryConfig, MemoryType};

let memory = MemorySystem::new(MemoryConfig::default())?;
memory.remember("user-1", "User prefers dark mode", MemoryType::Decision, vec![])?;
let results = memory.recall("user-1", "user preferences", 5)?;
```

Docker

```bash
docker run -d -p 3030:3030 -v shodh-data:/data varunshodh/shodh-memory
```

What It Does

```text
You use a memory often  →  it becomes easier to find (Hebbian learning)
You stop using a memory →  it fades over time (activation decay)
You recall one memory   →  related memories surface too (spreading activation)
A connection is used    →  it becomes permanent (long-term potentiation)
```

Under the hood, memories flow through three tiers:

```text
Working Memory ──overflow──▶ Session Memory ──importance──▶ Long-Term Memory
   (100 items)                  (100 MB)                      (RocksDB)
```

This is based on Cowan's working memory model and Wixted's memory decay research. The neuroscience isn't a gimmick — it's why the system gets better with use instead of just accumulating data.
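The learning and forgetting behaviors above can be sketched in a few lines. This is a simplified illustration under assumed mechanics (exponential decay, additive Hebbian boosts, one-hop spreading activation), not Shodh's actual implementation; the class and parameter names are hypothetical:

```python
import math
import time

class MemoryItem:
    """Hypothetical memory record with an activation level that decays over time."""

    def __init__(self, content, base_activation=1.0):
        self.content = content
        self.activation = base_activation
        self.last_access = time.time()
        self.links = {}  # related MemoryItem -> connection strength

    def current_activation(self, decay_rate=0.01):
        # Exponential decay: memories you stop using fade automatically.
        elapsed = time.time() - self.last_access
        return self.activation * math.exp(-decay_rate * elapsed)

    def recall(self, boost=0.5, spread=0.2):
        # Hebbian-style reinforcement: recalling a memory makes it easier to find.
        self.activation = self.current_activation() + boost
        self.last_access = time.time()
        # Spreading activation: related memories get a smaller boost too.
        for other, strength in self.links.items():
            other.activation += spread * strength

    def associate(self, other, strength=1.0):
        # Repeated co-use strengthens the connection (toward "permanence").
        self.links[other] = self.links.get(other, 0.0) + strength
```

With this sketch, recalling "User prefers dark mode" raises its own activation and nudges up anything associated with it, while untouched items gradually lose retrievability.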

Performance

| Operation | Latency |
|---|---|
| Store memory (API response) | <200ms |
| Store memory (core) | 55-60ms |
| Semantic search | 34-58ms |
| Tag search | ~1ms |
| Entity lookup | 763ns |
| Graph traversal (3-hop) | 30µs |

Single binary. No GPU required. Content-hash dedup ensures identical memories are never stored twice.
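Content-hash dedup can be illustrated with a small sketch: key each record by a hash of its content, so storing the same text twice is a no-op. This is a stdlib-only toy under the assumption of a SHA-256 content key; the `DedupStore` class is hypothetical, and Shodh's actual keying scheme is not documented here:

```python
import hashlib

class DedupStore:
    """Toy store keyed by the SHA-256 of the content, so identical
    memories map to the same record instead of being stored twice."""

    def __init__(self):
        self._records = {}

    def remember(self, content: str) -> str:
        key = hashlib.sha256(content.encode("utf-8")).hexdigest()
        self._records.setdefault(key, content)  # no-op if key already present
        return key

    def count(self) -> int:
        return len(self._records)
```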

TUI Dashboard

```bash
shodh tui
```
<p align="center"> <img src="https://raw.githubusercontent.com/varun29ankuS/shodh-memory/main/assets/recall.png" width="700" alt="Shodh Recall"> </p> <p align="center"><i>Semantic recall with hybrid search — relevance scores, memory tiers, and activity feed</i></p> <p align="center"> <img src="https://raw.githubusercontent.com/varun29ankuS/shodh-memory/main/assets/projects-todos.jpg" width="700" alt="Shodh Projects & Todos"> </p> <p align="center"><i>GTD task management — projects, todos, comments, and causal lineage</i></p>

37 MCP Tools

Full list of tools available to Claude, Cursor, and other MCP clients:

<details> <summary>Memory</summary>

remember · recall · proactive_context · context_summary · list_memories · read_memory · forget

</details> <details> <summary>Todos (GTD)</summary>

add_todo · list_todos · update_todo · complete_todo · delete_todo · reorder_todo · list_subtasks · add_todo_comment · list_todo_comments · update_todo_comment · delete_todo_comment · todo_stats

</details> <details> <summary>Projects</summary>

add_project · list_projects · archive_project · delete_project

</details> <details> <summary>Reminders</summary>

set_reminder · list_reminders · dismiss_reminder

</details> <details> <summary>System</summary>

memory_stats · verify_index · repair_index · token_status · reset_token_session · consolidation_report · backup_create · backup_list · backup_verify · backup_restore · backup_purge

</details>

REST API

160+ endpoints on http://localhost:3030. All `/api/*` endpoints require the `X-API-Key` header.

Full API reference →

<details> <summary>Quick examples</summary>
```bash
# Store a memory
curl -X POST http://localhost:3030/api/remember \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your-key" \
  -d '{"user_id": "user-1", "content": "User prefers dark mode", "memory_type": "Decision"}'

# Search memories
curl -X POST http://localhost:3030/api/recall \
  -H "Content-Type: application/json" \
  -H "X-API-Key: your-key" \
  -d '{"user_id": "user-1", "query": "user preferences", "limit": 5}'
```
</details>

Robotics & ROS2

Shodh-Memory isn't just for chat agents. It's persistent memory for robots — Spot, drones, humanoids, any system running ROS2 or Zenoh.

```bash
# Enable Zenoh transport (compile with --features zenoh)
SHODH_ZENOH_ENABLED=true SHODH_ZENOH_LISTEN=tcp/0.0.0.0:7447 shodh server

# ROS2 robots connect via zenoh-bridge-ros2dds or rmw_zenoh — zero code changes
ros2 run zenoh_bridge_ros2dds zenoh_bridge_ros2dds
```

What robots can do over Zenoh:

| Operation | Key Expression | Description |
|---|---|---|
| Remember | `shodh/{user_id}/remember` | Store with GPS, local position, heading, sensor data, mission context |
| Recall | `shodh/{user_id}/recall` | Spatial search (haversine), mission replay, action-outcome filtering |
| Stream | `shodh/{user_id}/stream/sensor` | Auto-remember high-frequency sensor data via extraction pipeline |
| Mission | `shodh/{user_id}/mission/start` | Track mission boundaries, searchable across missions |
| Fleet | `shodh/fleet/**` | Automatic peer discovery via Zenoh liveliness tokens |

Each robot uses its own user_id as the key segment (e.g., shodh/spot-1/remember). The robot_id is an optional payload field for fleet grouping.

Every Experience carries 26 robotics-specific fields: geo_location, local_position, heading, sensor_data, robot_id, mission_id, action_type, reward, terrain_type, nearby_agents, decision_context, action_params, outcome_type, confidence, failure/anomaly tracking, recovery actions, and prediction learning.
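Spatial recall filters memories by great-circle distance from a query point. A minimal sketch of the standard haversine formula and a radius filter over `geo_location` fields — this is the textbook formula, not Shodh's internal code, and `within_radius` is a hypothetical helper:

```python
import math

def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6_371_000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_radius(memories, lat, lon, radius_meters):
    """Keep memories whose geo_location [lat, lon, alt] lies within the radius."""
    return [
        m for m in memories
        if "geo_location" in m
        and haversine_meters(lat, lon, m["geo_location"][0], m["geo_location"][1]) <= radius_meters
    ]
```

A spatial recall like the example payload (`lat: 37.7749, lon: -122.4194, radius_meters: 50.0`) would keep only memories recorded within roughly 50 m of that point.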

<details> <summary>Zenoh remember example (robot publishing a memory)</summary>
```json
{
  "user_id": "spot-1",
  "content": "Detected crack in concrete at waypoint alpha",
  "robot_id": "spot_v2",
  "mission_id": "building_inspection_2026",
  "geo_location": [37.7749, -122.4194, 10.0],
  "local_position": [12.5, 3.2, 0.0],
  "heading": 90.0,
  "sensor_data": {"battery": 72.5, "temperature": 28.3},
  "action_type": "inspect",
  "reward": 0.9,
  "terrain_type": "indoor",
  "tags": ["crack", "concrete", "structural"]
}
```
</details> <details> <summary>Zenoh spatial recall example (robot querying nearby memories)</summary>
```json
{
  "user_id": "spot-1",
  "query": "structural damage near entrance",
  "mode": "spatial",
  "lat": 37.7749,
  "lon": -122.4194,
  "radius_meters": 50.0,
  "mission_id": "building_inspection_2026"
}
```
</details> <details> <summary>Environment variables</summary>
```bash
SHODH_ZENOH_ENABLED=true                # Enable Zenoh transport
SHODH_ZENOH_MODE=peer                   # peer | client | router
SHODH_ZENOH_LISTEN=tcp/0.0.0.0:7447    # Listen endpoints
SHODH_ZENOH_CONNECT=tcp/1.2.3.4:7447   # Connect endpoints
SHODH_ZENOH_PREFIX=shodh               # Key expression prefix

# Auto-subscribe to ROS2 topics (via zenoh-bridge-ros2dds)
SHODH_ZENOH_AUTO_TOPICS='[
  {"key_expr": "rt/spot1/status", "user_id": "spot-1", "mode": "sensor"},
  {"key_expr": "rt/nav/events", "user_id": "spot-1", "mode": "event"}
]'
```
</details>

Works with ROS2 Kilted (rmw_zenoh), PX4 drones, Boston Dynamics Spot, humanoids — anything that speaks Zenoh or ROS2 DDS.

Platform Support

Linux x86_64 · Linux ARM64 · macOS Apple Silicon · macOS Intel · Windows x86_64

Production Deployment

<details> <summary>Environment variables</summary>
```bash
SHODH_ENV=production              # Production mode
SHODH_API_KEYS=key1,key2,key3     # Comma-separated API keys
SHODH_HOST=127.0.0.1              # Bind address (default: localhost)
SHODH_PORT=3030                   # Port (default: 3030)
SHODH_MEMORY_PATH=/var/lib/shodh  # Data directory
SHODH_REQUEST_TIMEOUT=60          # Request timeout in seconds
SHODH_MAX_CONCURRENT=200          # Max concurrent requests
SHODH_CORS_ORIGINS=https://app.example.com
```
</details> <details> <summary>Docker Compose with TLS</summary>
```yaml
services:
  shodh-memory:
    image: varunshodh/shodh-memory:latest
    environment:
      - SHODH_ENV=production
      - SHODH_HOST=0.0.0.0
      - SHODH_API_KEYS=${SHODH_API_KEYS}
    volumes:
      - shodh-data:/data
    networks:
      - internal

  caddy:
    image: caddy:latest
    ports:
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
    networks:
      - internal

volumes:
  shodh-data:

networks:
  internal:
```
</details> <details> <summary>Reverse proxy (Nginx / Caddy)</summary>

The server binds to 127.0.0.1 by default. For network deployments, place behind a reverse proxy:

```caddyfile
memory.example.com {
    reverse_proxy localhost:3030
}
```
</details>

Community

| Project | Description | Author |
|---|---|---|
| SHODH on Cloudflare | Edge-native implementation on Cloudflare Workers | @doobidoo |

References

[1] Cowan, N. (2010). The Magical Mystery Four. *Current Directions in Psychological Science*.
[2] Magee, J. C. & Grienberger, C. (2020). Synaptic Plasticity Forms and Functions. *Annual Review of Neuroscience*.
[3] Subramanya, S. et al. (2019). DiskANN. NeurIPS 2019.

License

Apache 2.0


<p align="center"> <a href="https://registry.modelcontextprotocol.io/v0/servers?search=shodh">MCP Registry</a> · <a href="https://hub.docker.com/r/varunshodh/shodh-memory">Docker Hub</a> · <a href="https://pypi.org/project/shodh-memory/">PyPI</a> · <a href="https://www.npmjs.com/package/@shodh/memory-mcp">npm</a> · <a href="https://crates.io/crates/shodh-memory">crates.io</a> · <a href="https://www.shodh-memory.com">Docs</a> </p>
