Cortex MCP — Multi-Level Reasoning Server

by j0hanz

Cortex MCP


Multi-level reasoning MCP server with configurable depth levels, session-based state management, structured thought input, and real-time trace resources.

Overview

Cortex MCP is a stdio-only MCP server for stateful, depth-controlled reasoning. The runtime entrypoint in src/index.ts connects createServer() to StdioServerTransport, and the server surface in src/server.ts enables tools, prompts, completions, logging, and subscribable resources around a single session-based reasoning engine.

The live MCP surface, as confirmed with the MCP Inspector, comprises 1 tool, 6 concrete resources, 4 resource templates, and 7 prompts. Sessions are stored in memory, exposed as MCP resources, and cleared on process restart.

Key Features

  • reasoning_think supports step-by-step sessions, run_to_completion batches, rollback, early conclusion, and structured observation / hypothesis / evaluation input.
  • Four depth levels are built into the engine: basic, normal, high, and expert, each with bounded thought ranges and token budgets.
  • Prompt helpers expose reasoning.basic, reasoning.normal, reasoning.high, reasoning.expert, reasoning.continue, reasoning.retry, and get-help.
  • Resource endpoints expose internal docs plus live session lists, per-session JSON views, full markdown traces, and individual thought documents.
  • Completions are wired for levels, session IDs, and thought names through completable() and resource-template completion hooks.
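The four depth levels and their bounds can be sketched as a small lookup table plus a validator. This is an illustrative TypeScript model built from the numbers in this README, not the engine's actual src/engine/config.ts; the type and field names here are assumptions.

```typescript
// Illustrative model of the four depth levels. Step ranges and token
// budgets are taken from this README; the shapes are assumptions.
type Level = "basic" | "normal" | "high" | "expert";

interface LevelConfig {
  minThoughts: number;
  maxThoughts: number;
  tokenBudget: number;
}

const LEVELS: Record<Level, LevelConfig> = {
  basic:  { minThoughts: 1,  maxThoughts: 3,  tokenBudget: 2_000 },
  normal: { minThoughts: 4,  maxThoughts: 8,  tokenBudget: 8_000 },
  high:   { minThoughts: 10, maxThoughts: 15, tokenBudget: 32_000 },
  expert: { minThoughts: 20, maxThoughts: 25, tokenBudget: 128_000 },
};

// targetThoughts must fit the chosen level's bounded range.
function isValidTarget(level: Level, targetThoughts: number): boolean {
  const { minThoughts, maxThoughts } = LEVELS[level];
  return targetThoughts >= minThoughts && targetThoughts <= maxThoughts;
}
```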

Requirements

  • Node.js >=24 for local npx or npm usage.
  • An MCP client that supports stdio transport.
  • Optional: Docker if you want to build or run the container image defined by Dockerfile.

Quick Start

Use this standard MCP client configuration:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

Client Configuration

<details> <summary><b>Install in VS Code</b></summary>


Add to .vscode/mcp.json:

json
{
  "servers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

Or install via CLI:

sh
code --add-mcp '{"name":"cortex-mcp","command":"npx","args":["-y","@j0hanz/cortex-mcp@latest"]}'

For more info, see VS Code MCP docs.

</details> <details> <summary><b>Install in VS Code Insiders</b></summary>


Add to .vscode/mcp.json:

json
{
  "servers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

Or install via CLI:

sh
code-insiders --add-mcp '{"name":"cortex-mcp","command":"npx","args":["-y","@j0hanz/cortex-mcp@latest"]}'

For more info, see VS Code Insiders MCP docs.

</details> <details> <summary><b>Install in Cursor</b></summary>


Add to ~/.cursor/mcp.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Cursor MCP docs.

</details> <details> <summary><b>Install in Visual Studio</b></summary>


Add to mcp.json (VS integrated):

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Visual Studio MCP docs.

</details> <details> <summary><b>Install in Goose</b></summary>


Add to Goose extension registry:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Goose MCP docs.

</details> <details> <summary><b>Install in LM Studio</b></summary>


Add to LM Studio MCP config:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see LM Studio MCP docs.

</details> <details> <summary><b>Install in Claude Desktop</b></summary>

Add to claude_desktop_config.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Claude Desktop MCP docs.

</details> <details> <summary><b>Install in Claude Code</b></summary>

Add to Claude Code CLI:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

Or install via CLI:

sh
claude mcp add cortex-mcp -- npx -y @j0hanz/cortex-mcp@latest

For more info, see Claude Code MCP docs.

</details> <details> <summary><b>Install in Windsurf</b></summary>

Add to ~/.codeium/windsurf/mcp_config.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Windsurf MCP docs.

</details> <details> <summary><b>Install in Amp</b></summary>

Add to Amp MCP config:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

Or install via CLI:

sh
amp mcp add cortex-mcp -- npx -y @j0hanz/cortex-mcp@latest

For more info, see Amp MCP docs.

</details> <details> <summary><b>Install in Cline</b></summary>

Add to cline_mcp_settings.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Cline MCP docs.

</details> <details> <summary><b>Install in Codex CLI</b></summary>

Add to ~/.codex/config.yaml or codex CLI:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Codex CLI MCP docs.

</details> <details> <summary><b>Install in GitHub Copilot</b></summary>

Add to .vscode/mcp.json:

json
{
  "servers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see GitHub Copilot MCP docs.

</details> <details> <summary><b>Install in Warp</b></summary>

Add to Warp MCP config:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Warp MCP docs.

</details> <details> <summary><b>Install in Kiro</b></summary>

Add to .kiro/settings/mcp.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Kiro MCP docs.

</details> <details> <summary><b>Install in Gemini CLI</b></summary>

Add to ~/.gemini/settings.json:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Gemini CLI MCP docs.

</details> <details> <summary><b>Install in Zed</b></summary>

Add to ~/.config/zed/settings.json:

json
{
  "context_servers": {
    "cortex-mcp": {
      "settings": {
        "command": "npx",
        "args": ["-y", "@j0hanz/cortex-mcp@latest"]
      }
    }
  }
}

For more info, see Zed MCP docs.

</details> <details> <summary><b>Install in Augment</b></summary>

Add to VS Code settings.json:

Add to your VS Code settings.json under augment.advanced.

json
{
  "augment.advanced": {
    "mcpServers": [
      {
        "id": "cortex-mcp",
        "command": "npx",
        "args": ["-y", "@j0hanz/cortex-mcp@latest"]
      }
    ]
  }
}

For more info, see Augment MCP docs.

</details> <details> <summary><b>Install in Roo Code</b></summary>

Add to Roo Code MCP settings:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Roo Code MCP docs.

</details> <details> <summary><b>Install in Kilo Code</b></summary>

Add to Kilo Code MCP settings:

json
{
  "mcpServers": {
    "cortex-mcp": {
      "command": "npx",
      "args": ["-y", "@j0hanz/cortex-mcp@latest"]
    }
  }
}

For more info, see Kilo Code MCP docs.

</details>

Use Cases

Start bounded reasoning at the right depth

Use reasoning.basic, reasoning.normal, reasoning.high, or reasoning.expert when the client wants a prompt-first entrypoint, or call reasoning_think directly with query, level, and the first thought. Each response returns the current session state plus a summary string that tells the client how to continue.

Relevant tool: reasoning_think
Related prompts: reasoning.basic, reasoning.normal, reasoning.high, reasoning.expert
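A first step-mode call might pass arguments like the following. The field names follow the reasoning_think parameter list in this README; the query text and values are placeholders, and the surrounding tools/call plumbing is whatever MCP client SDK you use.

```typescript
// Hypothetical arguments for an initial reasoning_think call
// opening a new session (values are placeholders).
const newSessionArgs = {
  query: "Why does the cache hit rate drop after deploys?",
  level: "normal",     // required for new sessions
  targetThoughts: 5,   // optional; must fit normal's 4-8 range
  runMode: "step",     // default: server returns after each thought
  thought: "Step 1: list what changes during a deploy.",
  stepSummary: "Enumerate deploy-time changes.",
};
```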

Continue, retry, or batch an active session

Reuse sessionId to continue a prior trace, switch to runMode="run_to_completion" when you already have the remaining thought inputs, or use the continuation and retry prompts to generate the next call payload. The handler also supports rollbackToStep and isConclusion for revising or ending a trace early.

Relevant tool: reasoning_think
Related prompts: reasoning.continue, reasoning.retry
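The continuation and batch shapes described above might look like this. Field names follow the reasoning_think parameter list in this README; the session ID and thought texts are placeholders.

```typescript
// Hypothetical follow-up calls against an existing session. The
// sessionId comes from a previous reasoning_think response.
const continueArgs = {
  sessionId: "abc123", // placeholder ID
  thought: "Step 2: compare cache keys before and after the deploy.",
};

const batchArgs = {
  sessionId: "abc123",
  runMode: "run_to_completion",
  thought: [ // string[] in batch mode, one entry per remaining step
    "Step 3: isolate the key-prefix change.",
    "Step 4: confirm against access logs.",
  ],
  isConclusion: true, // end the session at the final answer
};
```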

Inspect live traces without re-running the tool

Read reasoning://sessions for the active session list, reasoning://sessions/{sessionId} for the JSON detail view, reasoning://sessions/{sessionId}/trace for the markdown transcript, or reasoning://sessions/{sessionId}/thoughts/{thoughtName} for a single thought. This lets a client present progress or audit a session independently from the next tool call.

Relevant resources: reasoning://sessions, reasoning://sessions/{sessionId}, reasoning://sessions/{sessionId}/trace, reasoning://sessions/{sessionId}/thoughts/{thoughtName}
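A client that renders trace views could derive these URIs with a small helper. The URI shapes come from this README; the helper itself is illustrative, and URI-encoding the IDs is a defensive assumption rather than a documented requirement.

```typescript
// Builds the reasoning:// resource URIs listed above (illustrative).
function sessionUri(
  sessionId: string,
  part?: "trace" | { thought: string },
): string {
  const base = `reasoning://sessions/${encodeURIComponent(sessionId)}`;
  if (part === undefined) return base; // session detail JSON
  if (part === "trace") return `${base}/trace`; // markdown transcript
  return `${base}/thoughts/${encodeURIComponent(part.thought)}`;
}
```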

Architecture

text
[MCP Client]
    |
    | stdio
    v
[src/index.ts]
    createServer()
    -> new StdioServerTransport()
    -> server.connect(transport)
    |
    v
[src/server.ts]
    McpServer("cortex-mcp")
    capabilities:
      - tools
      - prompts
      - completions
      - logging
      - resources { subscribe: true, listChanged: true }
    |
    +--> tools/call
    |     -> reasoning_think
    |     -> src/tools/reasoning-think.ts
    |     -> ReasoningThinkInputSchema / ReasoningThinkToolOutputSchema
    |     -> src/engine/reasoner.ts
    |     -> SessionStore
    |
    +--> prompts/get
    |     -> src/prompts/index.ts
    |
    +--> resources/read
    |     -> src/resources/index.ts
    |     -> internal://* and reasoning://sessions/*
    |
    +--> notifications
          -> logging messages
          -> resources/list_changed
          -> resources/updated
          -> notifications/progress

Request Lifecycle

text
[Client] -- initialize --> [Server]
[Server] -- serverInfo + capabilities --> [Client]
[Client] -- notifications/initialized --> [Server]
[Client] -- tools/call {name: "reasoning_think", arguments} --> [Handler]
[Handler] -- validate args --> [Reasoner + SessionStore]
[Reasoner] -- progress/resource events --> [Server notifications]
[Handler] -- structuredContent + optional trace resource --> [Client]

MCP Surface

Tools

reasoning_think

Stateful reasoning tool for creating and continuing multi-step sessions. It supports one-step interactive calls, run_to_completion batches, structured observation/hypothesis/evaluation input, rollback, and early conclusion while returning structured session state.

| Parameter | Type | Required | Description |
| --- | --- | --- | --- |
| query | string | no | Question or problem to analyze. |
| level | string | no | Depth level. Required for new sessions. basic (1–3 steps, 2K budget), normal (4–8 steps, 8K budget), high (10–15 steps, 32K budget), expert (20–25 steps, 128K budget). |
| targetThoughts | integer | no | Exact step count. Must fit level range. |
| sessionId | string | no | Session ID to continue. |
| runMode | string | no | "step" (default) or "run_to_completion". |
| thought | any | no | Reasoning text. Stored verbatim. String for step mode, string[] for batch. |
| isConclusion | boolean | no | End session early at final answer. |
| rollbackToStep | integer | no | 0-based index to rollback to. Discards later thoughts. |
| stepSummary | string | no | One-sentence step summary. |
| observation | string | no | Known facts at this step. |
| hypothesis | string | no | Proposed next idea. |
| evaluation | string | no | Critique of hypothesis. |
<details> <summary>Data Flow</summary>
text
1. [Client] -- tools/call {name: "reasoning_think", arguments} --> [Server]
   Transport: stdio
2. [Server] -- dispatch("reasoning_think") --> [Handler: src/tools/reasoning-think.ts]
3. [Handler] -- validate(ReasoningThinkInputSchema) --> [src/engine/reasoner.ts]
4. [Reasoner] -- create/update session --> [src/engine/session-store.ts]
5. [Handler] -- structuredContent + optional embedded trace resource --> [Client]
</details>
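The rollbackToStep semantics ("0-based index to rollback to; discards later thoughts") can be sketched as a pure function. Keeping the indexed step itself is my reading of "rollback to"; the authoritative behavior lives in src/engine/reasoner.ts.

```typescript
// Sketch of rollback semantics: keep steps 0..step, drop the rest.
// Illustrative only; not the engine's actual implementation.
function rollbackToStep(thoughts: string[], step: number): string[] {
  if (step < 0 || step >= thoughts.length) {
    throw new RangeError(`step ${step} out of range`);
  }
  return thoughts.slice(0, step + 1);
}
```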

Resources

| Resource | URI or Template | MIME Type | Description |
| --- | --- | --- | --- |
| server-instructions | internal://instructions | text/markdown | Usage instructions for the MCP server. |
| server-config | internal://server-config | application/json | Runtime limits and level configurations for the reasoning server. |
| tool-catalog | internal://tool-catalog | text/markdown | Tool reference: models, params, outputs, data flow. |
| tool-info | internal://tool-info/{toolName} | text/markdown | Per-tool contract details. |
| tool-info-reasoning_think | internal://tool-info/reasoning_think | text/markdown | Contract details for reasoning_think. |
| workflows | internal://workflows | text/markdown | Recommended workflows and tool sequences. |
| reasoning.sessions | reasoning://sessions | application/json | List of active reasoning sessions with summaries. Updated in real time as sessions progress. |
| reasoning.session | reasoning://sessions/{sessionId} | application/json | Detailed view of a single reasoning session, including all thoughts and metadata. |
| reasoning.trace | reasoning://sessions/{sessionId}/trace | text/markdown | Markdown trace of a reasoning session (full content). |
| reasoning.thought | reasoning://sessions/{sessionId}/thoughts/{thoughtName} | text/markdown | Markdown content of a single thought (for example Thought-1). |

Prompts

| Prompt | Arguments | Description |
| --- | --- | --- |
| get-help | none | Return server usage instructions. |
| reasoning.basic | query required, targetThoughts optional | Basic-depth reasoning (1-3 thoughts). |
| reasoning.normal | query required, targetThoughts optional | Normal-depth reasoning (4-8 thoughts). |
| reasoning.high | query required, targetThoughts optional | High-depth reasoning (10-15 thoughts). |
| reasoning.expert | query required, targetThoughts optional | Expert-depth reasoning (20-25 thoughts). |
| reasoning.continue | sessionId required, query optional, level optional | Continue an existing session. Optional follow-up query. |
| reasoning.retry | query required, level required, targetThoughts optional | Retry a failed reasoning task with modified parameters. |

MCP Capabilities

| Capability | Status | Evidence |
| --- | --- | --- |
| tools | confirmed | src/server.ts:203-205, src/tools/reasoning-think.ts:479 |
| prompts | confirmed | src/server.ts:205, src/prompts/index.ts:201 |
| completions | confirmed | src/server.ts:207, src/prompts/index.ts:249, src/resources/index.ts:375 |
| logging | confirmed | src/server.ts:204, src/server.ts:98 |
| resources.subscribe | confirmed | src/server.ts:208, src/server.ts:121 |
| resources.listChanged | confirmed | src/server.ts:208, src/server.ts:114 |
| progress notifications | confirmed | src/lib/mcp.ts:71, src/tools/reasoning-think.ts:424 |

Tool Annotations

| Annotation | Value | Evidence |
| --- | --- | --- |
| readOnlyHint | false | src/tools/reasoning-think.ts:500 |
| destructiveHint | false | src/tools/reasoning-think.ts:502 |
| openWorldHint | false | src/tools/reasoning-think.ts:503 |
| idempotentHint | false | src/tools/reasoning-think.ts:501 |

Structured Output

  • reasoning_think declares outputSchema and returns structuredContent, with an embedded trace resource when the trace is small enough. Evidence: src/tools/reasoning-think.ts:498, src/lib/mcp.ts:97-114.

Configuration

| Variable | Default | Required | Evidence |
| --- | --- | --- | --- |
| CORTEX_SESSION_TTL_MS | 1800000 (30 minutes) | no | src/engine/reasoner.ts:22, src/engine/session-store.ts:19 |
| CORTEX_MAX_SESSIONS | 100 | no | src/engine/reasoner.ts:23, src/engine/session-store.ts:20 |
| CORTEX_MAX_TOTAL_TOKENS | 2000000 | no | src/engine/reasoner.ts:24, src/engine/session-store.ts:21 |
| CORTEX_MAX_ACTIVE_REASONING_TASKS | 32 | no | src/engine/config.ts:41-44 |
| CORTEX_REDACT_TRACE_CONTENT | false | no | src/engine/config.ts:21 |

> [!NOTE]
> The source does not define any HTTP host/port configuration. The only other environment-related signal is NODE_ENV=production in the Docker image and --env-file=.env in the local dev:run script.
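Reading these variables with their documented defaults can be sketched like this. The defaults come from the table above; the real parsing lives in src/engine/config.ts and src/engine/session-store.ts and may differ in details such as validation.

```typescript
// Illustrative environment reader with the README's documented defaults.
function envInt(name: string, fallback: number): number {
  const raw = process.env[name];
  const parsed = raw === undefined ? NaN : Number.parseInt(raw, 10);
  return Number.isFinite(parsed) ? parsed : fallback;
}

const config = {
  sessionTtlMs: envInt("CORTEX_SESSION_TTL_MS", 1_800_000), // 30 minutes
  maxSessions: envInt("CORTEX_MAX_SESSIONS", 100),
  maxTotalTokens: envInt("CORTEX_MAX_TOTAL_TOKENS", 2_000_000),
  maxActiveTasks: envInt("CORTEX_MAX_ACTIVE_REASONING_TASKS", 32),
  redactTrace: process.env.CORTEX_REDACT_TRACE_CONTENT === "true",
};
```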

Security

| Control | Status | Evidence |
| --- | --- | --- |
| input validation | confirmed | src/schemas/inputs.ts:13, src/schemas/outputs.ts:46, src/prompts/index.ts:207 |
| stdout-safe logging fallback | confirmed | src/server.ts:98, src/server.ts:145 |
| main-thread-only runtime | confirmed | src/index.ts:15-25 |
| non-root container user | confirmed | Dockerfile:37 |

> [!NOTE]
> No auth, OAuth, HTTP origin checks, or rate-limiting controls are implemented in the current source, because the server only exposes stdio transport.

Development

| Script | Command | Purpose |
| --- | --- | --- |
| dev | tsc --watch --preserveWatchOutput | Watch and compile source during development. |
| dev:run | node --env-file=.env --watch dist/index.js | Run the built server in watch mode with an optional local .env file. |
| build | node scripts/tasks.mjs build | Clean dist, compile TypeScript, copy assets, and make the entrypoint executable. |
| lint | eslint . | Run ESLint across the repository. |
| type-check | node scripts/tasks.mjs type-check | Run source and test TypeScript checks concurrently. |
| test | node scripts/tasks.mjs test | Run the TypeScript test suites with the configured loader. |
| test:dist | node scripts/tasks.mjs test:dist | Rebuild first, then run tests against the built output. |
| test:fast | node --test --import tsx/esm src/__tests__/**/*.test.ts node-tests/**/*.test.ts | Run the fast direct test command without the task wrapper. |
| format | prettier --write . | Format the repository. |
| inspector | npm run build && npx -y @modelcontextprotocol/inspector node dist/index.js | Build the server and open it in the MCP Inspector. |
| prepublishOnly | npm run lint && npm run type-check && npm run build | Enforce release checks before publishing. |

Additional helper scripts for diagnostics, coverage, asset copying, and knip are defined in package.json.

Build and Release

  • .github/workflows/release.yml bumps package.json and server.json, then runs npm run lint, npm run type-check, npm run test, and npm run build before tagging and creating a GitHub release.
  • The same workflow publishes the package to npm with Trusted Publishing, publishes to the MCP Registry with mcp-publisher, and pushes a multi-arch Docker image to ghcr.io.
  • Dockerfile uses a multi-stage Node 24 Alpine build, prunes dev dependencies, and runs the released container as the mcp user.

Troubleshooting

  • Sessions are in memory and expire after 30 minutes by default. If you receive E_SESSION_NOT_FOUND, start a new session or increase CORTEX_SESSION_TTL_MS.
  • runMode="run_to_completion" requires enough thought entries to cover the remaining steps. If you want the server to return after each step, keep the default step mode.
  • For stdio transport, do not add custom stdout logging around the server process. This server routes logs through MCP logging and falls back to stderr on failures.

Credits

| Dependency | Registry |
| --- | --- |
| @modelcontextprotocol/sdk | npm |
| zod | npm |

Contributing and License

  • License: MIT
  • Contributions are welcome via pull requests.
