DeepSeek MCP Server
Platform & Services · by dmontgomery40
The official DeepSeek MCP server, providing access to chat, completion, model listing, and balance APIs.
To quickly wire up DeepSeek's chat, completion, and model management capabilities, the official MCP server is the most painless option; it even covers balance queries.
What is DeepSeek MCP Server?
The official DeepSeek MCP server, providing access to chat, completion, model listing, and balance APIs.
README
DeepSeek MCP Server
<p align="center"> <a href="https://github.com/deepseek-ai/awesome-deepseek-integration"><img alt="DeepSeek Official List" src="https://img.shields.io/badge/DeepSeek%20Official%20List-Linked-0A66FF?logo=github&logoColor=white" /></a> <a href="https://registry.modelcontextprotocol.io/v0.1/servers?search=io.github.DMontgomery40/deepseek"><img alt="Official MCP Registry" src="https://img.shields.io/badge/MCP%20Registry-Official%20Active-0A66FF" /></a> <a href="https://www.npmjs.com/package/deepseek-mcp-server"><img alt="npm version" src="https://img.shields.io/npm/v/deepseek-mcp-server?logo=npm" /></a> <a href="https://www.npmjs.com/package/deepseek-mcp-server"><img alt="npm downloads" src="https://img.shields.io/npm/dm/deepseek-mcp-server?logo=npm" /></a> <a href="https://github.com/DMontgomery40/deepseek-mcp-server/blob/main/server.json"><img alt="OCI package" src="https://img.shields.io/badge/OCI-docker.io%2Fdmontgomery40%2Fdeepseek--mcp--server%3A0.4.0-2496ED?logo=docker&logoColor=white" /></a> <a href="https://github.com/DMontgomery40/deepseek-mcp-server"><img alt="GitHub stars" src="https://img.shields.io/github/stars/DMontgomery40/deepseek-mcp-server?logo=github" /></a> <a href="https://glama.ai/mcp/servers/asht4rqltn"><img alt="Glama MCP Listing" src="https://img.shields.io/badge/Glama-MCP%20Listing-7B61FF" /></a> <a href="https://spark.entire.vc/assets/vb-deepseek-mcp-server?utm_source=github&utm_medium=readme"><img alt="Listed on Spark" src="https://spark.entire.vc/badges/listed.svg" /></a> <a href="https://spark.entire.vc/assets/vb-deepseek-mcp-server?utm_source=github&utm_medium=readme"><img alt="Install via Spark" src="https://spark.entire.vc/badges/vb-deepseek-mcp-server/install.svg" /></a> </p>
Official DeepSeek MCP server for chat/completions/models/balance.
Why V4 is a big deal (plain-language explainer).
- Hosted remote endpoint: `https://deepseek-mcp.ragweld.com/mcp`
- Auth: `Authorization: Bearer <token>`
- Local package and Docker are also supported.
Quick Install (Copy/Paste)
1) Set your hosted token once
export DEEPSEEK_MCP_AUTH_TOKEN="REPLACE_WITH_TOKEN"
2) Codex CLI (remote MCP)
codex mcp add deepseek --url https://deepseek-mcp.ragweld.com/mcp --bearer-token-env-var DEEPSEEK_MCP_AUTH_TOKEN
3) Claude Code (remote MCP)
claude mcp add --transport http deepseek https://deepseek-mcp.ragweld.com/mcp --header "Authorization: Bearer $DEEPSEEK_MCP_AUTH_TOKEN"
4) Cursor (remote MCP)
node -e 'const fs=require("fs"),p=process.env.HOME+"/.cursor/mcp.json";let j={mcpServers:{}};try{j=JSON.parse(fs.readFileSync(p,"utf8"))}catch{};j.mcpServers={...(j.mcpServers||{}),deepseek:{url:"https://deepseek-mcp.ragweld.com/mcp",headers:{Authorization:"Bearer ${env:DEEPSEEK_MCP_AUTH_TOKEN}"}}};fs.mkdirSync(process.env.HOME+"/.cursor",{recursive:true});fs.writeFileSync(p,JSON.stringify(j,null,2));'
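If you prefer to edit the file by hand, the one-liner above produces a `~/.cursor/mcp.json` shaped like this (the `${env:...}` placeholder is expanded by Cursor, not by your shell):

```json
{
  "mcpServers": {
    "deepseek": {
      "url": "https://deepseek-mcp.ragweld.com/mcp",
      "headers": {
        "Authorization": "Bearer ${env:DEEPSEEK_MCP_AUTH_TOKEN}"
      }
    }
  }
}
```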
5) Local install (stdio, if you prefer self-hosted)
DEEPSEEK_API_KEY="REPLACE_WITH_DEEPSEEK_KEY" npx -y deepseek-mcp-server
6) Local install with Docker (stdio, self-hosted)
docker pull docker.io/dmontgomery40/deepseek-mcp-server:0.4.0 && \
docker run --rm -i -e DEEPSEEK_API_KEY="REPLACE_WITH_DEEPSEEK_KEY" docker.io/dmontgomery40/deepseek-mcp-server:0.4.0
Non-Technical Users
If you mostly use chat apps and don’t want terminal setup:
- Use Cursor’s MCP settings UI and add:
  - URL: `https://deepseek-mcp.ragweld.com/mcp`
  - Header: `Authorization: Bearer <token>`
- If your app does not support custom remote MCP servers with bearer headers yet, use Codex/Claude Code/Cursor as your MCP-enabled client and keep your usual model provider.
OpenRouter users (API + chat UI)
OpenRouter now documents MCP usage, but its MCP flow is SDK/client-centric (not “paste URL in chat and done” for most users). The easiest path is to keep OpenRouter for models and connect this MCP server through an MCP-capable client (Codex/Claude Code/Cursor).
Remote vs Local (Which Should I Use?)
Remote server
Use remote if you want the fastest setup and centralized updates.
- Pros: no local server process, easy multi-device use, one shared endpoint.
- Cons: depends on network + hosted token.
Local server
Use local if you want full runtime control.
- Pros: fully self-managed, easy private-network workflows.
- Cons: you manage updates/secrets/process lifecycle.
Code Execution with MCP (What This Actually Means)
In basic tool-calling mode, the model usually needs:
- many tool definitions loaded into context before it starts;
- one model round-trip per tool call;
- intermediate results repeatedly fed back into context.
That works for small toolsets, but it scales poorly. You burn tokens on tool metadata, add latency from repeated inference hops, and raise failure risk when tools are similarly named or require multi-step orchestration.
Code execution changes the control flow. Instead of repeatedly asking the model to call one tool at a time, the model can write a small program that calls tools directly in an execution runtime. That runtime handles loops, branching, filtering, joins, retries, and result shaping. The model then gets a compact summary instead of every raw intermediate payload.
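As a toy illustration of that control flow (not this server's actual API: `call_tool` is a hypothetical stand-in for whatever tool-invocation hook an MCP runtime exposes, faked here with canned data), the model emits one small program, the runtime does the looping and filtering, and only a compact summary comes back:

```python
def call_tool(name: str, **args):
    """Hypothetical tool-invocation hook; real runtimes would dispatch
    to MCP tools. Faked here with canned results for illustration."""
    fake = {
        "list_models": {"models": ["deepseek-chat", "deepseek-reasoner"]},
        "get_balance": {"balance": 12.5, "currency": "USD"},
    }
    return fake[name]

def orchestrate() -> dict:
    # One generated "program" replaces several model round-trips:
    # loop, filter, and join results inside the runtime, then hand
    # the model a compact summary instead of every raw payload.
    models = call_tool("list_models")["models"]
    chat_models = [m for m in models if "chat" in m]
    balance = call_tool("get_balance")
    return {
        "chat_models": chat_models,
        "low_balance": balance["balance"] < 1.0,
    }

print(orchestrate())
```

The point of the sketch is the shape, not the tools: the intermediate `models` list never re-enters the prompt history; only the final dict does.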
Why this matters in practice:
- lower context pressure: you avoid dumping full tool catalogs and every raw result into prompt history;
- better orchestration: code handles deterministic logic that is awkward in pure natural-language loops;
- lower latency at scale: fewer model turns for multi-step workflows;
- usually better reliability: less chance of drifting tool choice across long chains.
Limits to keep in mind:
- code execution does not remove the need for good tool schemas and permissions;
- this is still an agent system, so guardrails/quotas/auditing matter;
- for tiny single-tool tasks, plain tool calling can still be simpler.
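One minimal guardrail from the list above, sketched as a hypothetical call-budget wrapper (nothing this server ships; in real deployments quotas live in the client or gateway), caps how many tool calls a single generated program may make:

```python
class ToolBudgetExceeded(RuntimeError):
    """Raised when a generated program exceeds its tool-call quota."""

def with_budget(call_tool, max_calls: int):
    """Wrap a tool-invocation hook so each program gets a hard quota."""
    state = {"calls": 0}

    def guarded(name: str, **args):
        state["calls"] += 1
        if state["calls"] > max_calls:
            raise ToolBudgetExceeded(
                f"{name}: exceeded budget of {max_calls} tool calls"
            )
        return call_tool(name, **args)

    return guarded

# Usage with a dummy tool backend:
guarded = with_budget(lambda name, **a: {"tool": name}, max_calls=2)
print(guarded("list_models"))    # within budget
print(guarded("get_balance"))    # within budget
try:
    guarded("chat_completion")   # third call trips the quota
except ToolBudgetExceeded as exc:
    print("blocked:", exc)
```

The same wrapper shape works for rate limits or audit logging: intercept the hook once, and every program the model writes inherits the policy.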
For this DeepSeek MCP server, the practical takeaway is: keep tool interfaces explicit and stable, then let MCP clients choose direct tool-calling or code-execution orchestration based on workload size and complexity.
Learn More (Curated)
- Anthropic Engineering: “Code execution with MCP: Building more efficient agents”. Why it matters: the clearest explanation of why direct tool-calling becomes expensive at scale, and how code execution reduces token overhead and orchestration friction.
- Anthropic Engineering: “Introducing advanced tool use on the Claude Developer Platform”. Why it matters: practical architecture for large tool ecosystems: Tool Search Tool, Programmatic Tool Calling, and Tool Use Examples.
- Cloudflare (Matt Carey, Feb 2026): “Code Mode: give agents an entire API in 1,000 tokens”. Why it matters: concrete implementation patterns for model-controlled tool discovery and token-efficient execution loops.
- Anthropic Help (updated 2026): “Getting started with custom connectors using remote MCP”. Why it matters: a clean product-level explanation of what remote MCP is and when to use it.
- Cursor docs: “Model Context Protocol (MCP)”. Why it matters: the current `mcp.json` setup model for Cursor.
- OpenRouter docs: “Using MCP Servers with OpenRouter”. Why it matters: the current integration path for OpenRouter-centric workflows.
Registry Identity
- MCP Registry name: `io.github.DMontgomery40/deepseek`
License
MIT
FAQ
What is DeepSeek MCP Server?
The official DeepSeek MCP server, providing access to chat, completion, model listing, and balance APIs.
Related Skills
MCP Builder
by anthropics
Focused on high-quality MCP server development, covering protocol research, tool design, error handling, and transport selection; suited to wrapping external APIs and service capabilities with FastMCP or the MCP SDK.
✎ If you want an LLM to call external APIs reliably, use MCP Builder: it has mature guidance from Python to Node and helps you ship high-quality MCP servers faster.
Slack GIFs
by anthropics
A GIF-making skill for Slack, with built-in size, frame-rate, and color constraints for emoji and message GIFs, plus validation and optimization workflows; good for quickly turning ideas or uploaded images into ready-to-send Slack animations.
✎ Helps you quickly produce Slack-ready GIFs, with built-in constraint rules and validation tools that spare you upload and playback pitfalls; handy for emoji packs and demos alike.
API Design Review
by alirezarezvani
Reviews whether a REST API design follows industry conventions, automatically checking naming, HTTP methods, status codes, and documentation coverage; flags breaking changes and produces a design score. Useful for reviewing API proposals and gatekeeping before version iterations.
✎ When drafting APIs and architecture plans, it helps you catch interface design problems early and align with best practices; its review perspective is systematic and makes team collaboration smoother.
Related MCP Servers
Slack Messaging
Editor's Pick · by Anthropic
Slack is an MCP server that lets AI assistants read and write your Slack channels and messages directly.
✎ This server addresses the pain point of teams needing AI to pull Slack information in real time, and is especially suited to development teams having Claude summarize channel discussions or send notifications. Note, however, that it is currently only a reference implementation with limited documentation and is not recommended for direct production use; it is better suited to developers learning how MCP integrates third-party services.
by netdata
io.github.netdata/mcp-server is an MCP server that lets AI assistants monitor server metrics and logs in real time.
✎ This tool addresses the pain point of ops engineers manually checking system status, and is best suited to DevOps teams having Claude analyze performance data automatically. Note that it depends on an existing NetData deployment; if you have never used that monitoring platform, you will need to spend time configuring it first.
by d4vinci
Scrapling MCP Server is an intelligent scraping tool built for modern web pages, with support for bypassing anti-bot mechanisms such as Cloudflare.
✎ This tool solves the headaches of scraping dynamic pages and anti-bot sites, and is especially suited to developers who need to bulk-collect e-commerce prices or news data. Note that it relies on an external browser engine and is resource-heavy, so it is not a fit for lightweight tasks.