io.github.qso-graph/qsp-mcp
AI & Agents by qso-graph
QSP: relay MCP tools to any OpenAI-compatible local LLM, such as llama.cpp, Ollama, or vLLM.
README
qsp-mcp
QSP — relay MCP tools to any OpenAI-compatible local LLM endpoint.
Named after the Q-signal QSP ("Will you relay?"), qsp-mcp relays tool calls between a local LLM and MCP servers. Any model with function-calling capability gains access to the full qso-graph tool ecosystem — 80 tools across 14 packages — from local weights, not from the cloud.
Install
pip install qsp-mcp
Quick Start
# Interactive mode
qsp-mcp --config ~/.config/qsp-mcp/config.json
# Single query
qsp-mcp --query "What bands are open from DN13 to JN48 right now?"
# Direct endpoint (no config file needed if no MCP servers configured)
qsp-mcp --endpoint http://localhost:8000/v1/chat/completions --api-key sk-xxx
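Under the hood, a --query run reduces to a single OpenAI-style chat completion request against the configured endpoint. A minimal sketch of the request body, assuming a hypothetical build_chat_request helper and a made-up muf_forecast tool schema (neither is part of qsp-mcp):

```python
import json

def build_chat_request(model, messages, tools, temperature=0.3):
    """Assemble an OpenAI-style /v1/chat/completions request body."""
    return {
        "model": model,
        "messages": messages,
        "tools": [{"type": "function", "function": t} for t in tools],
        "temperature": temperature,
    }

body = build_chat_request(
    "AstroSage-70B",
    [{"role": "user",
      "content": "What bands are open from DN13 to JN48 right now?"}],
    # Made-up tool schema standing in for one exported by an MCP server.
    [{"name": "muf_forecast",
      "parameters": {"type": "object", "properties": {}}}],
)
print(json.dumps(body, indent=2))
```

The same body works unchanged against llama.cpp, Ollama, vLLM, or any other endpoint that speaks the OpenAI chat completions protocol.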
Configuration
The config format is Claude Desktop compatible — copy your existing mcpServers block directly:
{
  "mcpServers": {
    "ionis": {
      "command": "ionis-mcp",
      "env": { "IONIS_DATA_DIR": "/path/to/datasets/v1.0" }
    },
    "solar": {
      "command": "solar-mcp"
    },
    "wspr": {
      "command": "wspr-mcp"
    }
  },
  "bridge": {
    "endpoint": "http://localhost:8000/v1/chat/completions",
    "model": "AstroSage-70B",
    "temperature": 0.3,
    "system_prompt": "You are an expert ham radio operator and RF engineer.",
    "max_tool_calls_per_turn": 5,
    "profiles": {
      "contest": {
        "servers": ["n1mm", "ionis", "solar", "wspr"],
        "temperature": 0.2,
        "system_prompt": "You are a contest advisor. Be concise."
      },
      "propagation": {
        "servers": ["ionis", "solar", "wspr"],
        "temperature": 0.3
      },
      "full": {
        "servers": "*",
        "temperature": 0.3
      }
    },
    "server_timeouts": {
      "ionis": 1,
      "solar": 8,
      "qrz": 5
    }
  }
}
The mcpServers block uses the exact same format as Claude Desktop. The bridge section is qsp-mcp specific (ignored by Claude Desktop).
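Profiles narrow which servers' tools are exposed to the model, and the "*" wildcard in the full profile selects everything under mcpServers. A hedged sketch of how that resolution might work (resolve_servers is a hypothetical helper, not qsp-mcp's actual code):

```python
def resolve_servers(config, profile_name):
    """Return the server names a profile exposes (illustrative logic)."""
    all_servers = list(config["mcpServers"])
    profile = config.get("bridge", {}).get("profiles", {}).get(profile_name)
    if profile is None:
        return all_servers              # no profile: expose every server
    servers = profile.get("servers", "*")
    if servers == "*":
        return all_servers              # wildcard, as in the "full" profile
    return [s for s in servers if s in all_servers]

cfg = {
    "mcpServers": {"ionis": {}, "solar": {}, "wspr": {}, "n1mm": {}},
    "bridge": {"profiles": {
        "propagation": {"servers": ["ionis", "solar", "wspr"]},
        "full": {"servers": "*"},
    }},
}
print(resolve_servers(cfg, "propagation"))  # → ['ionis', 'solar', 'wspr']
print(resolve_servers(cfg, "full"))         # → ['ionis', 'solar', 'wspr', 'n1mm']
```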
CLI Options
qsp-mcp [OPTIONS]
Options:
-c, --config PATH Config file path (default: ~/.config/qsp-mcp/config.json)
-e, --endpoint URL LLM endpoint URL (overrides config)
-k, --api-key KEY API key for the LLM endpoint
-m, --model NAME Model name (overrides config)
-p, --profile NAME Tool profile (contest, dx, propagation, full)
-q, --query TEXT Single query mode — ask one question and exit
--enable-writes Enable write-capable tools (disabled by default)
--list-tools List available tools and exit
--version Show version
Interactive Commands
| Command | Action |
|---|---|
| /tools | List available tools |
| /help | Show help |
| quit | Exit (also: exit, q, 73) |
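The quit aliases and slash commands can be handled by a simple input dispatcher. This is an illustrative sketch, not the actual implementation ("73" is ham-radio shorthand for "best regards"):

```python
QUIT_ALIASES = {"quit", "exit", "q", "73"}

def dispatch(line):
    """Classify one line of interactive input (hypothetical helper)."""
    cmd = line.strip()
    if cmd in QUIT_ALIASES:
        return "quit"
    if cmd == "/tools":
        return "list_tools"
    if cmd == "/help":
        return "help"
    return "query"                      # anything else goes to the LLM

print(dispatch("73"), dispatch("/tools"), dispatch("Is 10m open?"))
# → quit list_tools query
```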
Design
qsp-mcp is a strict, stateless pipe between an LLM and MCP tools:
- No caching, no shared state, no health polling
- All state lives in MCP servers
- All inference optimization lives in the inference server (prefix caching, KV-cache)
- qsp-mcp just connects the two sides
Works with any OpenAI-compatible endpoint: llama.cpp, Ollama, vLLM, SGLang.
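The relay itself can be pictured as a loop: send the conversation plus tool schemas to the LLM, execute any tool calls against the owning MCP server, append the results, and repeat until the model answers in plain text or the per-turn budget runs out. A minimal sketch under those assumptions (relay_turn, call_llm, and call_mcp_tool are hypothetical names, not qsp-mcp's API):

```python
def relay_turn(call_llm, call_mcp_tool, messages, tools, max_tool_calls=5):
    """One user turn: relay tool calls until the model answers in text."""
    for _ in range(max_tool_calls):
        reply = call_llm(messages, tools)
        messages.append(reply)
        if not reply.get("tool_calls"):
            return reply                # plain-text answer, turn is done
        for tc in reply["tool_calls"]:
            result = call_mcp_tool(tc["name"], tc["arguments"])
            messages.append({"role": "tool",
                             "tool_call_id": tc["id"],
                             "content": result})
    return messages[-1]                 # per-turn tool budget exhausted

# Stub LLM: one tool call, then a final answer.
calls = []
def fake_llm(messages, tools):
    if calls:
        return {"role": "assistant", "content": "20m looks open."}
    calls.append(1)
    return {"role": "assistant", "tool_calls": [
        {"id": "1", "name": "muf_forecast",
         "arguments": {"path": "DN13-JN48"}}]}

def fake_tool(name, args):
    return '{"muf_mhz": 21.4}'          # stubbed MCP tool result

answer = relay_turn(fake_llm, fake_tool, [], [])
print(answer["content"])                # → 20m looks open.
```

Note that relay_turn holds no state of its own between turns, matching the stateless-pipe design above.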
Security
- Write-capable tools disabled by default (--enable-writes opt-in)
- Credentials stay inside MCP servers (OS keyring) — never exposed to qsp-mcp or the LLM
- No subprocess, no shell execution, no eval
- All external connections HTTPS only (LAN endpoints exempted)
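One plausible way the --enable-writes gate could be enforced is by filtering on the readOnlyHint tool annotation from the MCP spec, treating a missing hint as write-capable. This is an assumed sketch, not the documented mechanism:

```python
def filter_tools(tools, enable_writes=False):
    """Drop tools not marked read-only unless writes are enabled (assumed)."""
    if enable_writes:
        return list(tools)
    # Missing or false readOnlyHint is treated as write-capable here.
    return [t for t in tools
            if t.get("annotations", {}).get("readOnlyHint") is True]

tools = [
    {"name": "muf_forecast", "annotations": {"readOnlyHint": True}},
    {"name": "log_qso", "annotations": {"readOnlyHint": False}},  # hypothetical write tool
]
print([t["name"] for t in filter_tools(tools)])
# → ['muf_forecast']
```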
License
MIT — see LICENSE.
Part of the qso-graph ecosystem
qso-graph.io — MCP servers for amateur radio.