io.github.zero-abd/llmmcp
by zero-abd
An MCP server that provides real-time LLM API documentation to help reduce model hallucinations.
README
llmmcp
🌐 Website: https://llmmcp.vercel.app
🎥 Demo: https://github.com/user-attachments/assets/eaad8d05-b7a8-4bf0-86c6-4fe2726da628
Stop LLM hallucinations and outdated code patterns.
llmmcp is a Model Context Protocol (MCP) server that provides real-time, up-to-date documentation for major LLM providers (OpenAI, Anthropic, and Google Gemini). It ensures your AI agents—like Cursor, Claude Desktop, or Windsurf—base their work on current official documentation instead of stale training data or deprecated library patterns.
Why use llmmcp?
LLMs frequently hallucinate about their own latest versions, feature availability (e.g., tool use in certain models), and pricing. llmmcp fixes this by providing:
- ✅ Up-to-Date Model Info: Always know the latest available models (e.g., Gemini 2.0 Flash, Claude 3.5 Sonnet).
- ✅ Detailed API Params: Verified tool use syntax, context window sizes, and rate limits.
- ✅ Latest Implementation Patterns: Force your AI agent to follow current best practices instead of using legacy or deprecated library versions.
- ✅ Real-Time Search: Queries an indexed vector database of official provider documentation.
- ✅ Dynamic Listings: Get the current state of providers without hardcoded lists.
🚀 Quick Start
You can use llmmcp immediately in your favorite AI tools without local installation.
Cursor
Add a new MCP server in Settings > Models > MCP Servers:
- Name: `llmmcp`
- Type: `command`
- Command: `npx -y llmmcp@latest`
Claude Desktop
Add the following to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "llmmcp": {
      "command": "npx",
      "args": ["-y", "llmmcp@latest"]
    }
  }
}
```
🛠 Features
search_docs
Search the latest official documentation for specific technical details. Example: "What are the tool use parameters for Gemini 1.5 Pro?"
list_providers
Get a dynamically updated list of available providers (OpenAI, Anthropic, Google) and their currently promoted models.
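Because llmmcp is a standard MCP server over stdio, you can also exercise these tools programmatically. Below is a minimal sketch using the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`); the `query` argument name for `search_docs` is an assumption for illustration, since the README does not document the exact tool schema.

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the server the same way the Cursor/Claude Desktop configs do.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "llmmcp@latest"],
});

const client = new Client({ name: "llmmcp-demo", version: "0.0.1" });
await client.connect(transport);

// List the tools the server exposes (should include search_docs and list_providers).
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Call search_docs; the `query` argument name is assumed, not confirmed by the README.
const result = await client.callTool({
  name: "search_docs",
  arguments: { query: "What are the tool use parameters for Gemini 1.5 Pro?" },
});
console.log(result.content);

await client.close();
```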
🏗 How it Works
llmmcp is designed for speed and reliability:
- Indexer: A weekly scraper fetches raw markdown/text from official documentation.
- Vector DB: Chunks are embedded and stored in Pinecone with integrated embedding support.
- Backend: A Cloudflare Worker handles query embedding and retrieval, caching frequent results in Workers KV (see the sketch after this list).
- MCP Client: A thin CLI translates MCP requests into API calls for the Worker.
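For illustration, here is a minimal sketch of what that KV caching layer could look like inside a Cloudflare Worker. The binding name (`DOCS_CACHE`), the `q` query parameter, and the `queryVectorDb` helper are all assumptions made for the example, not the project's actual backend code.

```ts
// Hypothetical caching layer: check Workers KV before hitting the vector DB.
export interface Env {
  DOCS_CACHE: KVNamespace; // assumed KV binding name
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const { searchParams } = new URL(request.url);
    const query = searchParams.get("q") ?? "";
    const cacheKey = `search:${query}`;

    // 1. Serve frequent queries straight from Workers KV.
    const cached = await env.DOCS_CACHE.get(cacheKey);
    if (cached) {
      return new Response(cached, { headers: { "content-type": "application/json" } });
    }

    // 2. Otherwise embed the query and retrieve matching chunks (stubbed out here).
    const results = await queryVectorDb(query);

    // 3. Cache the serialized result for an hour before responding.
    const body = JSON.stringify(results);
    await env.DOCS_CACHE.put(cacheKey, body, { expirationTtl: 3600 });
    return new Response(body, { headers: { "content-type": "application/json" } });
  },
};

// Stub standing in for the embed-and-retrieve step against the vector DB.
async function queryVectorDb(query: string): Promise<unknown[]> {
  return [{ source: "docs", text: `results for: ${query}` }];
}
```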
🤝 Contributing & Self-Hosting
This project is open-source. If you'd like to run your own instance of the backend:
- See Architecture & Deployment (coming soon, see current setup in logs).
- Fork the repo and submit a PR for new documentation sources.
Developed by Abdullah Al Mahmud
License
MIT
Related Skills
MCP Builder
by anthropics
Focused on high-quality MCP server development, covering protocol research, tool design, error handling, and transport selection; suited to wrapping external APIs and service capabilities with FastMCP or the MCP SDK.
✎ If you want LLMs to call external APIs reliably, use MCP Builder: mature guidance from Python to Node helps you ship a high-quality MCP server faster.
Slack GIFs
by anthropics
A GIF-creation skill for Slack, with built-in size, frame-rate, and color constraints for emoji and message GIFs, plus validation and optimization workflows; good for turning ideas or uploaded images into Slack-ready animations.
✎ Helps you quickly produce GIFs that fit Slack, with built-in constraint rules and validation tools that spare you upload and playback pitfalls; handy for emoji and demos alike.
MCP Server Builder
by alirezarezvani
Generates Python/TypeScript MCP server scaffolding from an OpenAPI spec in one step, validating tool schemas, naming conventions, and version compatibility; good for turning existing REST APIs into production-ready MCP services.
✎ Helps you stand up MCP services and backend APIs quickly, with solid scaffolding that is easy to extend; especially suited to developers who want to validate service capabilities efficiently.
Related MCP Servers
Slack Messaging
Editor's pick · by Anthropic
Slack is an MCP server that lets AI assistants read and write your Slack channels and messages directly.
✎ This server addresses teams' need for AI to pull Slack information in real time, and it is especially useful for development teams that want Claude to summarize channel discussions or send notifications. That said, it is currently only a reference implementation with limited documentation, so it is not recommended for direct production use; it is better suited to developers studying how MCP integrates with third-party services.
by netdata
io.github.netdata/mcp-server is an MCP server that lets AI assistants monitor server metrics and logs in real time.
✎ This tool removes the need for ops staff to check system status by hand and is best suited to DevOps teams that want Claude to analyze performance data automatically. Note that it depends on an existing NetData deployment, so if you have never used that monitoring platform, expect some setup time first.
by d4vinci
Scrapling MCP Server is an intelligent scraping tool built for modern websites, with support for bypassing anti-bot protections such as Cloudflare.
✎ This tool eases the headaches of scraping dynamic pages and bot-protected sites, and is especially useful for developers who need to bulk-collect e-commerce prices or news data. However, it relies on an external browser engine and is resource-heavy, so it is not suited to lightweight tasks.