io.github.baixianger/langchain-mcp
AI & Agents · by baixianger
Semantic search over the documentation and source code of LangChain, LangGraph, LangSmith, and DeepAgents.
README
LangChain MCP
Give your AI assistant complete knowledge of LangChain, LangGraph & LangSmith
Website • Installation • Features • Documentation
Overview
LangChain MCP is a Model Context Protocol (MCP) server that provides semantic search across the entire LangChain ecosystem. Build AI applications faster with instant access to documentation and source code for LangChain, LangGraph, LangSmith, and DeepAgents.
<img src="img/homepage.png" alt="LangChain MCP Homepage" width="800">

Features
- Semantic Search - Natural language queries across all LangChain ecosystem docs
- Source Code Search - Find code examples in Python and JavaScript repositories
- MCP Protocol - Works seamlessly with Claude Code, Claude Desktop, Cursor, and any MCP-compatible client
- Production Ready - Scalable API with authentication and usage tracking
- Fast & Accurate - Powered by ChromaDB and OpenRouter embeddings
Installation
Quick Start (Recommended)
```bash
# Install globally
npm install -g langchain-mcp

# Log in with Google
langchain-mcp login

# Add to Claude Code
claude mcp add langchain-mcp -- npx langchain-mcp
```
Manual Configuration
Add the following configuration to your client's config file:
Claude Desktop
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`

Cursor
- macOS/Linux: `~/.cursor/mcp.json`
- Windows: `%USERPROFILE%\.cursor\mcp.json`
```json
{
  "mcpServers": {
    "langchain-mcp": {
      "command": "npx",
      "args": ["langchain-mcp"]
    }
  }
}
```
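If you need to pass environment variables to the server process (for example when pointing at a self-deployed instance), MCP client configs also accept an optional `env` map in the server entry. The variable name below is purely illustrative, not one the package documents; the hosted setup normally needs only `langchain-mcp login`:

```json
{
  "mcpServers": {
    "langchain-mcp": {
      "command": "npx",
      "args": ["langchain-mcp"],
      "env": {
        "LANGCHAIN_MCP_API_URL": "https://example.com"
      }
    }
  }
}
```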
Usage
CLI Commands
```bash
langchain-mcp login    # Log in via Google OAuth
langchain-mcp status   # Check usage and remaining credits
langchain-mcp logout   # Log out and clear credentials
```
Available MCP Tools
| Tool | Description | Parameters |
|---|---|---|
| `search_docs` | Search documentation, references, and tutorials | `query`, `limit` (default: 5) |
| `search_langchain_code` | Search LangChain source code | `query`, `language` (`py`/`js`), `limit` |
| `search_langgraph_code` | Search LangGraph source code | `query`, `language` (`py`/`js`), `limit` |
| `search_deepagents_code` | Search DeepAgents source code | `query`, `language` (`py`/`js`), `limit` |
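Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 `tools/call` requests. A sketch of the message a client would send for `search_docs` (the shape follows the MCP specification; exact transport framing depends on the client):

```python
import json

def build_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request as used by MCP clients."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: semantic search over the docs, capped at 3 results
request = build_tool_call(1, "search_docs",
                          {"query": "how do I stream LangGraph events", "limit": 3})
print(request)
```

The server answers with a matching `result` payload containing the retrieved chunks; your MCP client handles this exchange for you.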
Pricing
- Free tier available for new users
- Donation bonus for supporters
Documentation
Self-deploy
Project Structure
```
langchain-MCP/
├── packages/
│   ├── ingest/            # Python - Data ingestion (uv)
│   ├── api/               # TypeScript - API server (Express)
│   ├── mcp-server/        # TypeScript - MCP client (npm package)
│   └── mcp-server-local/  # TypeScript - Local MCP server (dev)
├── config/
│   └── settings.json      # Shared configuration
└── deploy.sh              # Deployment script
```
Architecture
<img src="img/Architecture.png" alt="Architecture" width="600">

Setup Development Environment
1. Ingest Documentation & Source Code
```bash
cd packages/ingest
uv sync
uv run ingest --list   # List available repositories
uv run ingest docs     # Ingest documentation only
uv run ingest          # Ingest all (docs + code)
```
2. Run API Server
```bash
cd packages/api
npm install
npm run dev   # Development server on port 3000
```
3. Test Local MCP Server
```bash
cd packages/mcp-server-local
npm install
npm run dev
```
Configuration
All settings live in `config/settings.json`:

```json
{
  "embedding": {
    "provider": "openrouter",
    "model": "qwen/qwen3-embedding-8b"
  },
  "chromadb": {
    "path": "./data/chroma"
  },
  "chunking": {
    "docs": { "chunk_size": 2000, "chunk_overlap": 200 },
    "code": { "chunk_size": 4000, "chunk_overlap": 200 }
  },
  "repos": [
    {
      "name": "langchain",
      "url": "https://github.com/langchain-ai/langchain",
      "type": "code",
      "languages": ["python", "javascript"]
    }
  ]
}
```
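The `chunking` settings control how documents are split before embedding. A minimal sliding-window splitter illustrating what `chunk_size` and `chunk_overlap` mean (a sketch only; the real ingest pipeline may split on semantic boundaries rather than raw character offsets):

```python
def chunk_text(text: str, chunk_size: int, chunk_overlap: int) -> list[str]:
    """Split text into windows of chunk_size characters, each overlapping
    the previous window by chunk_overlap characters."""
    if chunk_overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk size")
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]

# With the docs settings (2000/200), consecutive chunks share 200 characters
chunks = chunk_text("x" * 5000, chunk_size=2000, chunk_overlap=200)
```

The overlap keeps context that straddles a chunk boundary retrievable from both neighboring chunks.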
Supported Embedding Providers
- `sentence-transformer` (local)
- `openai`
- `cohere`
- `google`
- `ollama`
- `openrouter` (default)
See ChromaDB Integrations for more options.
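Whichever provider produces the vectors, retrieval reduces to nearest-neighbor search over embeddings. A toy cosine-similarity ranking in plain Python (illustrative only; ChromaDB handles indexing and distance computation at scale):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank(query_vec: list[float], docs: dict[str, list[float]]) -> list[str]:
    """Return document ids sorted by similarity to the query embedding."""
    return sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)

# Toy 2-D "embeddings": the document pointing closest to the query wins
docs = {"langgraph-streaming": [0.9, 0.1], "langsmith-tracing": [0.1, 0.9]}
best = rank([1.0, 0.0], docs)[0]  # → "langgraph-streaming"
```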
Deployment
The project includes automated deployment scripts for VPS hosting:
```bash
# Manual deployment
./deploy.sh

# GitHub Actions (production branch)
git push origin main:production
```
Deployment includes:
- Code synchronization via rsync
- Automatic npm installation and build
- PM2 process management
- Nginx static file serving
- Environment variable management
Roadmap
- Semantic search across docs and code
- Google OAuth authentication
- Usage tracking and credits system
- MCP registry registration
- Claude Code, Desktop, and Cursor support
- Rate limiting (per user / per IP)
- Additional embedding model options
- Local mode (no API key required)
- Browser extension for quick searches
- VSCode extension integration
Contributing
Forking and contributions are welcome!
Support
- Website: langchain-mcp.xyz
- Issues: GitHub Issues
- Donate: Support development at Ko-fi
License
MIT License - see the LICENSE file for details.
<div align="center">
Built with ❤️ by baixianger
</div>