io.github.connerlambden/bgpt-mcp

by connerlambden

Search scientific papers and full-text experimental data, covering methods, results, quality scores, and more.

README

BGPT MCP API

Search scientific papers from Claude, Cursor, or any MCP-compatible AI tool.

BGPT is a remote Model Context Protocol (MCP) server that gives AI assistants access to a database of scientific papers built from full-text studies. Unlike typical search tools that return titles and abstracts, BGPT extracts raw experimental data — methods, results, conclusions, quality scores, sample sizes, limitations, and 25+ metadata fields per paper.


Quick Start

Add BGPT to your MCP client — no API key required for the free tier (50 free results).

Option A: Remote Connection (Recommended)

Most modern MCP clients support direct remote connections. BGPT offers two transports:

Transport         Endpoint
SSE               https://bgpt.pro/mcp/sse
Streamable HTTP   https://bgpt.pro/mcp/stream

Claude Desktop (claude_desktop_config.json):

json
{
  "mcpServers": {
    "bgpt": {
      "url": "https://bgpt.pro/mcp/sse"
    }
  }
}

Cursor (.cursor/mcp.json):

json
{
  "mcpServers": {
    "bgpt": {
      "url": "https://bgpt.pro/mcp/sse"
    }
  }
}

Claude Code (CLI):

bash
claude mcp add bgpt --transport sse https://bgpt.pro/mcp/sse

Cline / Roo Code / Windsurf — same config:

json
{
  "mcpServers": {
    "bgpt": {
      "url": "https://bgpt.pro/mcp/sse"
    }
  }
}

Tip: If your client supports Streamable HTTP, you can use https://bgpt.pro/mcp/stream instead.

Option B: Via npx (for clients that need a local command)

json
{
  "mcpServers": {
    "bgpt": {
      "command": "npx",
      "args": ["-y", "bgpt-mcp"]
    }
  }
}

Option C: Install globally

bash
npm install -g bgpt-mcp

Then add to your MCP config:

json
{
  "mcpServers": {
    "bgpt": {
      "command": "bgpt-mcp"
    }
  }
}

Any MCP Client

Connect to either endpoint:

code
SSE:              https://bgpt.pro/mcp/sse
Streamable HTTP:  https://bgpt.pro/mcp/stream

That's it. No Docker, no build step.


What You Get

BGPT provides one tool: search_papers

Parameter    Type     Required  Description
query        string   Yes       Search terms (e.g. "CRISPR gene editing efficiency")
num_results  integer  No        Number of results to return (1–100, default 10)
days_back    integer  No        Only return papers published within the last N days
api_key      string   No        Your Stripe subscription ID for paid access
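Under the hood, an MCP client wraps a `search_papers` invocation in a JSON-RPC 2.0 `tools/call` message built from the parameters above. A minimal sketch (the range check mirrors the documented 1–100 limit; this is illustrative client-side validation, not the server's actual behavior):

```python
import json
from typing import Optional

def build_search_request(query: str,
                         num_results: int = 10,
                         days_back: Optional[int] = None,
                         api_key: Optional[str] = None,
                         request_id: int = 1) -> str:
    """Serialize a search_papers call as a JSON-RPC 2.0 tools/call message."""
    if not 1 <= num_results <= 100:
        raise ValueError("num_results must be between 1 and 100")
    # Only include the optional parameters that were actually set.
    arguments = {"query": query, "num_results": num_results}
    if days_back is not None:
        arguments["days_back"] = days_back
    if api_key is not None:
        arguments["api_key"] = api_key
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": "search_papers", "arguments": arguments},
    })
```

In practice your MCP client or SDK builds this message for you; the sketch just shows what travels over the SSE or Streamable HTTP transport.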

What comes back

Each paper result includes 25+ fields, extracted from the full text:

  • Title & DOI — standard identifiers
  • Methods — experimental design, techniques used
  • Results — raw findings, measurements, statistical outcomes
  • Conclusions — what the authors determined
  • Quality scores — methodological rigor assessment
  • Sample sizes — participant/specimen counts
  • Limitations — acknowledged weaknesses
  • And more — funding, conflicts of interest, study type, etc.
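If you post-process results yourself, it can help to model the fields you care about. The field names below are illustrative guesses based on the list above, not the server's actual schema; `parse_result` deliberately ignores fields it doesn't model, so extra metadata doesn't break parsing:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PaperResult:
    """Hypothetical subset of a search_papers result (field names assumed)."""
    title: str
    doi: str
    methods: str
    results: str
    conclusions: str
    quality_score: Optional[float] = None
    sample_size: Optional[int] = None
    limitations: Optional[str] = None

def parse_result(raw: dict) -> PaperResult:
    # Keep only modeled fields; tolerate the 25+ extra metadata fields.
    known = set(PaperResult.__dataclass_fields__)
    return PaperResult(**{k: v for k, v in raw.items() if k in known})
```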

Example

Ask your AI assistant:

"Search for recent papers on CAR-T cell therapy response rates"

BGPT returns structured experimental data your AI can reason over — not just a list of titles.


Pricing

Tier           Cost          Details
Free           $0            50 free results, no API key needed
Pay-as-you-go  $0.02/result  Billed per result returned; get an API key at bgpt.pro/mcp
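A quick back-of-the-envelope helper for the pricing above. It assumes the 50-result free allowance is consumed before per-result billing starts, which is a simplification of however BGPT actually meters usage:

```python
FREE_RESULTS = 50          # free-tier allowance, per the pricing table
PRICE_PER_RESULT = 0.02    # USD per result returned (pay-as-you-go)

def estimated_cost(total_results: int) -> float:
    """Rough USD cost for a given number of returned results."""
    billable = max(0, total_results - FREE_RESULTS)
    return round(billable * PRICE_PER_RESULT, 2)
```

For example, 150 results would come to roughly $2.00 under this assumption.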

How It Works

code
Your AI Assistant (Claude, Cursor, etc.)
        │
        │  MCP Protocol (SSE or Streamable HTTP)
        ▼
   BGPT MCP Server
   https://bgpt.pro/mcp/sse
   https://bgpt.pro/mcp/stream
        │
        │  search_papers(query, ...)
        ▼
   BGPT Paper Database
   (full-text extracted data)
        │
        ▼
   Structured Results
   (methods, results, quality scores, 25+ fields)

BGPT is a hosted remote server — your MCP client connects via SSE or Streamable HTTP. No local installation needed.
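The flow above maps onto a short sequence of MCP messages from the client: an `initialize` request, an `initialized` notification, then the `tools/call`. A sketch of those payloads as plain dicts (the protocol version string and client info are illustrative; your SDK negotiates the real values):

```python
def handshake_and_search(query: str) -> list:
    """The message sequence an MCP client sends over SSE or Streamable HTTP."""
    return [
        # 1. Handshake: client announces its protocol version and capabilities.
        {"jsonrpc": "2.0", "id": 1, "method": "initialize",
         "params": {"protocolVersion": "2025-03-26",
                    "capabilities": {},
                    "clientInfo": {"name": "demo-client", "version": "0.1"}}},
        # 2. Notification (no id) confirming initialization is complete.
        {"jsonrpc": "2.0", "method": "notifications/initialized"},
        # 3. The actual tool invocation.
        {"jsonrpc": "2.0", "id": 2, "method": "tools/call",
         "params": {"name": "search_papers", "arguments": {"query": query}}},
    ]
```

Everything after that is the server's side of the diagram: querying the paper database and streaming back structured results.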


Use Cases

  • Literature reviews — Ask your AI to survey a topic with real experimental data
  • Evidence synthesis — Ground AI responses in actual study findings
  • Research assistance — Find papers by methodology, outcome, or recency
  • Fact-checking — Verify claims against published experimental results
  • Grant writing — Quickly gather supporting evidence for proposals

Configuration Reference

Server Details

Field                     Value
Protocol                  MCP (Model Context Protocol)
Transport                 SSE (Server-Sent Events) or Streamable HTTP
SSE Endpoint              https://bgpt.pro/mcp/sse
Streamable HTTP Endpoint  https://bgpt.pro/mcp/stream
Authentication            None required (free tier) / Stripe API key (paid)

Full MCP Client Config

json
{
  "mcpServers": {
    "bgpt": {
      "url": "https://bgpt.pro/mcp/sse"
    }
  }
}

Documentation

Full documentation, FAQ, and setup guides: bgpt.pro/mcp


Contributing

See CONTRIBUTING.md for guidelines on reporting bugs, requesting features, and contributing.


License

This repository (documentation, examples, and configuration files) is licensed under the MIT License.

The BGPT MCP API service itself is operated by BGPT and subject to its own terms of service.
