io.github.Lykhoyda/ask-gemini

AI & Agents

by lykhoyda

Connect Claude with the Gemini CLI for AI collaboration, code review, and second opinions.

README

Ask LLM

<div align="center">


| Package | Type | Registry |
| --- | --- | --- |
| `ask-gemini-mcp` | MCP Server | npm |
| `ask-codex-mcp` | MCP Server | npm |
| `ask-ollama-mcp` | MCP Server | npm |
| `ask-llm-mcp` | MCP Server | npm |
| `@ask-llm/plugin` | Claude Code Plugin | GitHub (`/plugin install`) |

MCP servers + Claude Code plugin for AI-to-AI collaboration

</div>

MCP servers that bridge your AI client with multiple LLM providers for AI-to-AI collaboration. Works with Claude Code, Claude Desktop, Cursor, Warp, Copilot, and 40+ other MCP clients. Leverage Gemini's 1M+ token context, Codex's GPT-5.4, or local Ollama models — all via standard MCP.

Why?

  • Get a second opinion — Ask another AI to review your coding approach before committing
  • Debate plans — Send architecture proposals for critique and alternative suggestions
  • Review changes — Have multiple AIs analyze diffs to catch issues your primary AI might miss
  • Massive context — Gemini reads entire codebases (1M+ tokens) that would overflow other models
  • Local & private — Use Ollama for reviews where no data leaves your machine

Quick Start

Claude Code

```bash
# Individual providers
claude mcp add --scope user gemini -- npx -y ask-gemini-mcp
claude mcp add --scope user codex -- npx -y ask-codex-mcp
claude mcp add --scope user ollama -- npx -y ask-ollama-mcp

# Or all-in-one (auto-detects installed providers)
claude mcp add --scope user ask-llm -- npx -y ask-llm-mcp
```

Claude Desktop

Add to claude_desktop_config.json:

```json
{
  "mcpServers": {
    "gemini": {
      "command": "npx",
      "args": ["-y", "ask-gemini-mcp"]
    },
    "codex": {
      "command": "npx",
      "args": ["-y", "ask-codex-mcp"]
    },
    "ollama": {
      "command": "npx",
      "args": ["-y", "ask-ollama-mcp"]
    }
  }
}
```
<details> <summary>Cursor, Codex CLI, OpenCode, and other clients</summary>

Cursor (.cursor/mcp.json):

```json
{
  "mcpServers": {
    "gemini": { "command": "npx", "args": ["-y", "ask-gemini-mcp"] }
  }
}
```

Codex CLI (~/.codex/config.toml):

```toml
[mcp_servers.gemini]
command = "npx"
args = ["-y", "ask-gemini-mcp"]
```

Any MCP Client (STDIO transport):

```json
{ "command": "npx", "args": ["-y", "ask-gemini-mcp"] }
```

Replace ask-gemini-mcp with ask-codex-mcp, ask-ollama-mcp, or ask-llm-mcp as needed.

</details>

Claude Code Plugin

The Ask LLM plugin adds multi-provider code review, brainstorming, and automated hooks directly into Claude Code:

```
/plugin marketplace add Lykhoyda/ask-llm
/plugin install ask-llm@ask-llm-plugins
```

What You Get

| Feature | Description |
| --- | --- |
| `/multi-review` | Parallel Gemini + Codex review with a 4-phase validation pipeline and consensus highlighting |
| `/gemini-review` | Gemini-only review with confidence filtering |
| `/codex-review` | Codex-only review with confidence filtering |
| `/ollama-review` | Local review; no data leaves your machine |
| `/brainstorm` | Multi-LLM brainstorm: Claude Opus researches the topic against real files in parallel with external providers (Gemini/Codex/Ollama), then synthesizes all findings, weighting verified findings higher |
| Pre-commit hook | Reviews staged changes before `git commit` and warns about critical issues |

The review agents use a 4-phase pipeline inspired by Anthropic's code-review plugin: context gathering, prompt construction with explicit false-positive exclusions, synthesis, and source-level validation of each finding.

See the plugin docs for details.

Prerequisites

  • Node.js v20.0.0 or higher (LTS)
  • At least one provider:
    • Gemini CLI: `npm install -g @google/gemini-cli && gemini login`
    • Codex CLI: installed and authenticated
    • Ollama: running locally with a model pulled (`ollama pull qwen2.5-coder:7b`)

MCP Tools

| Tool | Package | Purpose |
| --- | --- | --- |
| `ask-gemini` | `ask-gemini-mcp` | Send prompts to the Gemini CLI with `@` file syntax; 1M+ token context |
| `ask-gemini-edit` | `ask-gemini-mcp` | Get structured OLD/NEW code edit blocks from Gemini |
| `fetch-chunk` | `ask-gemini-mcp` | Retrieve chunks from cached large responses |
| `ask-codex` | `ask-codex-mcp` | Send prompts to the Codex CLI; GPT-5.4 with mini fallback |
| `ask-ollama` | `ask-ollama-mcp` | Send prompts to local Ollama; fully private, zero cost |
| `ping` | all | Connection test to verify the MCP setup |
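Because these tools are exposed over standard MCP, any client can invoke them with plain JSON-RPC over the STDIO transport. As a sketch only, here is what a `tools/call` request for `ask-gemini` might look like after the usual MCP initialize handshake; the exact argument schema (the `prompt` key) is an assumption, so check the server's `tools/list` response for the real input schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "ask-gemini",
    "arguments": { "prompt": "Review @src/auth.ts for security issues" }
  }
}
```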

Usage Examples

```
ask gemini to review the changes in @src/auth.ts for security issues
ask codex to suggest a better algorithm for @src/sort.ts
ask ollama to explain @src/config.ts (runs locally, no data sent anywhere)
use gemini to summarize @. the current directory
```

Models

| Provider | Default | Fallback |
| --- | --- | --- |
| Gemini | `gemini-3.1-pro-preview` | `gemini-3-flash-preview` (on quota) |
| Codex | `gpt-5.4` | `gpt-5.4-mini` (on quota) |
| Ollama | `qwen2.5-coder:7b` | `qwen2.5-coder:1.5b` (if not found) |

All providers automatically fall back to a lighter model on errors.
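The fallback behavior can be pictured as a thin retry wrapper. This is an illustrative sketch under stated assumptions, not the packages' actual implementation; `ModelCall` and `askWithFallback` are hypothetical names introduced here:

```typescript
// Stand-in for a call to the underlying provider CLI with a specific model.
type ModelCall = (model: string, prompt: string) => Promise<string>;

// Try the primary model; on any provider error (quota exhaustion,
// model not found, ...), retry once on the lighter fallback model.
async function askWithFallback(
  call: ModelCall,
  primary: string,
  fallback: string,
  prompt: string
): Promise<string> {
  try {
    return await call(primary, prompt);
  } catch {
    return await call(fallback, prompt);
  }
}
```

For example, `askWithFallback(callGemini, "gemini-3.1-pro-preview", "gemini-3-flash-preview", prompt)` would mirror the Gemini row in the table above.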

Documentation

Contributing

Contributions are welcome! See open issues for things to work on.

License

MIT License. See LICENSE for details.

Disclaimer: This is an unofficial, third-party tool and is not affiliated with, endorsed, or sponsored by Google or OpenAI.
