Ask MCP Server

by SerPepe

Provide seamless access to multiple premium AI models through OpenRouter with secure OAuth authentication and easy setup. Integrate effortlessly with MCP-compatible clients like Cursor and Claude Desktop to leverage advanced AI capabilities for reasoning, coding, translation, and more. Benefit from automatic fallback to free model variants for cost-effective usage and robust error handling.

README

Ask MCP 🤖

A powerful Model Context Protocol (MCP) server that provides seamless access to multiple state-of-the-art AI models through OpenRouter. Perfect for integration with Cursor, Claude Desktop, and other MCP-compatible clients.

✨ Features

  • 🎯 5 Premium AI Models: Grok, Gemini 2.5 Pro, Kimi, Qwen3 Coder, GLM-4.5
  • 💰 Free Tier Support: Access free variants when available
  • 🔐 OAuth Authentication: Secure OpenRouter integration
  • 🚀 Easy Setup: One-command OAuth configuration
  • 🔌 MCP Compatible: Works with Cursor, Claude Desktop, and more
  • 📦 TypeScript: Full type safety and modern development

Supported Models

  • Grok: Advanced reasoning and analysis (x-ai/grok-4)
  • Gemini 2.5 Pro: Google's latest multimodal AI (google/gemini-2.5-pro)
  • Kimi: Efficient and fast responses (moonshotai/kimi-k2)
  • Qwen 3 Coder: Specialized for coding tasks (qwen/qwen3-coder)
  • GLM: General language model (z-ai/glm-4.5)

Free Model Variants

The tool supports free model variants for cost-effective usage:

  • Kimi Free: moonshotai/kimi-k2:free
  • Qwen Free: qwen/qwen3-coder:free
  • GLM Free: z-ai/glm-4.5-air:free

Free models are automatically used as fallback when payment errors occur, or can be explicitly requested using the free parameter.
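The free-variant mapping above can be sketched as a small lookup table. This is illustrative only; the function name is hypothetical, but the model IDs match the list above (note that GLM's free variant uses a different base model, "glm-4.5-air"):

```typescript
// Hypothetical helper: maps a paid model ID to its free variant, if one exists.
// Model IDs are taken from the README; the function shape is illustrative.
const FREE_VARIANTS: Record<string, string> = {
  "moonshotai/kimi-k2": "moonshotai/kimi-k2:free",
  "qwen/qwen3-coder": "qwen/qwen3-coder:free",
  "z-ai/glm-4.5": "z-ai/glm-4.5-air:free", // free GLM uses the "air" model
};

function freeVariantFor(model: string): string | null {
  return FREE_VARIANTS[model] ?? null;
}
```

Models without a free variant (such as Grok and Gemini) would simply return `null`, so no fallback is attempted for them.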

Installation

Option 1: NPM Package (Recommended)

```bash
npm install -g ask-mcp
```

First-time setup: After installation, you'll need to authenticate with OpenRouter. You can either:

  1. Set your API key: export OPENROUTER_API_KEY="your-key-here"
  2. Or use OAuth setup (from source): npm run setup and press ENTER when prompted

Option 2: From Source Installation

  1. Clone and install:

    ```bash
    git clone https://github.com/SerPepe/Ask-MCP
    cd Ask-MCP
    npm install
    npm run build
    ```
  2. Configure in your MCP client (see Configuration section below)

  3. First-time setup: When you first use the MCP server, it will automatically prompt for OAuth authentication. You can also run the setup manually:

    ```bash
    npm run setup
    ```

Quick Start

Option 1: NPM Installation (Recommended)

  1. Install globally: npm install -g ask-mcp
  2. Configure in your MCP client (Cursor/Claude Desktop)
  3. First-time setup: on first use, the server detects the missing API key and prompts you to authenticate with OpenRouter via OAuth. Press ENTER when prompted, and your browser opens automatically for secure authentication.
  4. Start using the AI models!

Option 2: From Source with OAuth Setup (Auto-Authentication)

  1. Clone the repository
  2. Install dependencies: npm install
  3. Run OAuth setup: npm run setup
  4. Press ENTER when prompted (your browser opens for OpenRouter OAuth)
  5. Complete authentication in the browser
  6. Test the server: npm test

Option 3: Manual API Key Setup

  1. Clone the repository
  2. Install dependencies: npm install
  3. Build the project: npm run build
  4. Set your OpenRouter API key: export OPENROUTER_API_KEY="your-key-here"
  5. Test the server: npm test

Configuration

Authentication Options

The Ask MCP tool supports two authentication methods:

Option 1: API Key Authentication (Recommended)

  1. Sign up at OpenRouter
  2. Get your API key from the dashboard
  3. Add credits to your account
  4. Set the OPENROUTER_API_KEY environment variable

Option 2: OAuth Authentication

  1. Use the built-in OAuth flow for secure authentication
  2. Generate authorization URLs and exchange codes for API keys
  3. Perfect for applications requiring user consent
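OpenRouter's OAuth flow uses PKCE. A minimal sketch of generating the code verifier/challenge pair and building the authorization URL follows; the parameter names are based on OpenRouter's documented flow, but verify them against OpenRouter's current docs before relying on this:

```typescript
import { createHash, randomBytes } from "crypto";

// PKCE pair: a random verifier and its SHA-256 challenge (base64url, no padding).
function makePkcePair() {
  const verifier = randomBytes(32).toString("base64url");
  const challenge = createHash("sha256").update(verifier).digest("base64url");
  return { verifier, challenge };
}

// Builds the OpenRouter authorization URL the user is sent to.
// Parameter names assumed from OpenRouter's PKCE flow; check their docs.
function buildAuthUrl(callbackUrl: string, challenge: string): string {
  const u = new URL("https://openrouter.ai/auth");
  u.searchParams.set("callback_url", callbackUrl);
  u.searchParams.set("code_challenge", challenge);
  u.searchParams.set("code_challenge_method", "S256");
  return u.toString();
}
```

After the user approves access, the callback receives a code that is exchanged (together with the verifier) for an API key.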

Configure MCP Client

For Cursor IDE

Option A: Using NPM Package (Recommended)

```json
{
  "mcpServers": {
    "ask-mcp": {
      "command": "ask-mcp"
    }
  }
}
```

Option B: Using Local Build

```json
{
  "mcpServers": {
    "ask-mcp": {
      "command": "node",
      "args": ["/path/to/ask-mcp/dist/index.js"]
    }
  }
}
```

Option C: Manual API Key (Optional)

If you prefer to set the API key manually instead of using OAuth:

```json
{
  "mcpServers": {
    "ask-mcp": {
      "command": "ask-mcp",
      "env": {
        "OPENROUTER_API_KEY": "your-openrouter-api-key-here"
      }
    }
  }
}
```

For Claude Desktop

Same configuration options as above work for Claude Desktop.

Usage

Once configured, you can use the following commands in your MCP client:

Ask Grok

```
ask-grok "What is the latest in AI development?"
```

Ask Gemini

```
ask-gemini "Explain quantum computing in simple terms"
```

Ask Kimi

```
ask-kimi "Write a Python function to sort a list"
```

Ask Qwen

```
ask-qwen "Debug this JavaScript code: console.log('hello world')"
```

Ask GLM

```
ask-glm "Translate this to Spanish: Hello, how are you?"
```

Each model has specific strengths:

  • Use Grok for complex reasoning and analysis
  • Use Gemini for multimodal tasks (text + images)
  • Use Kimi for quick, efficient responses
  • Use Qwen for coding and technical tasks
  • Use GLM for general language tasks

Smart Fallback System

The tool includes intelligent error handling:

  1. Automatic Fallback: When payment/quota errors occur, the tool automatically retries with free model variants

  2. Manual Free Mode: Use the free: true parameter to directly use free models

  3. Error Recovery: Comprehensive error messages help diagnose issues
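The automatic fallback in point 1 can be sketched as a retry wrapper. This is illustrative: the helper name and error shape are assumptions, with payment/quota errors assumed to surface as HTTP 402:

```typescript
// Illustrative retry wrapper: if the first call fails with a payment/quota
// error (assumed HTTP 402), retry once on the free variant when one exists.
async function askWithFallback(
  ask: (model: string) => Promise<string>,
  model: string,
  freeModel: string | null,
): Promise<string> {
  try {
    return await ask(model);
  } catch (err: any) {
    const isPaymentError = err?.status === 402;
    if (isPaymentError && freeModel) {
      return await ask(freeModel); // automatic fallback to the free variant
    }
    throw err; // other errors propagate with their original message
  }
}
```

Non-payment errors are rethrown unchanged so the client can surface a diagnostic message (point 3).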

With System Prompt

You can also provide a custom system prompt:

```
ask-grok {
  "question": "What's the weather like?",
  "system_prompt": "You are a helpful weather assistant. Always ask for location if not provided."
}
```

Using Free Models

You can explicitly use free model variants:

```
ask-kimi {
  "question": "What is machine learning?",
  "free": true
}
```

Usage Examples

Basic Usage

```json
{
  "name": "ask-grok",
  "arguments": {
    "question": "Explain quantum computing"
  }
}
```

With Custom System Prompt

```json
{
  "name": "ask-gemini",
  "arguments": {
    "question": "Write a Python function to sort a list",
    "system_prompt": "You are a senior Python developer. Provide clean, well-documented code."
  }
}
```

Using Free Models

```json
{
  "name": "ask-kimi",
  "arguments": {
    "question": "What is machine learning?",
    "free": true
  }
}
```

GLM Model Usage

```json
{
  "name": "ask-glm",
  "arguments": {
    "question": "Translate this to Spanish: Hello, how are you?",
    "system_prompt": "You are a professional translator."
  }
}
```

Troubleshooting

Common Issues

  1. "OpenRouter API key not configured"

    • Make sure you've set the OPENROUTER_API_KEY environment variable
    • Check that your API key is valid and has credits
  2. "No response received from the model"

    • Check your OpenRouter account has sufficient credits
    • Verify the model is available on OpenRouter
  3. Connection issues

    • Ensure you have internet connectivity
    • Check if OpenRouter API is accessible from your network
  4. NPM package issues

    • Try reinstalling: npm uninstall -g ask-mcp && npm install -g ask-mcp
    • Check Node.js version (requires Node.js 16+)

License

MIT License - see LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
