
VictoriaTraces MCP Server


The implementation of Model Context Protocol (MCP) server for VictoriaTraces.

It provides access to your VictoriaTraces instance and seamless integration with the VictoriaTraces APIs and documentation. This gives you a comprehensive interface for tracing, observability, and debugging tasks related to your VictoriaTraces instances, and enables advanced automation and interaction capabilities for engineers and tools.

Features

This MCP server allows you to use almost all read-only APIs of VictoriaTraces:

  • Get services and operations (span names)
  • Query traces, explore and analyze traces data

In addition, the MCP server contains embedded up-to-date documentation and is able to search it without online access.

More details about the exact available tools and prompts can be found in the Usage section.

You can combine the tools and the docs search in your prompts to build powerful usage scenarios for your VictoriaTraces instance. Please note that the quality of the MCP server and its responses depends heavily on the capabilities of your client and the quality of the model you are using.

You can also combine the MCP server with other observability or doc search MCP Servers and get even more powerful results.

Installation

Go

bash
go install github.com/VictoriaMetrics-Community/mcp-victoriatraces/cmd/mcp-victoriatraces@latest

Binaries

Just download the latest release from the Releases page and put the binary in your PATH.

Example for Linux x86_64 (note that other architectures and platforms are also available):

bash
latest=$(curl -s https://api.github.com/repos/VictoriaMetrics-Community/mcp-victoriatraces/releases/latest | grep 'tag_name' | cut -d\" -f4)
wget https://github.com/VictoriaMetrics-Community/mcp-victoriatraces/releases/download/$latest/mcp-victoriatraces_Linux_x86_64.tar.gz
tar axvf mcp-victoriatraces_Linux_x86_64.tar.gz

Docker

You can run VictoriaTraces MCP Server using Docker.

This is the easiest way to get started without needing to install Go or build from source.

bash
docker run -d --name mcp-victoriatraces \
  -e VT_INSTANCE_ENTRYPOINT=https://localhost:10428 \
  -e MCP_SERVER_MODE=http \
  -e MCP_LISTEN_ADDR=:8081 \
  -p 8081:8081 \
  ghcr.io/victoriametrics-community/mcp-victoriatraces

Replace the environment variable values with your own parameters.

Note that MCP_SERVER_MODE=http enables Streamable HTTP mode. More details about server modes can be found in the Configuration section.

See the available Docker images in the GitHub registry.

Also see the Using Docker instead of binary section for details on using the Docker image with clients in stdio mode.
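In Streamable HTTP mode the server speaks JSON-RPC 2.0 over the /mcp endpoint. As a rough sketch of the handshake (normally performed by your MCP client for you; the exact protocol version and capability fields depend on that client), the first message POSTed to /mcp is an initialize request along these lines:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

This snippet is only meant to illustrate what travels over the /mcp endpoint; in practice you point an MCP-capable client at the URL and it handles the rest.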

Source Code

To build the binary from source:

  • Clone repo:

    bash
    git clone https://github.com/VictoriaMetrics-Community/mcp-victoriatraces.git
    cd mcp-victoriatraces
    
  • Build binary from cloned source code:

    bash
    make build
    # after that you can find the mcp-victoriatraces binary and copy it to your PATH or run it in place
    
  • Build image from cloned source code:

    bash
    docker build -t mcp-victoriatraces .
    # after that you can run or push the mcp-victoriatraces Docker image
    

Configuration

MCP Server for VictoriaTraces is configured via environment variables:

| Variable | Description | Required | Default | Allowed values |
|---|---|---|---|---|
| VT_INSTANCE_ENTRYPOINT | URL to VictoriaTraces instance | Yes | - | - |
| VT_INSTANCE_BEARER_TOKEN | Authentication token for VictoriaTraces API | No | - | - |
| VT_INSTANCE_HEADERS | Custom HTTP headers to send with requests (comma-separated key=value pairs) | No | - | - |
| VT_DEFAULT_TENANT_ID | Default tenant ID used when tenant is not specified in requests (format: AccountID:ProjectID or AccountID) | No | 0:0 | - |
| MCP_SERVER_MODE | Server operation mode. See Modes for details. | No | stdio | stdio, sse, http |
| MCP_LISTEN_ADDR | Address for SSE or HTTP server to listen on | No | localhost:8081 | - |
| MCP_DISABLED_TOOLS | Comma-separated list of tools to disable | No | - | - |
| MCP_HEARTBEAT_INTERVAL | Heartbeat interval for the streamable-http protocol. The MCP server sends a heartbeat to the client over the GET connection to keep it from being closed by network infrastructure (e.g. gateways). | No | 30s | - |
| MCP_LOG_FORMAT | Log output format | No | text | text, json |
| MCP_LOG_LEVEL | Minimum log level | No | info | debug, info, warn, error |
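VT_INSTANCE_HEADERS is a plain comma-separated list of key=value pairs. A minimal shell sketch of how such a value decomposes into individual headers (the header names and values below are made-up examples):

```shell
# Hypothetical example value; replace with your own headers.
VT_INSTANCE_HEADERS="X-Scope-OrgID=team-a,X-Custom=hello"

# Split on commas, then on the first '=' of each pair.
IFS=',' read -ra pairs <<< "$VT_INSTANCE_HEADERS"
for pair in "${pairs[@]}"; do
  name=${pair%%=*}   # text before the first '='
  value=${pair#*=}   # text after the first '=' (so values may themselves contain '=')
  echo "$name: $value"
done
```

The server attaches these headers to the requests it sends to your VictoriaTraces instance, which is handy when the instance sits behind an authenticating reverse proxy.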

Modes

MCP Server supports the following modes of operation (transports):

  • stdio - Standard input/output mode, where the server reads commands from standard input and writes responses to standard output. This is the default mode and is suitable for local servers.
  • sse - Server-Sent Events. Server will expose the /sse and /message endpoints for SSE connections.
  • http - Streamable HTTP. Server will expose the /mcp endpoint for HTTP connections.

More info about transports can be found in the MCP docs.

Configuration examples

bash
export VT_INSTANCE_ENTRYPOINT="https://localhost:10428"

# Custom headers for authentication (e.g., behind a reverse proxy)
# Expected syntax is key=value separated by commas
export VT_INSTANCE_HEADERS="<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"

# Server mode
export MCP_SERVER_MODE="sse"
export MCP_LISTEN_ADDR="0.0.0.0:8082"

Endpoints

In SSE and HTTP modes the MCP server provides the following endpoints:

| Endpoint | Description |
|---|---|
| /sse + /message | Endpoints for messages in SSE mode (for MCP clients that support SSE) |
| /mcp | HTTP endpoint for streaming messages in HTTP mode (for MCP clients that support Streamable HTTP) |
| /metrics | Metrics in Prometheus format for monitoring the MCP server |
| /health/liveness | Liveness check endpoint to ensure the server is running |
| /health/readiness | Readiness check endpoint to ensure the server is ready to accept requests |
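Since the server exposes /metrics in Prometheus format, you can scrape it with any Prometheus-compatible agent. A minimal scrape config sketch, assuming the server listens on localhost:8081 (adjust the target to your MCP_LISTEN_ADDR):

```yaml
scrape_configs:
  - job_name: mcp-victoriatraces
    # /metrics is served by the MCP server itself in SSE/HTTP modes
    metrics_path: /metrics
    static_configs:
      - targets: ["localhost:8081"]
```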

Setup in clients

Cursor

Go to: Settings -> Cursor Settings -> MCP -> Add new global MCP server and paste the following configuration into your Cursor ~/.cursor/mcp.json file:

json
{
  "mcpServers": {
    "victoriatraces": {
      "command": "/path/to/mcp-victoriatraces",
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

See Cursor MCP docs for more info.

Claude Desktop

Add this to your Claude Desktop claude_desktop_config.json file (you can find it by opening Settings -> Developer -> Edit config):

json
{
  "mcpServers": {
    "victoriatraces": {
      "command": "/path/to/mcp-victoriatraces",
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

See Claude Desktop MCP docs for more info.

Claude Code

Run the command:

sh
claude mcp add victoriatraces \
  -e VT_INSTANCE_ENTRYPOINT=<YOUR_VT_INSTANCE> \
  -e VT_INSTANCE_BEARER_TOKEN=<YOUR_VT_BEARER_TOKEN> \
  -e VT_INSTANCE_HEADERS="<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>" \
  -- /path/to/mcp-victoriatraces

See Claude Code MCP docs for more info.

Visual Studio Code

Add this to your VS Code MCP config file:

json
{
  "servers": {
    "victoriatraces": {
      "type": "stdio",
      "command": "/path/to/mcp-victoriatraces",
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

See VS Code MCP docs for more info.

Zed

Add the following to your Zed config file:

json
{
  "context_servers": {
    "victoriatraces": {
      "command": {
        "path": "/path/to/mcp-victoriatraces",
        "args": [],
        "env": {
          "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
          "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
          "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
        }
      },
      "settings": {}
    }
  }
}

See Zed MCP docs for more info.

JetBrains IDEs

  • Open Settings -> Tools -> AI Assistant -> Model Context Protocol (MCP).
  • Click Add (+)
  • Select As JSON
  • Paste the following into the input field:
json
{
  "mcpServers": {
    "victoriatraces": {
      "command": "/path/to/mcp-victoriatraces",
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

Windsurf

Add the following to your Windsurf MCP config file.

json
{
  "mcpServers": {
    "victoriatraces": {
      "command": "/path/to/mcp-victoriatraces",
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

See Windsurf MCP docs for more info.

Using Docker instead of binary

You can run VictoriaTraces MCP Server using Docker instead of local binary.

Replace the command in the configuration examples above as follows:

json
{
  "mcpServers": {
    "victoriatraces": {
      "command": "docker",
      "args": [
        "run",
        "-i", "--rm",
        "-e", "VT_INSTANCE_ENTRYPOINT",
        "-e", "VT_INSTANCE_BEARER_TOKEN",
        "-e", "VT_INSTANCE_HEADERS",
        "ghcr.io/victoriametrics-community/mcp-victoriatraces"
      ],
      "env": {
        "VT_INSTANCE_ENTRYPOINT": "<YOUR_VT_INSTANCE>",
        "VT_INSTANCE_BEARER_TOKEN": "<YOUR_VT_BEARER_TOKEN>",
        "VT_INSTANCE_HEADERS": "<HEADER>=<HEADER_VALUE>,<HEADER>=<HEADER_VALUE>"
      }
    }
  }
}

Usage

After installing and configuring the MCP server, you can start using it with your favorite MCP client.

You can start a dialog with your AI assistant with a phrase like:

code
Use MCP VictoriaTraces in the following answers

This isn't required, though: you can just start asking questions, and the assistant will automatically use the tools and documentation to provide the best answers.

Toolset

MCP VictoriaTraces provides numerous tools for interacting with your VictoriaTraces instance.

Here's a list of available tools:

| Tool | Description |
|---|---|
| documentation | Search in embedded VictoriaTraces documentation |
| services | List of all traced services |
| service_names | Get all the span names (operations) of a service |
| traces | Query traces |
| trace | Get trace info by trace ID |
| dependencies | Query the service dependency graph |
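Under the hood, a client invokes these tools with standard MCP tools/call requests. A sketch of what calling the services tool looks like on the wire (the tool's argument schema is defined by the server; the empty arguments object here is an assumption):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "services",
    "arguments": {}
  }
}
```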

Prompts

The server includes pre-defined prompts for common tasks.

These are just examples at the moment; the prompt library will be expanded in the future:

| Prompt | Description |
|---|---|
| documentation | Search VictoriaTraces documentation for specific topics |

Roadmap

  • Implement a multitenant version of the MCP server (supporting several deployments)
  • Add a service graph tool after this feature is released (see the PR)

Disclaimer

AI services and agents, along with MCP servers like this one, cannot guarantee the accuracy, completeness, or reliability of their results. You should double-check any results obtained with AI.

The quality of the MCP server and its responses depends heavily on the capabilities of your client and the quality of the model you are using.

Contributing

Contributions to the MCP VictoriaTraces project are welcome!

Please feel free to submit issues, feature requests, or pull requests.
