AWS MCP Server

by aws

AWS MCP Server gives AI secure access to AWS resources and capabilities through documentation, API calls, and SOP workflows. It spares teams the work of hand-building integrations, making it a good fit for anyone wiring cloud resources into agents.

README

MCP Proxy for AWS

Overview

The MCP Proxy for AWS package provides two ways to connect AI applications to MCP servers on AWS:

  1. Using it as a proxy - It becomes a lightweight, client-side bridge between MCP clients (AI assistants like Claude Desktop, Kiro CLI) and MCP servers on AWS. (See MCP Proxy)
  2. Using it as a library - Programmatically connect popular AI agent frameworks (LangChain, LlamaIndex, Strands Agents, etc.) to MCP servers on AWS. (See Programmatic Access)

When Do You Need This Package?

  • You want to connect to MCP servers on AWS (e.g., using Amazon Bedrock AgentCore) that use AWS IAM authentication (SigV4) instead of OAuth
  • You're using MCP clients (like Claude Desktop, Kiro CLI) that don't natively support AWS IAM authentication
  • You're building AI agents with popular frameworks like LangChain, Strands Agents, LlamaIndex, etc., that need to connect to MCP servers on AWS
  • You want to avoid building custom SigV4 request signing logic yourself

How This Package Helps

The Problem: The official MCP specification supports OAuth-based authentication, but MCP servers on AWS can also use AWS IAM authentication (SigV4). Standard MCP clients don't know how to sign requests with AWS credentials.

The Solution: This package bridges that gap by:

  • Handling SigV4 authentication automatically - Uses your local AWS credentials (from AWS CLI, environment variables, or IAM roles) to sign all MCP requests using SigV4
  • Providing seamless integration - Works with existing MCP clients and frameworks
  • Eliminating custom code - No need to build your own MCP client with SigV4 signing logic

Which Feature Should I Use?

Use as a proxy if you want to:

  • Connect MCP clients like Claude Desktop or Kiro CLI to MCP servers on AWS with IAM credentials
  • Add MCP servers on AWS to your AI assistant's configuration
  • Use a command-line tool that runs as a bridge between your MCP client and AWS

Use as a library if you want to:

  • Build AI agents programmatically using popular frameworks like LangChain, Strands Agents, or LlamaIndex
  • Integrate AWS IAM-secured MCP servers directly into your Python applications
  • Have fine-grained control over the MCP session lifecycle in your code

Prerequisites

  • AWS credentials configured locally (via the AWS CLI, environment variables, or an IAM role) with permission to invoke the target MCP server
  • uv installed (for running with uvx or from a local checkout), or Docker
  • The SigV4 MCP endpoint URL of the server you want to connect to

MCP Proxy

The MCP Proxy serves as a lightweight, client-side bridge between MCP clients (AI assistants and developer tools) and IAM-secured MCP servers on AWS. The proxy handles SigV4 authentication using local AWS credentials and provides dynamic tool discovery.

Installation

Using PyPI

bash
# Run the server
uvx mcp-proxy-for-aws@latest <SigV4 MCP endpoint URL>

Note: The first run may take tens of seconds as uvx downloads and caches dependencies. Subsequent runs will start in seconds. Actual startup time depends on your network and hardware.

Using a local repository

bash
git clone https://github.com/aws/mcp-proxy-for-aws.git
cd mcp-proxy-for-aws
uv run mcp_proxy_for_aws/server.py <SigV4 MCP endpoint URL>

Using Docker

Docker images are published to the public AWS ECR registry.

You can use the pre-built image:

bash
# Pull the latest image
docker pull public.ecr.aws/mcp-proxy-for-aws/mcp-proxy-for-aws:latest

# Or pull a specific version
docker pull public.ecr.aws/mcp-proxy-for-aws/mcp-proxy-for-aws:1.1.6

Or build the image locally:

bash
# Build the Docker image
docker build -t mcp-proxy-for-aws .

Configuration Parameters

| Parameter | Description | Default | Required |
|-----------|-------------|---------|----------|
| endpoint | MCP endpoint URL (e.g., https://your-service.us-east-1.amazonaws.com/mcp) | N/A | Yes |
| --service | AWS service name for SigV4 signing | Inferred from the endpoint if not provided | No |
| --profile | AWS profile to use for AWS credentials | Uses the AWS_PROFILE environment variable if not set | No |
| --region | AWS region to use | Uses the AWS_REGION environment variable if not set; defaults to us-east-1 | No |
| --metadata | Metadata to inject into MCP requests as key=value pairs (e.g., --metadata KEY1=value1 KEY2=value2) | AWS_REGION is injected automatically based on --region if not provided | No |
| --read-only | Disable tools that may require write permissions (tools that do not require write permissions are annotated with readOnlyHint=true) | False | No |
| --retries | Number of retries when calling upstream services; 0 disables retries | 0 | No |
| --log-level | Logging level (DEBUG/INFO/WARNING/ERROR/CRITICAL) | INFO | No |
| --timeout | Overall timeout in seconds across all operations | 180 | No |
| --connect-timeout | Connect timeout in seconds | 60 | No |
| --read-timeout | Read timeout in seconds | 120 | No |
| --write-timeout | Write timeout in seconds | 180 | No |

Optional Environment Variables

Set the environment variables for the MCP Proxy for AWS:

bash
# Credentials through profile
export AWS_PROFILE=<aws_profile>

# Credentials through parameters
export AWS_ACCESS_KEY_ID=<access_key_id>
export AWS_SECRET_ACCESS_KEY=<secret_access_key>
export AWS_SESSION_TOKEN=<session_token>

# AWS Region
export AWS_REGION=<aws_region>

Setup Examples

Add the following configuration to your MCP client config file (e.g., for Kiro CLI, edit ~/.kiro/settings/mcp.json). Note: replace <SigV4 MCP endpoint URL> with your own endpoint.

Running from a local checkout using uv

json
{
  "mcpServers": {
    "<mcp server name>": {
      "disabled": false,
      "type": "stdio",
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp_proxy_for_aws",
        "run",
        "server.py",
        "<SigV4 MCP endpoint URL>",
        "--service",
        "<your service code>",
        "--profile",
        "default",
        "--region",
        "us-east-1",
        "--read-only",
        "--log-level",
        "INFO",
      ]
    }
  }
}

[!NOTE] Cline users should not use the --log-level argument, because Cline checks stderr log messages for the text "error" (case-insensitive).

Using Docker

Using the pre-built public ECR image:

json
{
  "mcpServers": {
    "<mcp server name>": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "--volume",
        "/full/path/to/.aws:/app/.aws:ro",
        "public.ecr.aws/mcp-proxy-for-aws/mcp-proxy-for-aws:latest",
        "<SigV4 MCP endpoint URL>"
      ],
      "env": {}
    }
  }
}

Or using a locally built image:

json
{
  "mcpServers": {
    "<mcp server name>": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "--volume",
        "/full/path/to/.aws:/app/.aws:ro",
        "mcp-proxy-for-aws",
        "<SigV4 MCP endpoint URL>"
      ],
      "env": {}
    }
  }
}

Programmatic Access

The MCP Proxy for AWS enables programmatic integration of IAM-secured MCP servers into AI agent frameworks. The library provides authenticated transport layers that work with popular Python AI frameworks.

By default, the library resolves AWS credentials automatically from the standard boto3 credential chain (environment variables, shared credentials file, etc.). You can optionally pass credentials programmatically via the credentials parameter. When provided, these take precedence over the aws_profile parameter. Note that aws_region must be explicitly specified when using credentials.

python
from botocore.credentials import Credentials
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

creds = Credentials(access_key="...", secret_key="...", token="...")

mcp_client = aws_iam_streamablehttp_client(
    endpoint=mcp_url,
    aws_region=region,
    aws_service=service,
    credentials=creds,  # Optional: explicitly pass AWS credentials
)

Integration Patterns

The library supports two integration patterns depending on your framework:

Pattern 1: Client Factory Integration

Use with: Frameworks that accept a factory function that returns an MCP client, e.g. Strands Agents, Microsoft Agent Framework. The aws_iam_streamablehttp_client is passed as a factory to the framework, which handles the connection lifecycle internally.

Example - Strands Agents:

python
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client
from strands import Agent
from strands.tools.mcp import MCPClient

mcp_client_factory = lambda: aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

with MCPClient(mcp_client_factory) as mcp_client:
    mcp_tools = mcp_client.list_tools_sync()
    agent = Agent(tools=mcp_tools, ...)

Example - Microsoft Agent Framework:

python
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

mcp_client_factory = lambda: aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

mcp_tools = MCPStreamableHTTPTool(name="MCP Tools", url=mcp_url)
mcp_tools.get_mcp_client = mcp_client_factory

async with mcp_tools:
    agent = ChatAgent(tools=[mcp_tools], ...)

Pattern 2: Direct MCP Session Integration

Use with: Frameworks that require direct access to the MCP sessions, e.g. LangChain, LlamaIndex. The aws_iam_streamablehttp_client provides the authenticated transport streams, which are then used to create an MCP ClientSession.

Example - LangChain:

python
from langchain_mcp_adapters.tools import load_mcp_tools
from mcp import ClientSession
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

mcp_client = aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

async with mcp_client as (read, write, session_id_callback):
    async with ClientSession(read, write) as session:
        mcp_tools = await load_mcp_tools(session)
        agent = create_langchain_agent(tools=mcp_tools, ...)

Example - LlamaIndex:

python
from llama_index.tools.mcp import McpToolSpec
from mcp import ClientSession
from mcp_proxy_for_aws.client import aws_iam_streamablehttp_client

mcp_client = aws_iam_streamablehttp_client(
    endpoint=mcp_url,    # The URL of the MCP server
    aws_region=region,   # The region of the MCP server
    aws_service=service  # The underlying AWS service, e.g. "bedrock-agentcore"
)

async with mcp_client as (read, write, session_id_callback):
    async with ClientSession(read, write) as session:
        mcp_tools = await McpToolSpec(client=session).to_tool_list_async()
        agent = ReActAgent(tools=mcp_tools, ...)

Running Examples

Explore complete working examples for different frameworks in the ./examples/mcp-client directory.

Run examples individually:

bash
cd examples/mcp-client/[framework]  # e.g. examples/mcp-client/strands
uv run main.py

Installation

The client library is included when you install the package:

bash
pip install mcp-proxy-for-aws

For development:

bash
git clone https://github.com/aws/mcp-proxy-for-aws.git
cd mcp-proxy-for-aws
uv sync

Troubleshooting

Handling Authentication Error - Invalid Credentials

The proxy tries to autodetect the service code from the endpoint URL, and this inference can fail. Ensure that --service is set to the service you are connecting to; otherwise the target service cannot verify the SigV4 signature, which produces this error. Also confirm that valid IAM credentials are available on your machine before retrying.
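For intuition, inferring a service code from an endpoint URL might look like the sketch below. This is an illustration only, not the proxy's actual inference logic, which may differ and handle more URL shapes.

```python
# Illustration only: one plausible way to read a service code out of an
# AWS-style endpoint hostname such as <service>.<region>.amazonaws.com.
# The proxy's real inference logic may differ.
from urllib.parse import urlparse

def guess_service(endpoint: str) -> str:
    host = urlparse(endpoint).hostname or ""
    return host.split(".")[0]  # first hostname label as the service code

print(guess_service("https://bedrock-agentcore.us-east-1.amazonaws.com/mcp"))
# -> bedrock-agentcore
```

If your endpoint's hostname does not lead with the service code like this, passing --service explicitly is the safe choice.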

Development & Contributing

For development setup, testing, and contribution guidelines, see:

Resources to understand SigV4:

License

Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the "License").

Disclaimer

LLMs are non-deterministic and make mistakes; we advise you to thoroughly test and to follow your organization's best practices before using these tools on customer-facing accounts. Users of this package are solely responsible for implementing proper security controls and MUST use AWS Identity and Access Management (IAM) to manage access to AWS resources. You are responsible for configuring appropriate IAM policies, roles, and permissions, and any security vulnerabilities resulting from improper IAM configuration are your sole responsibility. By using this package, you acknowledge that you have read and understood this disclaimer and agree to use the package at your own risk.

<!-- mcp-name: io.github.aws/mcp-proxy-for-aws --> <!-- mcp-name: io.github.aws/aws-mcp --> <!-- mcp-name: aws.api.us-east-1.eks-mcp/server --> <!-- mcp-name: aws.api.us-east-1.ecs-mcp/server -->
