io.github.ricardo-hdrn/mcp-await
Coding & Debugging · by ricardo-hdrn
A condition-waiting tool for AI assistants that watches for ports, files, URLs, processes, and other conditions to become ready.
README
mcp-await
Condition watcher MCP server + CLI for AI CLI assistants (Claude Code, Codex, Cursor, etc.).
Instead of polling with sleep loops and curl --retry that waste API round-trips, call a wait tool once — it blocks until the condition is met and returns the result.
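The difference is easiest to see side by side. A minimal sketch of the polling workaround versus the single call, using a hypothetical flag file (the `/tmp` path and background "deploy" are stand-ins for illustration):

```shell
# Illustration only: the flag file and the backgrounded "deploy" are stand-ins.
FLAG=/tmp/mcp-await-demo.flag
rm -f "$FLAG"

# The sleep-loop workaround: poll until the flag file appears.
( sleep 1; touch "$FLAG" ) &        # simulate a deploy finishing
until [ -f "$FLAG" ]; do sleep 1; done
RESULT="condition met"
echo "$RESULT"

# With mcp-await, the loop above collapses to one blocking call
# (binary assumed to be on PATH):
#   mcp-await file "$FLAG" --event create --timeout 30
rm -f "$FLAG"
```

Each loop iteration in the polling version is a round-trip the assistant has to spend; the single wait call is one.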

Installation
# Prebuilt binary (Linux, macOS, Windows) — download from GitHub Releases
# https://github.com/ricardo-hdrn/mcp-await/releases/latest
# From crates.io
cargo install mcp-await
# From source
git clone https://github.com/ricardo-hdrn/mcp-await.git
cd mcp-await
cargo build --release
Quick Start
# Wait for a service to be ready
mcp-await port localhost 8080 --timeout 30
# Wait for a file to appear
mcp-await file /tmp/deploy.lock --event create --timeout 60
# Wait for a command to succeed
mcp-await cmd "curl -sf http://localhost:8080/health" --interval 2 --timeout 30
Tools
| Tool | Key Params | How it watches |
|---|---|---|
| wait_for_port | host, port | TCP dial loop, 500ms interval |
| wait_for_file | path, event (create/modify/delete) | inotify via notify crate, no polling |
| wait_for_url | url, expected_status (default 200) | curl loop, 2s interval (requires curl) |
| wait_for_pid | pid | /proc/{pid} check, 500ms interval |
| wait_for_docker | container | docker wait (requires docker) |
| wait_for_gh_run | run_id, repo (optional) | gh run watch (requires gh) |
| wait_for_command | command, interval_seconds (default 5) | Re-run via sh -c until exit 0 |
| cancel_watch | watch_id | Cancels a non-blocking watch |
All tools accept timeout_seconds (default: 300) and blocking (default: true).
CLI Usage
The binary doubles as a standalone CLI tool:
# TCP port
mcp-await port localhost 5432 --timeout 30
# File events
mcp-await file /var/log/app.log --event modify --timeout 120
mcp-await file /tmp/flag --event create --timeout 60
mcp-await file /tmp/old.pid --event delete --timeout 30
# HTTP status
mcp-await url https://api.example.com/health --status 200 --timeout 120
# Process exit
mcp-await pid 12345 --timeout 300
# Docker container exit
mcp-await docker my-container --timeout 600
# GitHub Actions run
mcp-await gh-run 12345678 --repo owner/repo --timeout 1800
# Arbitrary shell command (exit 0 = success)
mcp-await cmd "test -f /tmp/ready" --interval 2 --timeout 30
Exit Codes
| Code | Meaning |
|---|---|
| 0 | Condition met (success) |
| 1 | Timeout |
| 2 | Error |
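In a script, the exit code can drive branching. A minimal sketch; the `simulate_wait` stub is hypothetical and stands in for a real call such as `mcp-await port localhost 8080 --timeout 30`, so the control flow is visible without the binary installed:

```shell
# Stub that mimics mcp-await exiting with code 1 (timeout).
# Swap in the real command, e.g.:
#   mcp-await port localhost 8080 --timeout 30
simulate_wait() { return 1; }

simulate_wait
case $? in
  0) outcome="condition met" ;;
  1) outcome="timed out" ;;
  2) outcome="error" ;;
  *) outcome="unexpected exit code" ;;
esac
echo "$outcome"
```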
Output Format
All commands output JSON:
{
  "status": "success",
  "elapsed_seconds": 1.23,
  "detail": "localhost:8080 is accepting connections"
}
MCP Server Setup
Claude Code
Add to ~/.claude.json:
{
  "mcpServers": {
    "await": {
      "command": "/path/to/mcp-await"
    }
  }
}
The binary runs as a stdio MCP server when invoked without a subcommand (or with mcp-await serve).
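For project-scoped setup, Claude Code can also read a `.mcp.json` checked into the repository root; a sketch with the same `mcpServers` shape (the binary path and the explicit `serve` argument are assumptions about your install):

```json
{
  "mcpServers": {
    "await": {
      "command": "/path/to/mcp-await",
      "args": ["serve"]
    }
  }
}
```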
MCP Inspector
npx @modelcontextprotocol/inspector ./target/release/mcp-await
Blocking vs Non-Blocking Mode
Blocking (default)
The tool call holds until the condition is met, times out, or is cancelled. This is the simplest mode — the AI assistant waits for the result.
Non-Blocking
Set blocking: false to get an immediate response with a watch_id and resource URI. The server monitors in the background and pushes a notification when done.
Flow:
1. Call `wait_for_port` with `blocking: false`
2. Get back immediately: `{"watch_id": "port-1", "resource": "watch://port-1", "status": "watching"}`
3. Do other work while waiting
4. Receive `notifications/resources/updated` when the condition is met
5. Read `watch://port-1` for the full result
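On the wire this is ordinary MCP JSON-RPC over stdio. A sketch of the initial tool call and its immediate response, assuming the standard `tools/call` result shape (exact wrapping may differ by client; values mirror the example above):

```json
{"jsonrpc": "2.0", "id": 1, "method": "tools/call",
 "params": {"name": "wait_for_port",
            "arguments": {"host": "localhost", "port": 8080, "blocking": false}}}
```

The server answers at once with a watch handle instead of a final result:

```json
{"jsonrpc": "2.0", "id": 1,
 "result": {"content": [{"type": "text",
   "text": "{\"watch_id\": \"port-1\", \"resource\": \"watch://port-1\", \"status\": \"watching\"}"}]}}
```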
Cancellation
Cancel any non-blocking watch with cancel_watch:
{"watch_id": "port-1"}
Resources
Non-blocking watches are exposed as MCP resources at watch://{watch_id}.
- `list_resources` — returns all active and completed watches
- `read_resource("watch://port-1")` — returns JSON with the watch status and result
Reinforcing Agent Usage
Agents sometimes fall back to shell workarounds instead of using mcp-await. Add a snippet like the one below to your project's agent instructions file to reinforce the behavior:
| Agent | Instructions file |
|---|---|
| Claude Code | CLAUDE.md |
| Codex | AGENTS.md |
| Gemini CLI | GEMINI.md |
| Cursor | .cursor/rules/ |
| Windsurf | .windsurfrules |
## Waiting for conditions
Use mcp-await tools instead of shell workarounds:
- `wait_for_port` instead of `while ! nc ...; do sleep 1; done`
- `wait_for_url` instead of `curl --retry`
- `wait_for_file` instead of polling loops
- `wait_for_command` instead of `while ! cmd; do sleep N; done`
Prefer `blocking: false` when there is independent work to do in parallel.
Roadmap
- Agent-level instructions to enforce mcp-await usage across all agents (system prompts, tool metadata hints)
- Broader agent compatibility testing (Codex, Gemini CLI, Cursor, Windsurf)
Development
cargo build # debug build
cargo build --release # release build
cargo test # run tests
cargo clippy # lint
cargo fmt # format
License