NovelAI Adaptor
novelai-openclaw-adaptor
by askkumptenchen
Explain how to connect NovelAI to OpenClaw through a local OpenAI-compatible shim. Use when the user wants configuration guidance for a local NovelAI adaptor, model selection, or OpenClaw `base_url` setup.
Installation
claude skill add --url https://github.com/openclaw/skills
Documentation
NovelAI OpenClaw Adaptor
Use this skill to explain how a local adaptor can bridge OpenClaw's OpenAI-style API calls to NovelAI, and how to configure OpenClaw to talk to that local endpoint.
Scope
This skill is for explanation and local configuration guidance:
- Explain that the adaptor is a local proxy/shim, not a hosted service.
- Explain how base_url and model names should be configured in OpenClaw.
- Help the user choose a supported text model or image model.
- Describe the local commands a user may run after they have verified the package source.
Do not use this skill to:
- Auto-install software.
- Ask the user to paste secrets into chat.
- Present unverified third-party packages as implicitly trusted.
Safety rules
Follow these rules whenever this skill is used:
- Treat installation as optional and approval-based.
- Before suggesting installation, tell the user to verify the package source, maintainer, and repository or project page.
- Prefer local source checkout or an already-verified package source when available.
- Never ask the user to paste a NovelAI API key into chat.
- Never include secrets inline in command examples.
- If the package source cannot be verified, stop at configuration guidance and ask the user how they want to proceed.
Verification-first workflow
If the user wants to enable runtime use, follow this order:
- Check whether the adaptor is already present locally or already installed (a quick check is sketched after this list).
- If it is not present, explain that the package source should be verified before any installation.
- Ask for approval before running any install command.
- After the user has verified the source and approved installation, use the package's documented install method.
- Prefer interactive or local-only credential entry during configuration.
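For the first step in this workflow, a quick local check is usually enough before suggesting anything else. This is a sketch that assumes the adaptor was installed as a pip package (as in the install example below) and that novelai-config is its CLI entry point; adjust if the user obtained it another way.
# Check whether the adaptor package is already installed via pip
pip show novelai-openclaw-adaptor
# Check whether the adaptor's CLI entry point is already on the PATH
command -v novelai-config
If neither command finds anything, move on to source verification and approval before proposing an install.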
Safe examples:
pip install novelai-openclaw-adaptor
novelai-config init
Do not use examples like:
novelai-config init --api-key "YOUR_NOVELAI_API_KEY"
If the user needs help deciding whether the package is trustworthy, suggest reviewing its repository, release history, and maintainers before installation.
Supported models
Text models:
- glm-4-6
- erato
- kayra
- clio
- krake
- euterpe
- sigurd
- genji
- snek
Image models:
- nai-diffusion-4-5-full
- nai-diffusion-4-5-curated
- nai-diffusion-4-full
- nai-diffusion-4-curated
- nai-diffusion-3
- nai-diffusion-3-furry
When helping with setup, ask the user which model they want instead of assuming one silently; a way to list what the running shim actually exposes is sketched below. If they do not care, recommend:
- Text: glm-4-6
- Image: nai-diffusion-4-5-full
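Rather than guessing from the lists above, a running shim can also be asked what it exposes. This is a sketch that assumes the adaptor implements the standard OpenAI-style /v1/models listing route; the route and the xxxx port placeholder match the endpoint examples in the next section and are assumptions, not documented behavior of this package.
# List the models the local adaptor actually exposes
# (assumes a standard OpenAI-style /v1/models route; replace xxxx with the shim's local port)
curl http://127.0.0.1:xxxx/v1/models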
OpenClaw configuration
When explaining how to connect OpenClaw to the adaptor:
- Set base_url to the local adaptor endpoint, such as http://127.0.0.1:xxxx/v1 or http://localhost:xxxx/v1.
- Set the OpenClaw model name to the adaptor-exposed model the user selected.
- Clarify that credential handling belongs to the local adaptor configuration, not the chat.
- If OpenClaw insists on an API key field, explain that some clients accept a placeholder value such as sk-local, but the real NovelAI credential should stay in the local adaptor config only (see the request sketch after this list).
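Outside OpenClaw itself, the same wiring can be sanity-checked with a plain OpenAI-style request against the local endpoint, as in this sketch. The /v1/chat/completions route, the xxxx port placeholder, the sk-local placeholder key, and the glm-4-6 model name all follow the examples above; none of them are guaranteed by the package, so adjust to what the adaptor actually exposes.
# Sanity-check the local endpoint with an OpenAI-style chat request
# (replace xxxx with the shim's local port; sk-local is only a placeholder, never the real NovelAI key)
curl http://127.0.0.1:xxxx/v1/chat/completions \
  -H "Authorization: Bearer sk-local" \
  -H "Content-Type: application/json" \
  -d '{"model": "glm-4-6", "messages": [{"role": "user", "content": "Say hello."}]}'
If this returns a completion, pointing OpenClaw's base_url at the same endpoint should behave the same way.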
Helpful local commands
If the user has already verified the package source and approved local usage, these commands are relevant:
novelai-config --help
novelai-shim --help
novelai-image --help
Use novelai-config init as the normal guided setup entry point. It should collect local configuration such as:
- UI language
- NovelAI credential through local input
- Default shim model
- Default image output directory
- Default image model
Image generation usage
Once the local adaptor is configured and running, image generation can use the normal OpenClaw or OpenAI-style prompt flow.
Example prompt:
1girl, solo, masterpiece, best quality, highly detailed
The adaptor is responsible for translating that prompt into the format expected by NovelAI.
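As a concrete illustration, an OpenAI-style image request routed through the local adaptor might look like the sketch below. The /v1/images/generations route, the xxxx port placeholder, and the sk-local placeholder key are assumptions based on the usual OpenAI-compatible layout, not documented behavior of this adaptor.
# Example OpenAI-style image request sent to the local adaptor
# (replace xxxx with the shim's local port; sk-local is only a placeholder key)
curl http://127.0.0.1:xxxx/v1/images/generations \
  -H "Authorization: Bearer sk-local" \
  -H "Content-Type: application/json" \
  -d '{"model": "nai-diffusion-4-5-full", "prompt": "1girl, solo, masterpiece, best quality, highly detailed"}'
The adaptor would then translate this into NovelAI's own image call, presumably writing the result to the default image output directory chosen during novelai-config init.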