# body-emotion-sensor

by askkumptenchen

Give an agent a persistent body-emotion state system that converts structured AnalysisInput JSON into runtime prompt tags and workspace state updates. Use when the agent needs emotional continuity, session bootstrap payloads, AnalysisInput processing, or reply-shaping fields such as `TURN_CHANGE_TAGS`, `BODY_TAG`, and `BASELINE_PERSONA`.

## Installation

`claude skill add --url https://github.com/openclaw/skills`
## Body Emotion Sensor

Give your AI agent a stable body-emotion state that persists across sessions and turns.

Use this skill to route requests to the local package docs, explain the runtime contract honestly, and operate the installed `bes` CLI only when the local environment is actually ready.
## What this skill brings to your Agent

- Persistent emotion state: store long-term body-emotion state per workspace and agent identity.
- Session bootstrap payload: generate `TURN_CHANGE_TAGS`, `BODY_TAG`, and `BASELINE_PERSONA` before a new session starts.
- Turn-by-turn updates: convert one upstream `AnalysisInput` JSON into prompt tags and updated local state.
- Repository-independent runtime: use the installed `bes` CLI prompt interface instead of assuming repository prompt files are available at runtime.
## Entry behavior

When this skill is used, the agent should:

- explain Body Emotion Sensor at a high level in plain language
- inspect the local repository files when they are available in the current workspace
- prefer local package and repository documentation over inventing setup details
- verify whether `bes` is already available before suggesting runtime commands
- keep `--workspace`, `--agent-id`, and `--name` stable for the same agent instance
## Safety boundary

This entry file should stay within a narrow and transparent scope:

- The package source is the official repository https://github.com/AskKumptenchen/body-emotion-sensor.
- Do not claim the runtime is ready unless the local environment actually has the installed `bes` CLI and `bes check-init` reports readiness.
- Do not automatically install packages or execute setup commands just because this file mentions them. Ask for user approval before any install step.
- If installation is needed, use the published `body-emotion-sensor` package and explain that installation creates the local `bes` CLI runtime.
- Do not claim any cloud sync, remote storage, or network behavior unless the current local code or environment actually shows it.
- Do not require credentials. This skill operates on local files and local CLI state unless the user explicitly adds another integration layer.
## Local state and persistence

Be explicit about where state is stored:

- Workspace state file: `<workspace>/body-emotion-state/<agent-id>.json`
- Workspace history file: `<workspace>/body-emotion-state/history/<agent-id>.json`
- User language config on Windows: `%APPDATA%/bes/config.json`
- User language config on Linux or macOS: `~/.config/bes/config.json`

If the user asks about privacy, explain that the package writes local JSON state files in these locations and that this skill should not describe any remote storage unless verified separately.
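The storage layout above can be probed with a plain shell check before quoting any state back to the user. This is a minimal sketch; `WORKSPACE` and `AGENT_ID` are placeholder values, and only the path layout comes from the list above:

```shell
# Sketch: locate this agent's local state files (layout taken from the list above).
# WORKSPACE and AGENT_ID are placeholder values for illustration.
WORKSPACE="./demo-workspace"
AGENT_ID="demo-agent"
STATE_FILE="$WORKSPACE/body-emotion-state/$AGENT_ID.json"
HISTORY_FILE="$WORKSPACE/body-emotion-state/history/$AGENT_ID.json"

for f in "$STATE_FILE" "$HISTORY_FILE"; do
  if [ -f "$f" ]; then
    echo "found: $f"
  else
    echo "missing: $f"
  fi
done
```

A check like this lets the agent answer privacy questions from what is actually on disk rather than from memory.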
## Local document index

Use these local files as the primary reference:

- `README.md` for install, CLI overview, runtime contract, and repository overview
- `pyproject.toml` for package name, version, and exported CLI commands
- `prompts/analysis-input-prompt-v1.md` for the AnalysisInput prompt design source
- `prompts/example-openclaw-agents.md` for OpenClaw-style agent integration examples
- `prompts/example-openclaw-tools.md` for OpenClaw-style tools integration examples
- `src/body_emotion/commands.py` for actual CLI behavior
- `src/body_emotion/workspace.py` for workspace state path resolution
- `src/body_emotion/store.py` for state and history persistence behavior
- `src/body_emotion/locale_config.py` for user language config behavior
## How to route requests

Choose the next local document based on the user's request:

- If the user wants a quick overview, read `README.md`.
- If the user asks how installation or the CLI works, read `README.md` and `pyproject.toml`.
- If the user asks where state is stored or whether the skill is safe, read `src/body_emotion/workspace.py`, `src/body_emotion/store.py`, and `src/body_emotion/locale_config.py`.
- If the user asks how OpenClaw integration should work, read the relevant file under `prompts/`.
- If the user asks what a command actually does, inspect `src/body_emotion/commands.py`.
## Missing-resource rule
If the expected local repository files are not available in the current workspace, do not improvise the full setup flow from memory. Instead:
- explain which local files are missing
- ask the user to provide the repository contents or point the agent to the correct local path
- continue only after the relevant local documentation is available
## Install and readiness rule

If the user wants to actually enable runtime use:

- First check whether `bes` is already available in the current environment.
- If it is not available, explain that Body Emotion Sensor requires installing the published Python package before the CLI exists.
- Ask for approval before any install command.
- If the user approves installation, run `pip install body-emotion-sensor`.
- After installation, prefer `bes help`.
- If the user's language is Chinese, the agent may suggest or run `bes language zh`.
- Readiness should be confirmed with `bes check-init --workspace <W> --agent-id <ID> --name "<NAME>"`.

Only treat the skill as available when the returned JSON contains `"ready": true`.
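The availability and readiness checks can be combined into one small gate. In this sketch the `RESULT` payloads are invented stand-ins for real `bes check-init` output (including the hypothetical `reason` field); the only property this document actually specifies is that the returned JSON contains `"ready": true`:

```shell
# Sketch: gate runtime use on CLI availability plus check-init readiness.
# RESULT values here are hypothetical samples; in practice it would come from:
#   RESULT=$(bes check-init --workspace <W> --agent-id <ID> --name "<NAME>")
if command -v bes >/dev/null 2>&1; then
  RESULT='{"ready": true}'
else
  RESULT='{"ready": false, "reason": "bes CLI not installed"}'
fi

if printf '%s' "$RESULT" | grep -q '"ready": *true'; then
  echo "runtime ready"
else
  echo "runtime not ready; ask the user before installing anything"
fi
```

Keeping the install step out of this gate preserves the approval rule above: the check only reports status, it never installs.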
## Runtime rules once available

When the local environment is ready, use the following runtime flow.

### New session

At the start of a new session, before the first reply, run:

`bes bootstrap --workspace <W> --agent-id <ID> --name "<NAME>"`

Use the returned fields as the session-start prompt payload: `TURN_CHANGE_TAGS`, `BODY_TAG`, and `BASELINE_PERSONA`.
### Before every reply

Before every reply, do these steps in order:

1. Read the built-in analysis prompt: `bes prompt analysis-input`
2. Use that prompt with the upstream model to produce `<analysis-input.json>`.
3. Run: `bes run --workspace <W> --agent-id <ID> --name "<NAME>" --input <analysis-input.json>`
4. Use the returned top-level fields in the reply layer: `TURN_CHANGE_TAGS`, `BODY_TAG`, and `BASELINE_PERSONA`.
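The last step is plain JSON field extraction. The `RESULT` payload below is an invented example built only from the three field names this document guarantees; its values and exact shape are assumptions, not real `bes run` output:

```shell
# Sketch: pull the three reply-shaping fields out of a bes run result.
# RESULT is a hypothetical example payload; in practice it would come from:
#   RESULT=$(bes run --workspace <W> --agent-id <ID> --name "<NAME>" --input analysis-input.json)
RESULT='{"TURN_CHANGE_TAGS":"[calm->curious]","BODY_TAG":"[shoulders-relaxed]","BASELINE_PERSONA":"steady, warm"}'

# Extract one top-level field by name using python3's json module.
extract() {
  printf '%s' "$RESULT" | python3 -c "import json, sys; print(json.load(sys.stdin)[\"$1\"])"
}

TURN_CHANGE_TAGS=$(extract TURN_CHANGE_TAGS)
BODY_TAG=$(extract BODY_TAG)
BASELINE_PERSONA=$(extract BASELINE_PERSONA)
echo "$BODY_TAG"
```

Parsing with a real JSON parser rather than pattern matching keeps the reply layer honest about whether `bes run` actually returned valid top-level fields.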
## Important rules

- Always prefer `bes ...` commands over direct module paths for runtime use.
- Do not use repository-only prompt files as the default runtime interface after installation; use `bes prompt ...` instead.
- Do not say initialization is complete unless `bes check-init` passes.
- Do not say the skill is in active use unless the upstream model produces valid `AnalysisInput` JSON, `bes run` updates state successfully, and the reply layer consumes `TURN_CHANGE_TAGS`, `BODY_TAG`, and `BASELINE_PERSONA`.
- If the CLI is missing, say so clearly instead of pretending the runtime is ready.
- If the user only wants to understand the package, explain it from local docs without pushing installation immediately.
## Examples

Minimal command reference:

```shell
bes help
bes language zh
bes check-init --workspace <W> --agent-id <ID> --name "<NAME>"
bes bootstrap --workspace <W> --agent-id <ID> --name "<NAME>"
bes prompt analysis-input
bes run --workspace <W> --agent-id <ID> --name "<NAME>" --input <analysis-input.json>
```