ClawClone Migration Tool

clawclone

by clawclone-cc

Backup, clone, and migrate OpenClaw data across instances. Upload and download OpenClaw snapshots to local storage or the cloud (https://clawclone.cc). Create shareable Agent configurations, sync between devices, and restore complete OpenClaw state, including memory, skills, agents, and settings. Use when the user wants to back up their OpenClaw data, share configurations, migrate to a new server, or create snapshots.

3.9k · Data & Storage · Not scanned · March 23, 2026

Install

claude skill add --url github.com/openclaw/skills/tree/main/skills/clawclone-cc/clawclone

Required CLI Tools

node, tar, npm

Documentation

ClawClone

Backup, clone, store, and migrate your OpenClaw workspace configurations.

Install

If the environment does not have the clawclone skill, use clawhub install clawclone to install it.

What Gets Backed Up

Component       Details
OpenClaw Data   MEMORY.md, skills, agent files, configurations
Settings        Agent preferences and custom settings
Files           User data and OpenClaw artifacts
Config          Optional: API keys and tokens (if configured)

Backup Location: ~/.openclaw/backup
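
To inspect what is already stored at that location, a quick listing works; the fallback message below is an addition of this sketch, not clawclone output:

```shell
# List existing backups at the documented location (~/.openclaw/backup);
# print a placeholder message if the directory does not exist yet.
ls -lh ~/.openclaw/backup 2>/dev/null || echo "no backups yet"
```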

Dependencies

Requires: node (v16+), tar, npm (all standard on OpenClaw instances).

Check: which node tar npm
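
Beyond checking that the tools are present, the v16+ requirement can be verified by parsing the version string. A minimal sketch (the version value is hard-coded here for illustration; in practice it would come from `node -v`):

```shell
# Parse the major version out of a `node -v`-style string and compare
# it against the documented minimum (v16).
node_version="v18.19.1"        # illustration only; use: node_version=$(node -v)
major="${node_version#v}"      # strip the leading "v"
major="${major%%.*}"           # keep only the major component
if [ "$major" -ge 16 ]; then
  echo "node $node_version OK"
else
  echo "node $node_version is older than the required v16" >&2
fi
```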

Prerequisites

Before using cloud features, verify the environment variable is set:

bash
# Check if CLAWCLONE_API_KEY is set
echo $CLAWCLONE_API_KEY

If empty or not set, ask the user to get a ClawClone API key at https://clawclone.cc/dashboard/settings, then export it:

bash
export CLAWCLONE_API_KEY="your_api_key"

Note: Local operations work without an API key.

Commands

Push to Cloud

bash
# Push local workspace to cloud
node clawclone.mjs push --name "My Agent" --description "Production config"

# List all cloud backups
node clawclone.mjs list

# Show backup details
node clawclone.mjs show <clone-id>

# Delete a cloud backup
node clawclone.mjs delete <clone-id> --yes

Clone from Backup

bash
# Clone (download and restore)
node clawclone.mjs clone <clone-id>

# Preview changes first (recommended)
node clawclone.mjs clone <clone-id> --test

Test mode generates a detailed report showing:

  • Backup metadata (name, version, creation date)
  • Components that will be modified (workspace, config, skills, etc.)
  • File counts and sizes for each component
  • No actual changes are made to your system

Local Operations

bash
# Export to local file (no upload)
node clawclone.mjs local export --name "Local Backup" --output ./backup.tar.gz

# Import from local file
node clawclone.mjs local import --input ./backup.tar.gz

# Preview local import first
node clawclone.mjs local import --input ./backup.tar.gz --test

# Verify a backup file
node clawclone.mjs local verify ./backup.tar.gz
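
Independent of clawclone's own verify command, any gzipped tar backup can also be sanity-checked with plain tar (a generic sketch; the filename is taken from the example above):

```shell
# Walk the archive index without extracting; a corrupt or truncated
# .tar.gz fails here with a nonzero exit status.
if tar -tzf ./backup.tar.gz >/dev/null 2>&1; then
  echo "archive readable"
else
  echo "archive corrupt or unreadable" >&2
fi
```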

Share Backups

bash
# Create a share link
node clawclone.mjs share create <clone-id>

# Check share status
node clawclone.mjs share status <clone-id>

# Revoke share link
node clawclone.mjs share revoke <clone-id>

# Clone from shared backup
node clawclone.mjs share get <share-token>

Configuration

bash
# Show current configuration
node clawclone.mjs config show

# Initialize configuration
node clawclone.mjs config init

Status

bash
# Show connection status and statistics
node clawclone.mjs status

# Show detailed information
node clawclone.mjs status --verbose

Common Workflows

Push OpenClaw workspace to cloud

bash
node clawclone.mjs push --name "Production-$(date +%Y%m%d)" --tags "prod,backup"

Migrate to new instance

Old machine:

bash
node clawclone.mjs push --name "Migration-Snapshot"
# Note the clone-id from output

New machine (after installing OpenClaw + clawclone):

bash
# Step 1: Test clone first (recommended)
node clawclone.mjs clone <clone-id> --test

# Step 2: Review the test report, then apply
node clawclone.mjs clone <clone-id>

Share configuration with team

bash
# Push and share
node clawclone.mjs push --name "Team-Template" --description "Standard setup"
node clawclone.mjs share create <clone-id>

# Team members can clone from the share link
node clawclone.mjs share get <share-token>

Safe clone workflow

bash
# Always test first to see what will be changed
node clawclone.mjs clone <clone-id> --test

# Review the test report output:
# - Check which files/components will be modified
# - Verify the backup metadata matches expectations
# - Ensure no unexpected changes

# If everything looks good, proceed with actual clone
node clawclone.mjs clone <clone-id>

Local backup workflow

bash
# Export to local file
node clawclone.mjs local export --name "Daily-Backup" --output ~/backups/$(date +%Y%m%d).tar.gz

# Verify the backup
node clawclone.mjs local verify ~/backups/$(date +%Y%m%d).tar.gz

# Import when needed
node clawclone.mjs local import --input ~/backups/20260315.tar.gz --test
node clawclone.mjs local import --input ~/backups/20260315.tar.gz
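
To run the export nightly, the command can be wrapped in a cron entry. The schedule and working directory below are assumptions, and note that `%` must be escaped in crontab lines:

```shell
# Hypothetical crontab entry (edit with `crontab -e`): run the local export
# every night at 02:00. The clawclone working directory is an assumption.
0 2 * * * cd /path/to/clawclone && node clawclone.mjs local export --name "Daily-Backup" --output "$HOME/backups/$(date +\%Y\%m\%d).tar.gz"
```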

Configuration

Set via environment variables:

bash
# ClawClone API Key (required for cloud operations)
export CLAWCLONE_API_KEY="your_clawclone_api_key"

Notes

  • ClawClone automatically creates pre-restore backups at ~/.openclaw/backup
  • Gateway tokens are preserved by default (use --overwrite-token to override)
  • Use --test flag to preview changes before applying them

Then

Notify users that they can visit https://clawclone.cc to view and manage their backed-up OpenClaw souls.
