BackupTool
by BytesAgain
Create timestamped backups with compression, rotation, and scheduling. Use when archiving directories, restoring snapshots, or managing retention.
Installation
claude skill add --url github.com/openclaw/skills/tree/main/skills/ckchzh/backuptool
Documentation
BackupTool
Sysops toolkit for logging, tracking, and managing system operations entries. Each command records timestamped entries to individual log files and supports viewing recent entries, searching, exporting, and statistics.
Commands
| Command | Description |
|---|---|
| backuptool scan <input> | Record a scan entry; without args shows recent scan entries |
| backuptool monitor <input> | Record a monitor entry; without args shows recent monitor entries |
| backuptool report <input> | Record a report entry; without args shows recent report entries |
| backuptool alert <input> | Record an alert entry; without args shows recent alert entries |
| backuptool top <input> | Record a top entry; without args shows recent top entries |
| backuptool usage <input> | Record a usage entry; without args shows recent usage entries |
| backuptool check <input> | Record a check entry; without args shows recent check entries |
| backuptool fix <input> | Record a fix entry; without args shows recent fix entries |
| backuptool cleanup <input> | Record a cleanup entry; without args shows recent cleanup entries |
| backuptool backup <input> | Record a backup entry; without args shows recent backup entries |
| backuptool restore <input> | Record a restore entry; without args shows recent restore entries |
| backuptool log <input> | Record a log entry; without args shows recent log entries |
| backuptool benchmark <input> | Record a benchmark entry; without args shows recent benchmark entries |
| backuptool compare <input> | Record a compare entry; without args shows recent compare entries |
| backuptool stats | Show summary statistics across all log files (entry counts, data size) |
| backuptool export <fmt> | Export all data in json, csv, or txt format |
| backuptool search <term> | Search all log files for a term (case-insensitive) |
| backuptool recent | Show last 20 lines from history.log |
| backuptool status | Show health check: version, entry count, disk usage, last activity |
| backuptool help | Show help message with all available commands |
| backuptool version | Show version (v2.0.0) |
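The `stats` command's summary (per-file entry counts plus a total) can be approximated with standard tools. This is a minimal sketch, not the tool's implementation: the log directory, file contents, and one-entry-per-line layout here are assumptions for illustration.

```shell
# Sketch of a stats-style summary: count entries per .log file and in total.
# Uses a temp dir as a stand-in for the real data directory.
data_dir=$(mktemp -d)

# Seed two example log files (one timestamped entry per line, assumed format).
printf '%s\n' "2024-01-01 00:00:00 scan ok" "2024-01-01 00:05:00 scan ok" > "$data_dir/scan.log"
printf '%s\n' "2024-01-01 01:00:00 backup done" > "$data_dir/backup.log"

total_entries=0
for log in "$data_dir"/*.log; do
  count=$(( $(wc -l < "$log") ))        # normalize wc's padded output
  total_entries=$(( total_entries + count ))
  printf '%s: %d entries\n' "$(basename "$log")" "$count"
done
printf 'total: %d entries\n' "$total_entries"
```

The same loop extended with `du -sh "$data_dir"` would cover the "data size" half of the summary.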
Data Storage
Data is stored in ~/.local/share/backuptool/.
Each command writes timestamped entries to its own .log file (e.g., scan.log, backup.log, restore.log). All actions are also recorded in history.log.
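The dual-write pattern described above (one per-command log plus a global history.log) can be sketched in a few lines of bash. The `record` helper, the timestamp format, and the use of a temp directory in place of ~/.local/share/backuptool are all assumptions for illustration, not the tool's actual code.

```shell
# Sketch of the dual-write logging pattern: each entry is appended to the
# command's own log file AND to a shared history.log.
data_dir=$(mktemp -d)   # stand-in for ~/.local/share/backuptool

record() {
  cmd=$1; shift
  entry="$(date '+%Y-%m-%d %H:%M:%S') $*"
  printf '%s\n' "$entry" >> "$data_dir/$cmd.log"                 # per-command log
  printf '[%s] %s\n' "$cmd" "$entry" >> "$data_dir/history.log"  # global history
}

record backup "/var/www full backup completed"
record scan "disk usage at 85% on /dev/sda1"
```

Because every action also lands in history.log, commands like `recent` only need to tail one file.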
Requirements
- Bash 4+
When to Use
- Logging backup and restore operations with timestamps for audit trails
- Tracking system scans, alerts, and monitoring events
- Searching historical operations logs for specific incidents or keywords
- Exporting accumulated operations data in JSON, CSV, or TXT format
- Getting summary statistics on all tracked system activities
Examples
# Record a backup operation
backuptool backup "/var/www full backup completed"
# Record a scan finding
backuptool scan "disk usage at 85% on /dev/sda1"
# Search all logs for a keyword
backuptool search "disk"
# Export all data as CSV
backuptool export csv
# View health check status
backuptool status
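The `search` example above behaves like a case-insensitive grep across all log files. The sketch below approximates it with plain grep; the file layout and sample entries are illustrative assumptions, not the tool's internals.

```shell
# Approximation of `backuptool search <term>`: case-insensitive match
# across every .log file in the data directory.
data_dir=$(mktemp -d)
echo "2024-01-01 00:00:00 Disk usage at 85%" > "$data_dir/scan.log"
echo "2024-01-01 01:00:00 backup completed"  > "$data_dir/backup.log"

search() {
  grep -i -- "$1" "$data_dir"/*.log   # filename prefix shows which log matched
}

matches=$(search "disk")
```

Searching for "disk" here returns only the scan.log line, prefixed with its filename.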
Powered by BytesAgain | bytesagain.com | hello@bytesagain.com
Related Skills
Migration Architect
by alirezarezvani
Plans phased, zero-downtime migrations for databases, APIs, and infrastructure; validates compatibility and risks up front and produces rollback strategies, validation gates, and timelines, suited to smooth cutovers of complex systems.
✎ For database and storage migrations, use it to organize schema and data-move workflows in one place; the architectural view is more complete and complex migrations go more smoothly.
Database Modeling
by alirezarezvani
Turns requirements into relational database schemas and generates migration scripts, TypeScript/Python types, seed data, RLS policies, and index plans; suited to multi-tenant, audit-trail, and soft-delete backend modeling and schema reviews.
✎ Puts schema design, ER diagramming, and SQL modeling in one place, so even complex business domains converge on a unified data model quickly, with far fewer rework detours.
Senior Data Engineer
by alirezarezvani
Focused on production-grade data engineering: ETL/ELT, batch and streaming pipelines, data modeling, Airflow/dbt/Spark tuning, and data quality governance; suited to designing data architectures, building a modern data stack, and troubleshooting performance issues.
✎ Hand it complex data pipelines, ETL/ELT, and governance problems; drawing on experience with Spark, Airflow, dbt, and the modern data stack, it builds scalable data infrastructure more reliably.
Related MCP Services
PostgreSQL Database
Editor's Pick · by Anthropic
PostgreSQL is an MCP server that lets Claude query and manage your database directly.
✎ This server removes the pain of hand-writing SQL queries and is well suited to data analysts or backend developers who want to explore a database schema quickly. As a reference implementation, though, assess the security risks before production use, and don't expect it to handle complex transactions.
SQLite Database
Editor's Pick · by Anthropic
SQLite is an MCP server that lets AI query local databases directly for data analysis.
✎ This server solves the problem of AI having no direct access to SQLite databases and suits developers who need to analyze local datasets quickly. As a reference implementation it may lack production-grade security features; use it in a controlled environment.
Firecrawl Smart Crawler
Editor's Pick · by Firecrawl
Firecrawl is an MCP server that lets AI scrape web pages and extract structured data directly.
✎ It removes the hassle of writing crawlers by hand and gives Claude direct access to dynamic page content. Best for researchers or developers who need real-time data, such as monitoring competitor prices or scraping news. Note that it relies on a third-party API, which may raise privacy and cost concerns.