DataView
by BytesAgain
Explore CSV and JSON files with quick queries, filters, and aggregation. Use when inspecting data, running queries, filtering rows, or aggregating results.
Installation

claude skill add --url github.com/openclaw/skills/tree/main/skills/bytesagain1/dataview

Documentation

DataView
A data processing toolkit for ingesting, transforming, querying, and managing data entries from the command line. All operations are logged with timestamps and stored locally.
Commands
Data Operations
Each data command works in two modes: run without arguments to view recent entries, or pass input to record a new entry.
| Command | Description |
|---|---|
| `dataview ingest <input>` | Ingest data — record a new ingest entry or view recent ones |
| `dataview transform <input>` | Transform data — record a transformation or view recent ones |
| `dataview query <input>` | Query data — record a query or view recent ones |
| `dataview filter <input>` | Filter data — record a filter operation or view recent ones |
| `dataview aggregate <input>` | Aggregate data — record an aggregation or view recent ones |
| `dataview visualize <input>` | Visualize data — record a visualization or view recent ones |
| `dataview export <input>` | Export data — record an export entry or view recent ones |
| `dataview sample <input>` | Sample data — record a sample or view recent ones |
| `dataview schema <input>` | Schema management — record a schema entry or view recent ones |
| `dataview validate <input>` | Validate data — record a validation or view recent ones |
| `dataview pipeline <input>` | Pipeline management — record a pipeline step or view recent ones |
| `dataview profile <input>` | Profile data — record a profile or view recent ones |
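The dual-mode behavior described above (record with an argument, view without one) can be sketched in plain Bash. This is an illustrative approximation, not the actual implementation; `record_or_view` and the temp log path are hypothetical stand-ins.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical sketch of the two-mode dispatch: with an argument the
# command appends a timestamped entry; without one it shows recent entries.
log_file=$(mktemp)

record_or_view() {
  if [ "$#" -gt 0 ]; then
    # Record mode: append a pipe-delimited timestamp|value entry.
    printf '%s|%s\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$*" >> "$log_file"
  else
    # View mode: show the most recent entries.
    tail -n 20 "$log_file"
  fi
}

record_or_view "loaded sales_report.csv 2500 rows"
record_or_view   # view mode: prints the entry just recorded
```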
Utility Commands
| Command | Description |
|---|---|
| `dataview stats` | Show summary statistics — entry counts per category, total entries, disk usage |
| `dataview export <fmt>` | Export all data to a file (formats: json, csv, txt) |
| `dataview search <term>` | Search all log files for a term (case-insensitive) |
| `dataview recent` | Show last 20 entries from activity history |
| `dataview status` | Health check — version, data directory, entry count, disk usage, last activity |
| `dataview help` | Show available commands |
| `dataview version` | Show version (v2.0.0) |
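A rough sketch of what `dataview search` might do internally: a case-insensitive `grep` across the per-command log files. The directory and file contents below are illustrative stand-ins for `~/.local/share/dataview/`, not the tool's actual internals.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Simulate a data directory with two per-command log files.
data_dir=$(mktemp -d)
printf '2024-05-01T12:00:00|loaded sales_report.csv 2500 rows\n' > "$data_dir/ingest.log"
printf '2024-05-01T12:05:00|top 10 products by revenue\n'        > "$data_dir/query.log"

# Case-insensitive recursive search, as `dataview search` is documented
# to behave; 'SALES' still matches 'sales_report.csv'.
matches=$(grep -ri 'SALES' "$data_dir")
echo "$matches"
```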
Data Storage
All data is stored locally at `~/.local/share/dataview/`:
- Each data command writes to its own log file (e.g., `ingest.log`, `transform.log`)
- Entries are stored as `timestamp|value` pairs (pipe-delimited)
- All actions are tracked in `history.log` with timestamps
- Export generates files in the data directory (`export.json`, `export.csv`, or `export.txt`)
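Given the pipe-delimited `timestamp|value` format, individual entries can be split with plain shell parameter expansion. A minimal sketch, assuming the format described above; the sample line is made up:

```shell
#!/usr/bin/env bash
set -euo pipefail

# One entry in the documented timestamp|value format.
log_line='2024-05-01T12:00:00|loaded sales_report.csv 2500 rows'

ts="${log_line%%|*}"     # everything before the first pipe
value="${log_line#*|}"   # everything after the first pipe

echo "timestamp: $ts"
echo "value: $value"
```

Using `#*|` (shortest match from the front) for the value keeps any pipes inside the value itself intact.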
Requirements
- Bash (with `set -euo pipefail`)
- Standard Unix utilities: `date`, `wc`, `du`, `grep`, `tail`, `cat`, `sed`
- No external dependencies or API keys required
When to Use
- To log and track data processing operations (ingest, transform, query, etc.)
- To maintain a searchable history of data viewing and analysis activities
- To export accumulated records in JSON, CSV, or plain text format
- As part of larger automation or data inspection workflows
- When you need a lightweight, local-only data operation tracker
Examples
# Record a new ingest entry
dataview ingest "loaded sales_report.csv 2500 rows"
# View recent transform entries
dataview transform
# Record a query
dataview query "top 10 products by revenue"
# Filter data
dataview filter "region=APAC"
# Search across all logs
dataview search "sales"
# Export everything as CSV
dataview export csv
# Check overall statistics
dataview stats
# View recent activity
dataview recent
# Health check
dataview status
Powered by BytesAgain | bytesagain.com | hello@bytesagain.com 💬 Feedback & Feature Requests: https://bytesagain.com/feedback