.faf - Universal Project Context

AI & Agents

by wolfe-jam

Universal project context for any MCP platform, with automatic CLI detection and compatibility with 50 MCP tools.


README

<div style="display: flex; align-items: center; gap: 12px;"> <img src="https://www.faf.one/orange-smiley.svg" alt="FAF" width="40" /> <div> <h1 style="margin: 0; color: #FF8C00;">faf-mcp</h1> <p style="margin: 4px 0 0 0;"><strong>v2.0.0 — The Interop MCP for Context</strong></p> </div> </div>

The MCP you didn't realise you needed, or wanted but didn't know who to ask, is here. Building on 36,000+ downloads across Claude and now Gemini, we bring you faf-mcp v2.0.0 to cure your syncing pain and fuel your chosen AI with optimized context, on-demand.

The only IANA-Registered Format for AI Context · application/vnd.faf+yaml



Define once. Sync everywhere.

You maintain .cursorrules. Your teammate uses AGENTS.md. Someone on the team just switched to Gemini. Every AI tool wants its own context file — and they all say the same thing in different formats.

faf-mcp is the dedicated MCP server for Cursor, Windsurf, Cline, VS Code, and every non-Claude platform. One .faf file in your repo, synced to every format your team needs.

```
                      project.faf
                           │
          ┌────────┬───────┴───────┬────────────┐
          ▼        ▼               ▼            ▼
      CLAUDE.md  AGENTS.md  .cursorrules  GEMINI.md
      (Claude)   (Codex)      (Cursor)    (Gemini)
```
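The hub file itself is plain YAML (the registered media type is `application/vnd.faf+yaml`). The schema is not reproduced in this README, so the sketch below is illustrative only: every field name in it is an assumption. Run `faf_init` to generate a real file.

```yaml
# project.faf - hypothetical sketch, NOT the official schema.
# Use faf_init / npx faf-mcp to generate the real thing.
project:
  name: my-app
  goal: Payments dashboard for the internal finance team
stack:
  language: TypeScript
  framework: Next.js
rules:
  - Prefer server components
  - No new dependencies without review
```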

Quick Start

```bash
npx faf-mcp
```

Add to your MCP config:

```json
{
  "mcpServers": {
    "faf": {
      "command": "npx",
      "args": ["-y", "faf-mcp"]
    }
  }
}
```
| Platform | Config File |
|----------|-------------|
| Cursor | `~/.cursor/mcp.json` |
| Windsurf | `~/.codeium/windsurf/mcp_config.json` |
| Cline | Cline MCP settings |
| VS Code | MCP extension config |
| Claude Desktop | Use `claude-faf-mcp` |
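On macOS/Linux the Cursor registration can be scripted. This is a sketch under the assumption that no `~/.cursor/mcp.json` exists yet; it overwrites rather than merges, so back up any existing config first:

```shell
# Sketch: install the faf server entry into Cursor's MCP config.
# WARNING: overwrites any existing mcp.json - merge by hand if you have one.
CONFIG_DIR="${CURSOR_CONFIG_DIR:-$HOME/.cursor}"
mkdir -p "$CONFIG_DIR"
cat > "$CONFIG_DIR/mcp.json" <<'EOF'
{
  "mcpServers": {
    "faf": { "command": "npx", "args": ["-y", "faf-mcp"] }
  }
}
EOF
```

Restart Cursor afterwards so it picks up the new server.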

Three Ways to Deploy

| Door | Method | Best For |
|------|--------|----------|
| Hosted | mcpaas.live | Zero-install, point any MCP client to the URL |
| Self-Deploy | Deploy to Vercel | Your own instance, full control |
| Local | `npx faf-mcp` | IDE integration via stdio transport |

Hosted (mcpaas.live)

Point your MCP client to https://mcpaas.live/sse — no install, no config, no maintenance. Served from 300+ Cloudflare edges with sub-ms cold starts via a 2.7KB Zig-WASM engine.
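Most MCP clients that support remote servers take a URL instead of a command. The key name varies by client, so the `url` field below is a generic assumption; check your client's remote-transport docs for its exact syntax:

```json
{
  "mcpServers": {
    "faf": {
      "url": "https://mcpaas.live/sse"
    }
  }
}
```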

Self-Deploy (Vercel)

Deploy your own MCP server on Vercel in one click. Once deployed, your server exposes:

  • /health — Health check
  • /info — Server metadata + tool list
  • /sse — MCP Server-Sent Events transport

Local (npm)

```bash
npx faf-mcp
```

🔄 Interop Tools

| Tool | Platform | Action |
|------|----------|--------|
| `faf_agents` | OpenAI Codex | Import/export/sync AGENTS.md |
| `faf_cursor` | Cursor IDE | Import/export/sync .cursorrules |
| `faf_gemini` | Google Gemini | Import/export/sync GEMINI.md |
| `faf_conductor` | Conductor | Import/export directory structure |
| `faf_git` | GitHub | Generate .faf from any repo URL |

```bash
# Sync to all formats at once
faf bi-sync --all

# Generate .faf from any GitHub repo
faf_git { url: "https://github.com/facebook/react" }
```

61 tools (25 core + 36 advanced) · 309 tests (9 suites) · 7 bundled parsers


☁️ Cloud Sync

Share your FAF context globally via mcpaas.live:

| Tool | Purpose |
|------|---------|
| `faf_cloud_publish` | Upload to cloud, get shareable URL |
| `faf_cloud_fetch` | Pull context from cloud |
| `faf_cloud_list` | List available souls |
| `faf_cloud_search` | Search across souls |
| `faf_cloud_share` | Generate share links |

Example Workflow:

```bash
# Upload your project.faf
faf_cloud_publish { soul_name: "my-project" }
# → https://mcpaas.live/souls/my-project

# Anyone can fetch it
faf_cloud_fetch { soul_name: "my-project" }
# → Context merged into local project.faf
```

Zero-install sharing - recipients need no MCP setup.


🔄 Eternal Bi-Sync

Your .faf file and your platform context files stay synchronized in milliseconds.

```
project.faf  ←── 8ms ──→  .cursorrules / AGENTS.md / CLAUDE.md / GEMINI.md
                    Single source of truth
```

  • Update either side → both stay aligned
  • `--all` flag syncs to all four formats at once
  • Zero manual maintenance
  • Works across teams, branches, sessions

AI assistants forget. They drift. Every new session, AI starts guessing again. Bi-sync means context never goes stale.


Tier System: From Blind to Optimized

| Tier | Score | Status |
|------|-------|--------|
| 🏆 Trophy | 100% | AI Optimized — Gold Code |
| 🥇 Gold | 99%+ | Near-perfect context |
| 🥈 Silver | 95%+ | Excellent |
| 🥉 Bronze | 85%+ | Production ready |
| 🟢 Green | 70%+ | Solid foundation |
| 🟡 Yellow | 55%+ | AI flipping coins |
| 🔴 Red | <55% | AI working blind |
| 🤍 White | 0% | No context at all |

At 55%, AI is guessing half the time. At 100%, AI is optimized.
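Read as bands, the table works like the following sketch. This shows the thresholds only; the actual scoring logic lives in the `faf_score` tool:

```shell
# Map a 0-100 score to its tier label, per the thresholds in the table above.
tier() {
  s="$1"
  if   [ "$s" -ge 100 ]; then echo "Trophy"
  elif [ "$s" -ge 99 ];  then echo "Gold"
  elif [ "$s" -ge 95 ];  then echo "Silver"
  elif [ "$s" -ge 85 ];  then echo "Bronze"
  elif [ "$s" -ge 70 ];  then echo "Green"
  elif [ "$s" -ge 55 ];  then echo "Yellow"
  elif [ "$s" -gt 0 ];   then echo "Red"    # 1-54%: below the Yellow floor
  else                        echo "White"  # 0%: no context at all
  fi
}

tier 87   # prints "Bronze"
tier 54   # prints "Red"
```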


💬 use>faf | Prompt Pattern

Start every prompt with "Use FAF" to invoke MCP tools:

```
Use FAF to initialize my project
Use FAF to score my AI-readiness
Use FAF to sync my context
Use FAF to enhance my project
```

Works on all platforms — stops web search, forces tool usage.


🛠️ 25 Core MCP Tools

| Tool | Purpose |
|------|---------|
| `faf_init` | Initialize project.faf |
| `faf_score` | Check AI-readiness (0-100%) |
| `faf_sync` | Sync context across platforms |
| `faf_bi_sync` | Bi-directional .faf ↔ CLAUDE.md |
| `faf_enhance` | Intelligent enhancement |
| `faf_read` | Parse and validate FAF files |
| `faf_write` | Create/update FAF with validation |
| **🔄 Interop Tools** | |
| `faf_agents` | Import/export/sync AGENTS.md |
| `faf_cursor` | Import/export/sync .cursorrules |
| `faf_gemini` | Import/export/sync GEMINI.md |
| `faf_conductor` | Import/export directory structure |
| `faf_git` | Generate .faf from GitHub repo URL |
| **☁️ Cloud Tools** | |
| `faf_cloud_publish` | Upload to mcpaas.live |
| `faf_cloud_fetch` | Pull from cloud |
| `faf_cloud_list` | List souls |
| `faf_cloud_search` | Search souls |
| `faf_cloud_share` | Generate share links |

Plus 36 advanced tools and CLI fallback (via faf-cli v5.0.1):

  • `faf readme` - Extract the 6 Ws from a README (+25-35% boost)
  • `faf human-add` - Non-interactive YAML merge (6Ws Builder)
  • `faf git` - GitHub repo analysis without cloning
  • And 40+ more commands...

📦 Ecosystem


📄 License

MIT License — Free and open source


Zero drift. Eternal sync. AI optimized. 🏆

"It's so logical if it didn't exist, AI would have built it itself" — Claude
