Fabric Indexer

MCP server that turns any codebase into a typed knowledge graph for AI assistants.

What it is

fabric-indexer is a local-first Model Context Protocol server that parses your source tree into a typed graph of modules, functions, types, tests, dependencies, contracts, and tech debt. It exposes that graph to any MCP-compatible AI client (Claude Code, Cursor, Codex, Claude Desktop) as structural-query tools. The graph lives in an embedded LMDB store on your machine, persists across sessions, and is incrementally maintained via a file watcher.

Quickstart

npx @cognisos/fabric-mcp@beta

Your AI client launches the server for you; see the integration pages for client-specific config.

Once configured, ask your client to run fabric_index on your project, then fabric_status.
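
For reference, a minimal client registration might look like the following. This uses the `mcpServers` shape common to Claude Desktop and similar clients; the exact file location and key names vary by client, so treat this as a hedged sketch rather than the canonical config:

```json
{
  "mcpServers": {
    "fabric-indexer": {
      "command": "npx",
      "args": ["@cognisos/fabric-mcp@beta"]
    }
  }
}
```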

What it indexes

Thirteen languages are parsed with tree-sitter. Modules, functions, types, interfaces, tests, and dependencies become graph nodes, with structural edges (Contains, Imports, Tests, Implements, Extends, Exports) between them.

  • Full call-graph resolution — TypeScript, Python, Go, Rust, Java
  • Structural extraction — C, C++, C#, Ruby, PHP, Swift, Kotlin, Scala
  • Regex fallback — Shell, Lua, R, Perl, and others
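
To make the node-and-edge model above concrete, here is a minimal sketch of such a typed code graph in TypeScript. All names (`CodeGraph`, `neighbors`, the `id` scheme) are illustrative assumptions, not fabric-indexer's actual schema:

```typescript
// Hypothetical typed code graph; the kinds mirror the categories above,
// but this schema is an assumption, not fabric-indexer's internal model.
type NodeKind = "module" | "function" | "type" | "interface" | "test" | "dependency";
type EdgeKind = "Contains" | "Imports" | "Tests" | "Implements" | "Extends" | "Exports";

interface GraphNode { id: string; kind: NodeKind; name: string; file: string; }
interface GraphEdge { from: string; to: string; kind: EdgeKind; }

class CodeGraph {
  nodes = new Map<string, GraphNode>();
  edges: GraphEdge[] = [];

  addNode(n: GraphNode) { this.nodes.set(n.id, n); }
  addEdge(e: GraphEdge) { this.edges.push(e); }

  // Nodes directly reachable from `id` over edges of the given kind.
  neighbors(id: string, kind: EdgeKind): GraphNode[] {
    return this.edges
      .filter(e => e.from === id && e.kind === kind)
      .map(e => this.nodes.get(e.to))
      .filter((n): n is GraphNode => n !== undefined);
  }
}

const g = new CodeGraph();
g.addNode({ id: "m1", kind: "module", name: "auth", file: "src/auth.ts" });
g.addNode({ id: "f1", kind: "function", name: "login", file: "src/auth.ts" });
g.addEdge({ from: "m1", to: "f1", kind: "Contains" });
console.log(g.neighbors("m1", "Contains").map(n => n.name)); // [ 'login' ]
```

Structural queries like "what does this module contain" or "what tests cover this function" reduce to edge traversals of this shape.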

fabric_index respects .gitignore and .fabricignore.
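
Assuming .fabricignore follows gitignore-style pattern syntax (an assumption; check the project docs for the exact format), an example that excludes generated and vendored code from the index:

```
dist/
vendor/
**/*.generated.ts
```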

For the full tool surface, see the MCP reference.

Performance

Median indexing throughput of 73 K LoC/sec with under 70 MB peak RSS, measured cold-cache against production open-source repos on M-series macOS.

Status

Public beta. Pin to a version tag (@cognisos/fabric-mcp@<version>) for production use. See the changelog.

License

MIT.