Generated CLI reference
Source: packages/cli/README.md
v2.5.1

@cognisos/liminal

Transparent LLM context compression proxy. Liminal sits between your AI coding tools and the LLM API, compressing context to save tokens, reduce costs, and extend effective context windows — all without changing your workflow.

Install

npm i @cognisos/liminal

Quick Start

liminal init       # Guided setup — auth, tool detection, config
liminal start      # Start the compression proxy
liminal            # Launch the TUI dashboard

Features

  • Zero-config compression — Automatically routes requests from Claude Code, Codex, Cursor, and other OpenAI-compatible tools
  • TUI dashboard — Run liminal to launch a full-screen live dashboard with stats, config, and logs
  • Setup wizard — 5-step guided setup with verification and error recovery
  • Stats tracking — Session and all-time metrics with token savings, context extension, and cost estimates
  • Cursor hooks — Transparent file compression via preToolUse hooks (no sudo, no TLS hacks)
  • Multi-session — Concurrent session management with circuit breakers and graceful degradation
  • Zero UI dependencies — All terminal rendering uses raw ANSI codes

Commands

liminal                               Launch TUI dashboard
liminal init                          Guided setup wizard
liminal start [-d] [--port PORT]      Start the compression proxy
liminal stop                          Stop the proxy
liminal status                        Quick health check
liminal stats [--json]                Compression metrics & savings
liminal config [--set k=v] [--get k]  View or edit configuration
liminal logs [--follow] [--lines N]   View proxy logs
liminal setup cursor [--teardown]     Install Cursor compression hooks
liminal login                         Log in or create an account
liminal logout                        Log out
liminal trust-ca                      Install CA cert (TLS intercept)
liminal untrust-ca                    Remove CA cert
liminal uninstall                     Remove all Liminal configuration

TUI Dashboard

Run liminal with no arguments to launch the interactive dashboard:

  • Dashboard — Live daemon health, tool routing status, session metrics, recent activity
  • Stats — Token savings, cost impact, context extension (session + all-time)
  • Config — Current configuration at a glance
  • Logs — Colorized live tail of daemon logs

Navigate with arrow keys or Tab. Press q to exit.

How It Works

  1. Proxy — Liminal runs a local HTTP proxy (default port 3141)
  2. Intercept — Your AI tool sends API requests through the proxy
  3. Compress — RSC (Recursive Semiotic Computation) normalizes and compresses the context
  4. Forward — Compressed request goes to the upstream LLM API
  5. Learn — Patterns are learned over time to improve compression
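The compress-or-fall-back decision in steps 3 and 4 can be sketched roughly as follows. This is a minimal illustration using the documented config defaults (compressionThreshold, latencyBudgetMs); the token estimator and compress function are hypothetical stand-ins, not Liminal's actual RSC implementation.

```python
import time

COMPRESSION_THRESHOLD = 100   # compressionThreshold default: min tokens to compress
LATENCY_BUDGET_MS = 10_000    # latencyBudgetMs default: max compression time before fallback

def estimate_tokens(text: str) -> int:
    """Rough heuristic: about 4 characters per token (an assumption)."""
    return max(1, len(text) // 4)

def compress(text: str) -> str:
    """Stand-in for RSC compression: just collapses runs of whitespace."""
    return " ".join(text.split())

def prepare_context(text: str) -> str:
    """Compress only above the token threshold and within the latency budget."""
    if estimate_tokens(text) < COMPRESSION_THRESHOLD:
        return text  # below compressionThreshold: not worth compressing
    start = time.monotonic()
    compressed = compress(text)
    elapsed_ms = (time.monotonic() - start) * 1000
    if elapsed_ms > LATENCY_BUDGET_MS:
        return text  # budget exceeded: fall back to the uncompressed context
    return compressed
```

Small contexts pass through untouched; large ones are compressed unless compression itself becomes the bottleneck, in which case the original request is forwarded as-is.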

Supported protocols: Anthropic Messages API, OpenAI Chat Completions, OpenAI Responses API.
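For a client that is not auto-detected, routing usually amounts to overriding its API base URL. The variable names below are the ones honored by the official Anthropic and OpenAI SDKs — whether your specific tool reads them is an assumption to verify (liminal init may configure your tool for you).

```shell
# Route SDK traffic through the local Liminal proxy (default port 3141).
# ANTHROPIC_BASE_URL / OPENAI_BASE_URL are read by the official SDKs;
# other tools may need their own base-URL setting.
export ANTHROPIC_BASE_URL="http://localhost:3141"
export OPENAI_BASE_URL="http://localhost:3141"
```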

Configuration

Config is stored at ~/.liminal/config.json. Key settings:

Key                   Default  Description
port                  3141     Proxy listen port
compressionThreshold  100      Min tokens to compress
learnFromResponses    true     Learn patterns from LLM responses
latencyBudgetMs       10000    Max compression time before fallback
enabled               true     Global compression toggle
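Putting the defaults together, a ~/.liminal/config.json would look something like this (key names are from the table above; the exact file shape is an assumption):

```json
{
  "port": 3141,
  "compressionThreshold": 100,
  "learnFromResponses": true,
  "latencyBudgetMs": 10000,
  "enabled": true
}
```

Edit individual keys with liminal config --set k=v rather than editing the file by hand while the proxy is running.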

Requirements

  • Node.js >= 18.0.0
  • A Cognisos account (created during liminal init)

License

MIT

Command Reference

Commands are listed alphabetically. Each links to a dedicated reference page.

Command            Description                                                            Usage
byok list          List providers with a stored key                                       liminal byok list [--json]
byok remove        Remove a stored provider key                                           liminal byok remove <provider>
byok set           Store a provider API key (stdin only)                                  liminal byok set <provider>
config             View or edit configuration                                             liminal config [--set k=v] [--get k]
daemon get-token   Print the stored auth token                                            liminal daemon get-token
daemon inspect     Diagnostic details (port, pid, binary)                                 liminal daemon inspect
daemon logs        View daemon log output                                                 liminal daemon logs [--follow]
daemon restart     Restart the Fabric daemon                                              liminal daemon restart
daemon start       Start the Fabric background daemon                                     liminal daemon start
daemon status      Show daemon status and health                                          liminal daemon status [--json]
daemon stop        Stop the Fabric daemon                                                 liminal daemon stop
init               Set up Liminal (login, config)                                         liminal init
inspect            Inspect pipeline phases for text                                       liminal inspect <text> [--json]
install-service    Install as a launchd LaunchAgent (macOS) or systemd user unit (Linux)  liminal install-service
login              Log in or create an account                                            liminal login
logout             Log out of your account                                                liminal logout
logs               View proxy logs                                                        liminal logs [--follow] [--lines N]
mcp-stdio          stdio↔HTTP MCP bridge (used by Codex)                                  liminal mcp-stdio
setup claude-code  Connect Fabric MCP tools to Claude Code                                liminal setup claude-code [--global]
setup codex        Connect Fabric MCP tools to Codex CLI                                  liminal setup codex [--teardown]
setup cursor       Install file compression hooks for Cursor                              liminal setup cursor [--teardown]
start              Start the compression proxy                                            liminal start [-d] [--port PORT]
stats              Compression metrics & savings                                          liminal stats [--json]
status             Show proxy health check                                                liminal status [--json]
stop               Stop the running proxy                                                 liminal stop
trust-ca           Install CA cert (for TLS intercept)                                    liminal trust-ca
uninstall          Remove Liminal configuration                                           liminal uninstall
uninstall-service  Remove the launchd LaunchAgent (macOS) or systemd user unit (Linux)    liminal uninstall-service [--purge]
untrust-ca         Remove CA cert                                                         liminal untrust-ca