
Starlight Intelligence

A memory system for humans and agents. Local-first. Forkable. Free forever.


Two minutes to compound intelligence

Quickstart

Pick your tool. Copy the config. Your next AI session remembers everything.

1. Install the package

Install once from npm. Works with any MCP-compatible client.

terminal
$ npm install @frankx/starlight-intelligence-system
2. Configure your tool

Every adapter speaks the same MCP protocol. Pick yours and paste.

Claude Code (200k tokens)

Best for deep work — persistent sessions and rich tool access.

memory: CLAUDE.md
config: ~/.claude/mcp.json

mcp config
{
  "mcpServers": {
    "starlight-sis": {
      "command": "node",
      "args": ["node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}
Cursor (200k tokens)

Best for IDE-native editing with memory-aware completions.

memory: .cursorrules
config: .cursor/mcp.json

mcp config
{
  "mcpServers": {
    "starlight-sis": {
      "command": "node",
      "args": ["node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}
Codex (128k tokens)

Best for terminal-first workflows with OpenAI reasoning.

memory: AGENTS.md
config: ~/.codex/mcp.json

mcp config
{
  "mcpServers": {
    "starlight-sis": {
      "command": "node",
      "args": ["node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}
Gemini CLI (1M tokens)

Best for massive context — feed your whole vault in one shot.

memory: GEMINI.md
config: ~/.gemini/settings.json

mcp config
{
  "mcpServers": {
    "starlight-sis": {
      "command": "node",
      "args": ["node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}
OpenCode (model-dependent)

Best for multi-model routing — swap models without losing memory.

memory: AGENTS.md
config: ~/.config/opencode/config.json

mcp config
{
  "mcp": {
    "starlight-sis": {
      "type": "local",
      "command": ["node", "node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}
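
Note the shape difference above: the first four tools nest servers under "mcpServers", while OpenCode uses a top-level "mcp" key with a command array. If you script your setup, a small check like this hypothetical helper (not part of the package) can catch a config placed under the wrong key:

```javascript
// Hypothetical helper — not part of @frankx/starlight-intelligence-system.
// Returns true if a parsed config registers "starlight-sis" under either
// layout shown above ("mcpServers" for Claude/Cursor/Codex/Gemini,
// "mcp" for OpenCode).
function hasStarlightServer(config) {
  const servers = config.mcpServers || config.mcp || {};
  return Object.prototype.hasOwnProperty.call(servers, "starlight-sis");
}

// Example: parse and check the Claude Code config from above
const raw = `{
  "mcpServers": {
    "starlight-sis": {
      "command": "node",
      "args": ["node_modules/@frankx/starlight-intelligence-system/dist/mcp-server.js"]
    }
  }
}`;
console.log(hasStarlightServer(JSON.parse(raw))); // prints true
```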
3. Verify it works

Run the MCP server directly to confirm it starts and lists its tools.

terminal
$ npx @frankx/starlight-intelligence-system --list-tools
→ sis_append_entry
→ sis_recent_entries
→ sis_vault_search
→ sis_stats
4. Add your first entry

In any configured tool, just tell the agent to remember something. Under the hood, the agent calls sis_append_entry.

terminal
// in Claude Code, just ask:
"Remember that we chose Next.js 16 for the App Router streaming work."

// or call the tool directly:
sis_append_entry(
  vault: "technical",
  insight: "Chose Next.js 16 for App Router streaming",
  confidence: "high"
)

That entry is now a plain JSONL line in vaults/frank/technical.jsonl. Every tool with the MCP server configured can read it. Your memory compounds across every session.

Next

See the architecture →
Explore live vaults →