Self-hosted · Open Source · MIT Licensed

Build Your Knowledge Vault

A local-first LLM wiki compiler that ingests files and URLs, follows a vault-specific schema, and compiles a structured knowledge graph with Markdown wiki pages. Part of the SwarmClaw network.

Quick Start

Install globally and start compiling knowledge in minutes.

terminal

$ # Install the CLI

$ npm install -g @swarmvaultai/cli

$ # Initialize and build your vault

$ swarmvault init

$ # Review the generated vault schema

$ sed -n '1,120p' swarmvault.schema.md

$ swarmvault ingest ./my-document.pdf

$ swarmvault compile

$ swarmvault query "What are the key concepts?"

How It Works

Four steps from raw input to schema-guided, navigable knowledge. Unlike ephemeral RAG, SwarmVault builds a persistent compiled artifact that compounds over time.

Step 01

Shape

Start with swarmvault.schema.md so the vault has explicit naming rules, categories, and grounding expectations.
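The contents of swarmvault.schema.md are defined per vault. As a purely hypothetical sketch (the section names and rules below are illustrative, not SwarmVault's required format), such a file might read:

```markdown
## Naming
- Wiki page titles use singular nouns in Title Case (e.g. "Knowledge Graph").

## Categories
- Allowed node categories: concept, person, tool, decision.

## Grounding
- Every claim on a wiki page must cite at least one ingested source.
```

The compile and query steps can then follow these conventions instead of inventing their own.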

Step 02

Ingest

Feed in files, URLs, PDFs, images, or markdown. SwarmVault extracts text and creates immutable source manifests with content hashes.
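The "immutable" part comes from content hashing: a manifest records a cryptographic fingerprint of each source at ingest time. As a conceptual sketch (SwarmVault's actual manifest format is internal to the tool), the fingerprint itself is just a SHA-256 hash of the file's bytes:

```shell
# Illustrative only: compute a SHA-256 content hash for a source file,
# the kind of fingerprint a source manifest can use to pin content.
printf 'example source text' > my-document.txt
hash=$(sha256sum my-document.txt | cut -d' ' -f1)
echo "content hash: $hash"
```

Because the hash is derived from the bytes, any later change to the source produces a different fingerprint, which is what makes the recorded manifest tamper-evident.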

Step 03

Compile

The engine analyzes sources with your chosen LLM, applies the vault schema, and builds a knowledge graph with provenance.
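As a hypothetical illustration of "a knowledge graph with provenance" (the field names below are invented for this sketch, not SwarmVault's actual on-disk format), a single node might carry its label, category, source citations, and outgoing edges:

```json
{
  "id": "concept/knowledge-compilation",
  "label": "Knowledge Compilation",
  "category": "concept",
  "sources": [
    { "manifest": "my-document.pdf", "hash": "sha256:…" }
  ],
  "edges": [
    { "to": "concept/retrieval", "relation": "contrasts-with" }
  ]
}
```

Keeping the source hash on every node is what lets later tooling check whether a claim is still backed by the bytes it was compiled from.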

Step 04

Query

Ask natural language questions against your compiled wiki, save useful answers, and expose the vault to agents over MCP.

Everything you need to build your knowledge base

From raw sources to structured knowledge - SwarmVault handles the full compilation lifecycle.

Local-First

Everything runs on your machine. No cloud required. Works fully offline with the built-in heuristic provider - no API keys needed.

Schema-Guided

Each vault ships with swarmvault.schema.md, so compile and query behavior can follow domain-specific naming, categories, and grounding rules.

Graph-Based Wiki

Compile sources into a knowledge graph with nodes, edges, provenance tracking, and auto-generated Markdown wiki pages.

Multi-Provider LLM

Plug in OpenAI, Anthropic, Gemini, Ollama, or any OpenAI-compatible API. Switch providers per task without changing your workflow.

CLI-Powered

Init, ingest, inbox import, compile, query, lint, watch, MCP, and graph serve - all from the command line.

Anti-Drift Linting

Detects stale sources, conflicting claims, and missing citations. Keep your knowledge base fresh and trustworthy over time.
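One way to see what "stale source" detection means: a source is stale when its current content hash no longer matches the hash recorded at ingest time. The sketch below illustrates that check with plain shell tools (SwarmVault's actual lint logic is internal and covers more than this):

```shell
# Conceptual sketch of stale-source detection: record a hash at
# "ingest" time, change the file, then compare against a fresh hash.
printf 'v1' > source.txt
recorded=$(sha256sum source.txt | cut -d' ' -f1)

printf 'v2' > source.txt   # the source changes after ingest
current=$(sha256sum source.txt | cut -d' ' -f1)

if [ "$recorded" != "$current" ]; then
  echo "stale: source.txt"
fi
```

Conflicting claims and missing citations need the graph's provenance data rather than a hash comparison, but the principle is the same: lint compares the compiled artifact against its recorded sources.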

Agent-Ready

Auto-install rules for Claude, Cursor, and Codex, or expose the vault directly through MCP for compatible clients.

Works with

Heuristic (offline)
OpenAI
Anthropic
Google Gemini
Ollama
OpenAI-Compatible
Custom Module

Ready to build your knowledge vault?

Install the CLI and start compiling knowledge in minutes. No API keys required to get started.