
# Crucible Documentation

A knowledge-grounded agent runtime. Agents that draw from a knowledge graph make better decisions — all backed by markdown files you own.

Memory and knowledge are too fundamental to be an afterthought. Most AI tools treat conversations as disposable — Crucible makes them the foundation.

  • Knowledge-grounded agents. Precognition auto-injects relevant context from your knowledge graph before each LLM turn. Block-level embeddings power semantic search at paragraph granularity. The more you use it, the smarter your agents get.
  • Sessions are notes. Every chat saves as markdown in your kiln. Search them, link them with wikilinks, version them in git. Conversations become permanent, connectable knowledge.
  • Neovim-like architecture. Lua/Fennel plugins, TUI-first, headless daemon with RPC. Most behaviors beyond the knowledge core can be scripted.
  • Bring any LLM. Ollama, OpenAI, Anthropic, local GGUF models — swap freely.
  • Plaintext first. No proprietary formats. Files are the source of truth.
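
Because sessions are plain markdown, ordinary tools work on a kiln immediately. A minimal sketch — the `~/kiln`-style directory, the filename, and the note layout below are illustrative assumptions, not Crucible's actual defaults:

```shell
# Create a throwaway "kiln" directory to illustrate (real layout may differ).
KILN="$(mktemp -d)"

# A saved chat session is just a markdown note with wikilinks.
cat > "$KILN/2024-05-01-borrow-checker.md" <<'EOF'
# Chat: borrow checker questions

Discussed lifetimes and moves; see also [[rust-notes]] and [[ownership]].
EOF

# Plain-text search works out of the box...
grep -l '\[\[rust-notes\]\]' "$KILN"/*.md

# ...and so does version control.
git -C "$KILN" init -q
git -C "$KILN" add .
git -C "$KILN" -c user.name=kiln -c user.email=kiln@example.com \
    commit -qm "session: borrow checker questions"
```

Nothing here is Crucible-specific — that is the point of plaintext-first storage: the files outlive the tool.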
|                    | Crucible      | ChatGPT | Obsidian + AI | OpenClaw |
| ------------------ | ------------- | ------- | ------------- | -------- |
| Local-first        | ✅            |         |               |          |
| Sessions as markdown | ✅          |         |               |          |
| Knowledge graph    | ✅            |         |               |          |
| Bring your own LLM | ✅            |         | Partial       |          |
| Plugin system      | ✅ Lua/Fennel |         | ✅ JS         | ✅ TS    |
| MCP server         | ✅            |         |               |          |
| Semantic search    | ✅ Block-level |        | Plugin        |          |
| Setup time         | ~2 min        | 0       | ~5 min        | 2-7 hrs  |
```sh
# Install
cargo install --git https://github.com/Mootikins/crucible.git crucible-cli

# Start chatting
cru chat

# Or expose your knowledge base via MCP
cru mcp
```
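
To consume the knowledge base from an MCP client, register `cru mcp` as a server. A sketch of the common stdio-server client configuration — the `"crucible"` entry name is arbitrary, and that `cru mcp` speaks the stdio transport is an assumption:

```json
{
  "mcpServers": {
    "crucible": {
      "command": "cru",
      "args": ["mcp"]
    }
  }
}
```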