LLM Wiki, a cross-platform desktop application that transforms documents into persistent, interlinked knowledge bases, has gained 1,134 GitHub stars in just six days since its April 8, 2026 release. Created by developer nashsu, the tool implements Andrej Karpathy's LLM Wiki pattern, which moves beyond traditional retrieval-augmented generation (RAG) by building persistent compiled knowledge rather than regenerating answers from scratch.
Persistent Compilation Over Ephemeral Retrieval
The core innovation of LLM Wiki addresses a fundamental limitation in traditional RAG systems. Where standard RAG approaches retrieve relevant passages and generate answers anew for each query, LLM Wiki "incrementally builds and maintains a persistent wiki from your sources," compiling knowledge once into structured wiki pages with cross-references that persist over time.
The application uses a two-step chain-of-thought ingestion process: the LLM first analyzes documents, then generates wiki pages with full source traceability. This compiled knowledge base maintains context and connections that would be lost in ephemeral RAG approaches.
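The two-step flow can be pictured as a pair of functions, one per step. This is a minimal sketch, assuming hypothetical types and names (`SourceDocument`, `analyze`, `compilePage`); the actual LLM Wiki internals are not documented here, and the analysis step stands in for an LLM call.

```typescript
// Hypothetical sketch of a two-step ingestion pipeline: analyze, then compile.
interface SourceDocument {
  id: string;
  title: string;
  text: string;
}

interface Analysis {
  topics: string[];
  summary: string;
}

interface WikiPage {
  title: string;
  body: string;
  sources: string[]; // ids of contributing documents, for traceability
}

// Step 1: analyze the document (a stand-in for the first LLM pass).
function analyze(doc: SourceDocument): Analysis {
  const topics = Array.from(new Set(doc.text.match(/\b[A-Z][a-z]+\b/g) ?? []));
  return { topics, summary: doc.text.slice(0, 120) };
}

// Step 2: compile the analysis into a persistent page that records
// which sources produced it, so traceability survives across sessions.
function compilePage(doc: SourceDocument, analysis: Analysis): WikiPage {
  return {
    title: doc.title,
    body: `${analysis.summary}\n\nTopics: ${analysis.topics.join(", ")}`,
    sources: [doc.id],
  };
}

const doc: SourceDocument = {
  id: "doc-1",
  title: "Transformers",
  text: "Attention mechanisms let Transformers model long-range context.",
};
const page = compilePage(doc, analyze(doc));
console.log(page.sources); // ["doc-1"]
```

The key contrast with ephemeral RAG is that `compilePage`'s output is written once and kept, rather than regenerated on every query.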
Four-Signal Knowledge Graph Powers Discovery
LLM Wiki implements a sophisticated four-signal knowledge graph that connects pages through multiple dimensions: direct links between topics, source overlap detection, Adamic-Adar scoring for link prediction, and type affinity matching. The system uses Louvain community detection to automatically discover knowledge clusters and provides cohesion scoring for cluster quality assessment.
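Of the four signals, Adamic-Adar is a standard link-prediction formula: for candidate pages x and y, it sums 1/log(degree(z)) over their common neighbours z, so a rare shared neighbour counts for more than a hub page that everything links to. A minimal sketch, with an illustrative graph shape (the project's actual graph representation is not documented here):

```typescript
// Adamic-Adar link prediction over a page-link graph.
type Graph = Map<string, Set<string>>; // page -> pages it links to

// AA(x, y) = sum over common neighbours z of 1 / log(degree(z)).
function adamicAdar(graph: Graph, x: string, y: string): number {
  const nx = graph.get(x) ?? new Set<string>();
  const ny = graph.get(y) ?? new Set<string>();
  let score = 0;
  for (const z of nx) {
    if (!ny.has(z)) continue;
    const degree = (graph.get(z) ?? new Set<string>()).size;
    if (degree > 1) score += 1 / Math.log(degree); // skip degree 1 (log 1 = 0)
  }
  return score;
}

const graph: Graph = new Map([
  ["RAG", new Set(["Embeddings", "LLM"])],
  ["Wiki", new Set(["Embeddings", "LLM"])],
  ["Embeddings", new Set(["RAG", "Wiki"])],
  ["LLM", new Set(["RAG", "Wiki"])],
]);

// Two common neighbours, each of degree 2: score = 2 / ln(2) ≈ 2.885
console.log(adamicAdar(graph, "RAG", "Wiki").toFixed(3)); // "2.885"
```

A high score between two pages that never link to each other directly is exactly the kind of "surprising connection" a graph-insights pass can surface.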
Graph insights identify surprising connections between topics and detect knowledge gaps, with one-click Deep Research functionality to fill identified gaps. Optional vector semantic search via LanceDB provides embedding-based retrieval and works with any OpenAI-compatible embedding endpoint.
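At its core, embedding-based retrieval of this kind ranks pages by cosine similarity between stored vectors and a query vector. The sketch below shows that ranking step only; it is not LanceDB's API, and the entries and dimensions are made up for illustration.

```typescript
// Cosine-similarity ranking, the core of embedding-based retrieval.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface Entry {
  page: string;
  embedding: number[]; // produced by any OpenAI-compatible embedding endpoint
}

// Return the k pages whose embeddings are most similar to the query vector.
function topK(index: Entry[], query: number[], k: number): string[] {
  return [...index]
    .sort((p, q) => cosine(q.embedding, query) - cosine(p.embedding, query))
    .slice(0, k)
    .map((e) => e.page);
}

const index: Entry[] = [
  { page: "RAG", embedding: [1, 0, 0] },
  { page: "Graphs", embedding: [0, 1, 0] },
  { page: "Wikis", embedding: [0.9, 0.1, 0] },
];
console.log(topK(index, [1, 0, 0], 2)); // ["RAG", "Wikis"]
```

A vector store like LanceDB replaces the brute-force sort with an indexed nearest-neighbour lookup, but the similarity ranking is the same idea.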
Obsidian Compatibility Taps Existing Ecosystem
A strategic design decision makes generated wikis work as native Obsidian vaults, enabling users to leverage Obsidian's extensive plugin ecosystem. This compatibility bridges LLM Wiki's automated knowledge compilation with the mature tooling and workflows of the Obsidian community.
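The compatibility is possible because an Obsidian vault is just a folder of Markdown files connected by `[[wikilink]]` syntax, optionally with YAML frontmatter. A minimal sketch of emitting such a page (frontmatter fields and layout here are illustrative assumptions, not LLM Wiki's documented output format):

```typescript
// Render a wiki page as Obsidian-compatible Markdown:
// YAML frontmatter, a heading, the body, and [[wikilinks]] to related pages.
interface Page {
  title: string;
  body: string;
  links: string[];   // titles of related pages
  sources: string[]; // source document ids
}

function renderMarkdown(page: Page): string {
  const frontmatter = `---\nsources: [${page.sources.join(", ")}]\n---`;
  const related = page.links.map((l) => `- [[${l}]]`).join("\n");
  return `${frontmatter}\n\n# ${page.title}\n\n${page.body}\n\n## Related\n${related}\n`;
}

const md = renderMarkdown({
  title: "Retrieval-Augmented Generation",
  body: "Combines retrieval with generation.",
  links: ["Embeddings", "Vector Search"],
  sources: ["doc-1"],
});
console.log(md.includes("[[Embeddings]]")); // true
```

Because the output is plain Markdown on disk, Obsidian's graph view, backlinks, and plugins operate on the generated wiki without any conversion step.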
The application supports multiple input formats including PDF, DOCX, PPTX, XLSX, images, and web content. A Chrome web clipper enables one-click webpage capture with automatic ingestion into the knowledge base.
Part of Wave of Karpathy-Pattern Tools
LLM Wiki joins a growing category of tools inspired by Karpathy's knowledge compilation concepts, including claude-memory-compiler (595 stars) and claude-obsidian (567 stars). The desktop-first application, built with Tauri, React, and TypeScript, emphasizes privacy and local control, distinguishing it from cloud-based alternatives.
With 252 commits of active development and a run on GitHub's trending AI repositories for April 8-14, 2026, the TypeScript implementation demonstrates maturity beyond the proof-of-concept stage. The rapid star growth reflects developer interest in moving beyond ephemeral RAG toward persistent, compiled knowledge systems.
Key Takeaways
- LLM Wiki gained 1,134 GitHub stars in 6 days after launching April 8, 2026, implementing Andrej Karpathy's persistent knowledge compilation pattern
- The tool compiles documents into persistent wiki pages with cross-references, avoiding the ephemeral nature of traditional RAG systems
- A four-signal knowledge graph uses direct links, source overlap, Adamic-Adar scoring, and type affinity to connect topics and discover knowledge clusters
- Generated wikis work as native Obsidian vaults, enabling use of Obsidian's plugin ecosystem
- The desktop application supports PDF, DOCX, PPTX, XLSX, images, and web content with optional vector search via LanceDB