It compiles to a single ~27MB binary — no Node.js, Docker, or Python required.
Key features:
- Persistent memory via markdown files (MEMORY, HEARTBEAT, SOUL), compatible with OpenClaw's format
- Full-text search (SQLite FTS5) plus semantic search (local embeddings, no API key needed), sketched below
- Autonomous heartbeat runner that checks tasks on a configurable interval
- CLI, web interface, and desktop GUI
- Multi-provider: Anthropic, OpenAI, Ollama, and more
- Apache 2.0 licensed
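
For anyone curious about the search side, here is a rough, hypothetical sketch of a hybrid keyword-plus-embedding lookup like the one the feature list describes. It is not localgpt's actual code; it just shows the general shape using the `rusqlite` crate (with its `bundled` feature, which includes FTS5) and a toy cosine similarity over hard-coded vectors standing in for real local embeddings.

```rust
// Illustrative only: hybrid search = FTS5 keyword match + brute-force
// cosine similarity over embedding vectors. Not the project's real code.
use rusqlite::Connection;

/// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() -> rusqlite::Result<()> {
    let conn = Connection::open_in_memory()?;

    // Full-text index over memory notes (FTS5 virtual table).
    conn.execute_batch(
        "CREATE VIRTUAL TABLE notes USING fts5(path, body);
         INSERT INTO notes VALUES ('MEMORY.md', 'rust binary with sqlite fts5 search');
         INSERT INTO notes VALUES ('HEARTBEAT.md', 'check open tasks every 30 minutes');",
    )?;

    // Pass 1: keyword search via FTS5 MATCH.
    let mut stmt = conn.prepare("SELECT path FROM notes WHERE notes MATCH ?1")?;
    let hits: Vec<String> = stmt
        .query_map(["sqlite"], |row| row.get(0))?
        .collect::<Result<_, _>>()?;
    println!("FTS5 hits: {:?}", hits);

    // Pass 2: toy semantic search, comparing a query embedding to stored ones.
    let stored = vec![
        ("MEMORY.md", vec![0.9_f32, 0.1, 0.0]),
        ("HEARTBEAT.md", vec![0.1, 0.8, 0.3]),
    ];
    let query = vec![0.85_f32, 0.2, 0.05];
    let best = stored
        .iter()
        .max_by(|a, b| cosine(&a.1, &query).partial_cmp(&cosine(&b.1, &query)).unwrap());
    println!("Closest embedding: {:?}", best.map(|(p, _)| p));
    Ok(())
}
```

Presumably the real implementation generates embeddings with a local model and merges/ranks the two result sets; the sketch above is only meant to show the shape of the two queries.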
Install: `cargo install localgpt`
I use it daily as a knowledge accumulator, research assistant, and autonomous task runner for my side projects. The memory compounds — every session makes the next one better.
GitHub: https://github.com/localgpt-app/localgpt Website: https://localgpt.app
Would love feedback on the architecture or feature ideas.