Most multiplayer worlds force a binary choice. Either you ship a fixed engine and let players decorate it (Minecraft, Roblox, VRChat): rich inhabitation, no language-driven content. Or you ship language-driven content with no inhabitation (AI Dungeon, character.ai): rich generativity, no body. Wander Around argues you don't have to choose. Underneath, a holographic substrate composes the world's atoms; on top, a small specialist model translates language into substrate operations. The substrate runs continuously at zero marginal cost; the model fires only when language needs resolving. That separation lets a $25/month server host an inhabitable, continuously evolving world at roughly one hundred-thousandth the cost of the frontier-lab stack.
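The separation described above can be sketched as a loop. This is a minimal illustration under assumed names, not the project's actual runtime: the substrate ticks every step at fixed cost, and the language model is invoked only when a player utterance arrives.

```python
import queue

def run_world(substrate_tick, resolve_language, utterances, ticks=1000):
    """Illustrative split of being from narration (hypothetical API).

    substrate_tick():       advances the world one step, no model involved.
    resolve_language(text): stands in for the specialist model; called on demand.
    utterances:             a queue.Queue of player inputs.
    Returns how many times the model actually fired.
    """
    model_calls = 0
    for _ in range(ticks):
        substrate_tick()                   # runs every step, model-free
        try:
            text = utterances.get_nowait() # a player said something
        except queue.Empty:
            continue                       # no language this step: no model cost
        resolve_language(text)             # the model fires only here
        model_calls += 1
    return model_calls
```

With an empty queue the world runs a thousand ticks and the model never fires, which is the whole economic point of the split.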
Three things had to converge. The mathematics of holographic reduced representations (Tony Plate, 1991) had to graduate from computational neuroscience into a usable substrate. Browser-side three-dimensional rendering (Three.js, WebGL2) had to become unremarkable on consumer hardware. And small language models had to become competent enough that a thirty-million-parameter domain specialist could outperform monolithic models of hundreds of billions of parameters on bounded tasks. All three crossed the threshold roughly between 2024 and 2026. The substrate paradigm is what falls out when you compose them.
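The core HRR operation Plate introduced is simple enough to show directly: bind two vectors with circular convolution, recover one with circular correlation. The dimensionality, seed, and vector names below are illustrative, not anything from the project's substrate.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1024  # illustrative dimensionality

def rand_vec():
    # components drawn so the expected vector norm is 1, as HRR assumes
    return rng.normal(0.0, 1.0 / np.sqrt(D), D)

def bind(a, b):
    # circular convolution, computed in the Fourier domain
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=D)

def unbind(trace, a):
    # circular correlation: an approximate inverse of bind
    return np.fft.irfft(np.conj(np.fft.rfft(a)) * np.fft.rfft(trace), n=D)

def cos(x, y):
    return float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y)))

role, filler, unrelated = rand_vec(), rand_vec(), rand_vec()
trace = bind(role, filler)          # one D-dim vector holds the pairing
recovered = unbind(trace, role)     # noisy copy of filler

# high similarity to the bound filler, near zero to anything else
print(cos(recovered, filler), cos(recovered, unrelated))
```

The decoded vector is noisy, so a real system follows unbinding with a clean-up lookup against known item vectors; the key property is that a single fixed-width vector can hold compositional structure.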
The code is public under the Prometheus7 organization (wander-around). Trained checkpoints are downloadable; the architecture is documented in a companion preprint.

The economic argument is the most strategically important one. Today's AI-driven simulation stacks (Inworld, Convai, NVIDIA ACE, Sony Soul) require model inference for every NPC action and world event; at scale that means tens of dollars per hour per server. The substrate paradigm decouples being from narration: the substrate metabolizes continuously at electricity cost, and the model fires only on user interaction. A world with ten thousand NPCs costs the same as one with three. Adding a new domain of capability is one bounded specialist trained over one weekend of GPU time, not a frontier-model retrain.
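The arithmetic behind that decoupling is worth making explicit. Every rate below is an illustrative assumption, not a measured figure from the project; the point is the shape of the two cost functions, not the numbers.

```python
# Back-of-envelope cost model; all rates are illustrative assumptions.
def per_inference_stack_cost(npcs, actions_per_npc_hour, usd_per_inference):
    # conventional stacks: every NPC action requires a model call,
    # so cost scales linearly with the NPC population
    return npcs * actions_per_npc_hour * usd_per_inference

def substrate_stack_cost(user_interactions_per_hour, usd_per_inference,
                         server_usd_per_hour):
    # substrate stack: the world metabolizes at fixed server cost;
    # the model fires only when a user's language needs resolving.
    # NPC count does not appear anywhere in this function.
    return server_usd_per_hour + user_interactions_per_hour * usd_per_inference

# 10,000 NPCs, 60 actions/hour each, at an assumed $0.0005 per model call
print(per_inference_stack_cost(10_000, 60, 0.0005))          # -> 300.0 USD/hour

# same world: 200 user interactions/hour on a $25/month (~$0.035/hour) server
print(round(substrate_stack_cost(200, 0.0005, 0.035), 3))    # -> 0.135 USD/hour
```

Because the NPC count never enters the second function, a ten-thousand-NPC world and a three-NPC world land on the same line item.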
This shifts who can ship a multiplayer simulated world from venture-capital-tier teams to two-person studios. The launch artifact you're about to see is the worked example.
The eleven runtime systems are deployed on production hardware, and 211 unit tests pass in 1.7 seconds. A 142-atom / 20-composer / 60-modifier operator vocabulary defines the world's compositional grammar. Eight pre-generated specialist training corpora span 993K (NL, op-tree) pairs across 1.4 GB. Nine V1 specialist recipes are scaffolded: eight architectural domains plus a horizontal language-depth fallback. Twelve Tier-2 cultural-domain specialists are queued (Norse, Greek, Egyptian, Japanese, marine, botanical, mineral, astronomy, medieval, Renaissance, Victorian, folkloric). The standalone Windows desktop build bundles Python and the FastAPI/Three.js stack into a single artifact with zero end-user prerequisites. The Stripe stack (donations, monthly residency subscriptions, signed webhooks) is wired and configurable with one setup script. The 88.2% pre-specialist baseline measures the keyword resolver's accuracy on a 76-prompt evaluation across the eight domains.
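To make the (NL, op-tree) pairs concrete, here is a hypothetical shape one training example might take. The real atom, composer, and modifier names come from the project's 142/20/60 operator vocabulary and are not reproduced here; every identifier below is invented for illustration.

```python
# One illustrative (NL, op-tree) training pair. All operator names are
# hypothetical stand-ins for the project's actual 142/20/60 vocabulary.
pair = {
    "nl": "a weathered stone bridge arching over the stream",
    "op_tree": {
        "composer": "arch_over",        # assumed composer name
        "modifiers": ["weathered"],     # assumed modifier name
        "args": [
            {"atom": "bridge", "modifiers": ["stone"]},  # assumed atoms
            {"atom": "stream", "modifiers": []},
        ],
    },
}

def leaf_atoms(node):
    """Collect the atoms an op-tree bottoms out in, depth-first."""
    if "atom" in node:
        return [node["atom"]]
    found = []
    for child in node.get("args", []):
        found.extend(leaf_atoms(child))
    return found

print(leaf_atoms(pair["op_tree"]))  # -> ['bridge', 'stream']
```

The specialist's whole job is the mapping from the "nl" string to a tree of this kind, which the substrate can then execute directly.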
The roadmap, in compressed form: V1 launch is one weekend of GPU rental away — nine specialists train in 4–8 wall-clock hours on a $72 H100. V1.5 adds the twelve cultural-domain specialists plus three meta-specialists (curator, paraphraser, edge-miner) that automate corpus generation and drop the marginal cost of adding a new specialist by an order of magnitude. V2 enables the residency layer (private servers listed on a public cartographer for $5/month), the universal constructor GUI, positional voice over WebRTC, and the federated multiplayer hub that turns wanderaround.io into a living map of inhabited worlds.
Wander Around is built at the Prometheus7 Institute, an independent research institute operating on rented commodity infrastructure. The substrate is descended from prior work on holographic reduced representations and vector-symbolic architectures; the mapping of the model's seven layers onto cortical organization is borrowed from cognitive neuroscience rather than invented. The project's larger ambitions are documented in the preprints and the architecture page.