| Action | Input |
|---|---|
| Walk | W A S D or arrow keys |
| Look around | Mouse (after click-to-capture) |
| Capture mouse | Click anywhere on the scene |
| Release mouse | Esc |
| Run | Hold Shift while walking |
| Jump | Space |
| Build | Type in the build console (bottom of screen), press Enter |
| Chat | Type in the chat input (top right), press Enter |
| Try a showcase | Click any chip in the showcase row |
The keyword resolver covers all eight architectural domains: geometry, render-style, rules, mechanical, perceptual, cognitive, commands, items. Specialists handle creative paraphrases when they're loaded; both produce grammar-checked op-trees that render the same way.
- a small Doric temple, half-ruined — a stack of plinth + colonnade + entablature + roof
- a Brutalist concrete tower — same shape, different style modifier bundle
- twelve ancient standing stones in a ring — arrange-around composer with weathered rock atoms
- an autumn forest of oaks — scatter composer with seasonal-tinted tree atoms
- a watermill driving a three-stage gear train — drive composer with mechanical primitives
- a grumpy librarian who knows herbalism — NPC atom with personality + domain modifiers
- render this in Paper Mario style — render-style atom binds at world scope
- render the world as it was in 1860 — time-shift perceptual atom
- glasses of night vision — wearable that binds a perceptual filter
- an iron sword of frost +3 — weapon item with modifiers
- PvP enabled with mutual consent only — constraint atom routed to the manifold
- show me the world as a blueprint — render-style atom

If the keyword resolver doesn't recognize what you typed, it returns its best guess (often a single column with whatever modifiers it could pull out). If a trained specialist is loaded, it produces something more thoughtful. Either way, the result passes the grammar checker before it touches the renderer — you'll never see geometry that contradicts itself.
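To make the fallback behavior concrete, here is a minimal sketch of how a keyword resolver with a best-guess default might work. Every name in it (`ATOMS`, `MODIFIERS`, `resolve`, the op-tree fields) is an illustrative assumption, not the real vocabulary or API.

```python
# Hypothetical keyword-resolver fallback: scan the input for known atom
# names, collect any recognized modifiers, and return a best-guess op-tree.
# The vocabulary below is a tiny illustrative subset, not the real one.

ATOMS = {"column": "geometry", "tree": "geometry",
         "glasses": "item", "sword": "item"}
MODIFIERS = {"weathered", "autumn", "frost", "grumpy"}

def resolve(text: str) -> dict:
    words = text.lower().split()
    atom = next((w for w in words if w in ATOMS), None)
    mods = [w for w in words if w in MODIFIERS]
    if atom is None:
        # Unrecognized input: fall back to a single column with
        # whatever modifiers could be pulled out, per the docs.
        return {"atom": "column", "family": "geometry", "modifiers": mods}
    return {"atom": atom, "family": ATOMS[atom], "modifiers": mods}
```

A loaded specialist would replace this lookup with a learned mapping, but the output contract (a grammar-checkable op-tree) stays the same.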
Sixteen pre-composed builds, one click each. They're a guided tour through the architecture's range; clicking through them in order takes about ninety seconds and shows you what the eleven compositional surfaces actually look like in motion.
The game connects to a multiplayer server automatically. Other players appear as humanoid silhouettes; they can see you walk past them; chat travels by proximity (30m radius for local, server-wide for world-scope). Builds you make broadcast to everyone connected. The world state persists — log out and back in tomorrow, the temple you built is still there.
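The proximity rule above can be sketched in a few lines. The 30 m local radius comes from the docs; the `route_chat` helper, player tuples, and scope names are assumptions for illustration.

```python
import math

# Illustrative chat routing: local chat reaches players within 30 m,
# world-scope chat reaches everyone connected. Names are assumed.
LOCAL_RADIUS = 30.0

def route_chat(sender_pos, players, scope="local"):
    """Return the ids of players who should receive the message."""
    if scope == "world":
        return [pid for pid, _ in players]  # server-wide broadcast
    return [pid for pid, pos in players
            if math.dist(sender_pos, pos) <= LOCAL_RADIUS]
```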
Items live in your inventory. Wearables (glasses, helms, boots, rings, amulets) can be equipped to slots. When you equip a wearable that binds a perceptual filter, the world's appearance changes — only for you. Wear sepia glasses and everything tints warm; wear x-ray glasses and walls become translucent; wear fog glasses and the scene's near/far fog clamps in close. State stays the same for everyone else.
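The key property here is that filters transform a client's view without touching shared state. A minimal sketch of that split, with an assumed sepia tint and an assumed `render_view` helper (neither is the game's actual code):

```python
# Sketch of the state/appearance split: shared world state is never
# mutated; each client derives its own view through equipped filters.
# The tint math and function names are illustrative assumptions.

def sepia(rgb):
    r, g, b = rgb
    gray = 0.3 * r + 0.59 * g + 0.11 * b
    return (min(255, int(gray * 1.07)), int(gray * 0.74), int(gray * 0.43))

def render_view(world_colors, equipped_filters):
    """Per-client view: shared colors pass through that client's filters."""
    view = list(world_colors)
    for f in equipped_filters:
        view = [f(c) for c in view]
    return view

shared = [(200, 120, 80)]
mine = render_view(shared, [sepia])   # warm-tinted, only for me
theirs = render_view(shared, [])      # unaltered for everyone else
```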
Server-spawned NPCs respond to nearby chat. They have personality (grumpy, kind, anxious, cheerful, melancholic, philosophical) and optional domain expertise (herbalism, smithing, navigation, mythology, music). Walk up to one and say hello; their reply is in character. Domain questions get domain answers. Each NPC remembers up to eight recent interactions; future versions extend this through the substrate's HANN layer.
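The "remembers up to eight recent interactions" behavior is exactly what a bounded deque gives you. A minimal sketch, assuming a hypothetical `NPCMemory` class (the real implementation and its HANN extension are not shown in the docs):

```python
from collections import deque

# Bounded interaction memory: appending past capacity silently drops
# the oldest entry. NPCMemory is an assumed name, not the game's class.
class NPCMemory:
    def __init__(self, capacity: int = 8):
        self.recent = deque(maxlen=capacity)

    def record(self, player_id: str, utterance: str) -> None:
        self.recent.append((player_id, utterance))

mem = NPCMemory()
for i in range(10):
    mem.record("player1", f"hello {i}")
# Only the eight most recent interactions survive.
```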
If you're hosting a server, you can bind constraint atoms to scopes via `POST /api/rules/{scope_key}`. The constraint manifold checks every action against the active rules and returns a structured trace if blocked.
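A hedged sketch of what such a call might look like. The endpoint shape comes from the docs; the payload fields (`atom`, `params`), host, and port are assumptions about what the server accepts, and the actual POST is left commented out.

```python
import json

# Build a constraint-binding request for POST /api/rules/{scope_key}.
# Payload fields are assumed, not the documented schema.
def rules_request(scope_key: str, constraint: dict,
                  host: str = "http://127.0.0.1:8060"):
    url = f"{host}/api/rules/{scope_key}"
    body = json.dumps(constraint)
    # e.g. requests.post(url, data=body,
    #                    headers={"Content-Type": "application/json"})
    return url, body

url, body = rules_request("world", {
    "atom": "pvp",
    "params": {"mode": "mutual_consent"},
})
```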
Server admins can set time-rate per scope (`POST /api/time/{scope_key}`) and mood per scope (`POST /api/mood/{scope_key}`). Time-rate scales how fast effective time flows. Mood drives ambient tint, music selection, and NPC behavior bias. Both compose with everything else through the same scope chain.
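Both admin endpoints follow the same pattern, so they can share one helper. The endpoint paths are from the docs; the field names (`rate`, `mood`) and scope keys are illustrative assumptions.

```python
import json

# Build per-scope admin requests for the time and mood endpoints.
# Field names and scope keys below are assumptions, not the real schema.
def admin_request(kind: str, scope_key: str, payload: dict,
                  host: str = "http://127.0.0.1:8060"):
    assert kind in ("time", "mood")
    return f"{host}/api/{kind}/{scope_key}", json.dumps(payload)

time_url, time_body = admin_request("time", "world", {"rate": 4.0})
mood_url, mood_body = admin_request("mood", "tavern", {"mood": "melancholic"})
```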
State and rendering are separate. Wear glasses; the world looks different to you, not to others. Twelve atoms in the perception vocabulary — noir, watercolor, ASCII, infrared, paper-Mario, voxel, 4D, and five more — all combinable. Other players see the unaltered world. This is the noumenal/phenomenal split made concrete: shared substrate, private perception.
Form a guild. Declare alliance with one. Rivalry with another. Combat composes around relations — the same atoms that build geometry also bind faction-membership and faction-relation. Servers can host multiple factions with declared standings; the relation graph drives PvP eligibility, NPC disposition, and quest-availability. Ships in the post-launch Phase 9 surface set.
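Since this surface ships post-launch, nothing here is final, but the "relation graph drives PvP eligibility" idea can be sketched speculatively. Every name below (faction names, `standing`, `pvp_eligible`) is an assumption.

```python
# Speculative sketch: an undirected relation graph between factions,
# queried to gate PvP. All names and standings are illustrative.
RELATIONS = {
    ("ravens", "wolves"): "rivalry",
    ("ravens", "owls"): "alliance",
}

def standing(a: str, b: str) -> str:
    return RELATIONS.get((a, b)) or RELATIONS.get((b, a)) or "neutral"

def pvp_eligible(a: str, b: str) -> bool:
    # One plausible policy: only declared rivals can fight.
    return standing(a, b) == "rivalry"
```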
NPCs know, believe, and suspect. Knowledge graphs are composed of the same atoms as the world. Rumors propagate when NPCs interact in proximity; lies stick; secrets resist sharing. Disposition gates what gets shared with whom. Players can plant rumors. Ships in the post-launch Phase 9 surface set.
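The "disposition gates what gets shared" rule could look something like this. As with the rest of the Phase 9 surface set, the thresholds and names here are speculative assumptions, not shipped behavior.

```python
# Speculative rumor-sharing gate: secrets need a much friendlier
# disposition to leak. Thresholds are illustrative, not the real values.
def maybe_share(rumor: dict, disposition: float) -> bool:
    """Share unless the rumor is a guarded secret or the NPC dislikes you."""
    threshold = 0.8 if rumor.get("secret") else 0.3
    return disposition >= threshold
```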
You can register new atoms at runtime: `POST /api/mods/atoms` with a JSON spec (`name`, `family`, `accepts_modifiers`, `params`). The grammar accepts them immediately; the data-gen pipeline can sample them; trained specialists can be retrained against the extended vocabulary. The substrate composes its own composability.
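An example registration payload. The four fields come from the docs; the concrete values (an "obelisk" atom and its parameters) are made up for illustration, and the POST itself is left as a comment.

```python
import json

# Example runtime atom spec for POST /api/mods/atoms. The field names
# are documented; the values are illustrative assumptions.
atom_spec = {
    "name": "obelisk",
    "family": "geometry",
    "accepts_modifiers": ["weathered", "glowing"],
    "params": {"height": 12.0, "base_width": 2.0},
}
body = json.dumps(atom_spec)
# requests.post("http://127.0.0.1:8060/api/mods/atoms", data=body,
#               headers={"Content-Type": "application/json"})
```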
Yes. The downloadable Electron build runs a local server in-process and gives you a private world. Multiplayer requires a hosted server.
Yes. The server is a single Python process: `uvicorn src.server.index:app --host 127.0.0.1 --port 8060`. The repo includes systemd units for production deployment.
Yes — see Modding above. Or fork the repo and add atoms to the operator vocabulary directly. The grammar accepts both static and dynamically-registered atoms equivalently.
Builds you make on a private server are private. Builds you make on a public server are visible to everyone connected and saved in the world state. The architecture supports per-server visibility scopes; specific server policy is up to whoever is running it.
The launch server (the one you connect to from `/play/`) stores: your account email (for magic-link login), your `player_id`, the op-trees you appended, and your most recent position. No other personal data. Everything is on infrastructure operated by the Prometheus7 Institute. Account deletion and data export are supported via the auth API.
At launch: a keyword resolver covers all eight architectural domains. After the first specialist training run, the cognitive, geometry, render-style, constraint, mechanical, perceptual, item, and command specialists hot-load into the live server. The frontend never changes; the resolver just gets smarter.
The architecture is documented on the architecture page. The repository hosts the source and issue tracker (link from the changelog).