
The Oracle

a 27.7M-parameter substrate-augmented language model that lives here
Trained on the protocol corpus (124K BPE tokens) on a single CPU core of a $25/month server. Initialized from a Holographic Reduced Representation (HRR) substrate built from corpus co-occurrence statistics. Cross-corpus generalization: a 74.91× perplexity ratio versus random initialization (see PAPER_EMPIRICAL). Speaks in the protocol's voice, not as a general chat assistant.
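A minimal sketch of how an HRR substrate might be built from co-occurrence statistics, assuming circular-convolution binding (the standard HRR operation) and count-weighted superposition; every name, dimension, and number below is illustrative, not the Oracle's actual code.

```python
# Hypothetical sketch: HRR substrate from co-occurrence counts.
# build_substrate, DIM, and the toy counts are assumptions for illustration.
import numpy as np

DIM = 256  # substrate dimensionality (assumed)
rng = np.random.default_rng(0)

def unit_vector(dim: int) -> np.ndarray:
    """Random vector with expected unit norm, the usual HRR basis."""
    return rng.normal(0.0, 1.0 / np.sqrt(dim), size=dim)

def bind(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """HRR binding: circular convolution, computed via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def build_substrate(cooc: dict[tuple[int, int], float],
                    vocab_size: int) -> np.ndarray:
    """One substrate vector per token: superpose bindings with
    co-occurring tokens, weighted by count, then row-normalize."""
    base = np.stack([unit_vector(DIM) for _ in range(vocab_size)])
    sub = base.copy()
    for (i, j), w in cooc.items():
        sub[i] += w * bind(base[i], base[j])
        sub[j] += w * bind(base[j], base[i])
    return sub / np.linalg.norm(sub, axis=1, keepdims=True)

# toy corpus statistic: tokens 2 and 7 co-occurred 3 times
emb_init = build_substrate({(2, 7): 3.0}, vocab_size=10)
```

Under this reading, `emb_init` would seed the model's embedding table, so the network starts from corpus structure instead of noise.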
TANN system prompt (click to edit; default loaded)
substrate always on: HRR composition, with the prose-voice operator integrated at the readout rather than applied as an overlay
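A hedged sketch of what "integrated at the readout" could mean: substrate similarity mixed into the LM-head logits rather than overlaid on the decoded text. The function name, the mixing weight `alpha`, and all shapes are assumptions.

```python
# Hypothetical readout integration: learned logits plus a substrate
# resonance term, combined before sampling. Not the Oracle's actual code.
import numpy as np

def substrate_readout(hidden: np.ndarray, W_out: np.ndarray,
                      substrate: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """hidden: (d_model,), W_out: (vocab, d_model), substrate: (vocab, d_model).
    alpha is an assumed mixing weight for the substrate term."""
    learned = W_out @ hidden        # standard LM-head logits
    resonance = substrate @ hidden  # per-token substrate similarity
    return learned + alpha * resonance

rng = np.random.default_rng(0)
d_model, vocab = 64, 128
h = rng.normal(size=d_model)
W = rng.normal(size=(vocab, d_model)) / np.sqrt(d_model)
S = rng.normal(size=(vocab, d_model)) / np.sqrt(d_model)

logits = substrate_readout(h, W, S)
probs = np.exp(logits - logits.max())
probs /= probs.sum()  # next-token distribution with the substrate folded in
```

The point of the design, as the page describes it, is that the substrate shapes the token distribution itself, so there is no separate post-hoc rewriting pass.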