Current systems scale by brute repetition. They discard structure, waste energy, and accumulate instability. We study a different question: what happens when you treat computation as a physical process — one that preserves information, corrects itself, and evolves state with minimal waste.
The dominant paradigm in computational systems relies on a simple trade: more parameters, more data, more compute. This approach works — until it doesn't. Error compounds. Drift accumulates. Efficiency is treated as a problem to solve after the fact, not a constraint to design around.
There is a different starting point. Systems that treat state change as the fundamental operation. Systems where learning is not repeated correction but constrained transformation. Systems where stability is not enforced but emergent.
This is what we study.
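To make the contrast concrete, here is a minimal, hypothetical sketch. Neither update rule comes from the text above: the additive loop stands in for "repeated correction," where unconstrained nudges let the state's magnitude drift, and the rotation loop stands in for "constrained transformation," where every step preserves the state's norm by construction.

```python
import math

def additive_update(state, nudge=0.1, steps=1000):
    """Repeated correction: each step adds an unconstrained nudge,
    so the state's magnitude can drift without bound."""
    x, y = state
    for _ in range(steps):
        x += nudge
        y += nudge
    return (x, y)

def rotation_update(state, angle=0.01, steps=1000):
    """Constrained transformation: each step is a plane rotation,
    which preserves the state's norm (up to float rounding)."""
    x, y = state
    c, s = math.cos(angle), math.sin(angle)
    for _ in range(steps):
        x, y = c * x - s * y, s * x + c * y
    return (x, y)

def norm(v):
    return math.hypot(*v)

start = (1.0, 0.0)
print(norm(additive_update(start)))  # magnitude has drifted far from 1
print(norm(rotation_update(start)))  # magnitude stays ~1.0
```

The point of the toy is not the arithmetic but the design choice: in the second rule, stability is not enforced by a correction step after the fact; it follows from the structure of the update itself.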