Computation is state evolution.

Current systems scale by brute repetition. They discard structure, waste energy, and accumulate instability. We study a different question: what happens when you treat computation as a physical process, one that preserves information, corrects itself, and evolves state with minimal waste?

Scale alone is not a solution.

The dominant paradigm in computational systems relies on a simple trade: more parameters, more data, more compute. This approach works — until it doesn't. Error compounds. Drift accumulates. Efficiency is treated as a problem to solve after the fact, not a constraint to design around.

There is a different starting point. Systems that treat state change as the fundamental operation. Systems where learning is not repeated correction but constrained transformation. Where stability is not enforced but emergent.

This is what we study.

Five areas of focus.

01
Learning Systems
How models update state over time without losing information. Structure-preserving transformations instead of gradient descent alone.
02
Execution Substrates
New runtimes that treat computation as evolving state, not instruction sequences.
03
Control Systems
Continuous correction instead of periodic adjustment. Feedback embedded in the system, not bolted on.
04
Physical Modeling
Mapping computation directly to real-world processes. The boundary between simulation and system dissolves.
05
Distributed Coordination
Many small systems behaving as one coherent whole. Consensus without centralization.

View all research areas →

Recent essays.

Computation and Waste
How current systems discard information, and why efficiency must be structural.
Why Stability Matters
Long-horizon behavior, drift, and what physical systems teach about equilibrium.
Learning as Evolution
Reframing learning as state evolution, not repeated correction.

All essays →