Experiments
V12: Attention-Based Lenia
Addition: State-dependent interaction topology (evolvable attention kernels).
Result: an increase in 42% of cycles (vs. 3% for the convolution baseline). A +2.0pp shift, the largest single-intervention effect observed. But robustness stabilizes near 1.0.
Implication: Attention is necessary but not sufficient. The system reaches the integration threshold without crossing it.
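The state-dependent interaction topology described above can be sketched as follows: instead of a fixed convolution kernel, each cell's neighborhood weights come from a softmax over query-key scores derived from local state. This is a minimal, hypothetical sketch, not the repository's implementation; the scalar projections `Wq` and `Wk` and the Gaussian growth parameters are illustrative assumptions.

```python
import numpy as np

def growth(u, mu=0.15, sigma=0.015):
    """Lenia-style Gaussian growth mapping into [-1, 1]."""
    return 2.0 * np.exp(-((u - mu) ** 2) / (2 * sigma ** 2)) - 1.0

def attention_step(grid, Wq=1.0, Wk=1.0, radius=3, dt=0.1):
    """One update of a toy attention-based Lenia.

    Each cell attends over its neighborhood: attention logits are
    products of a query (from the center state) and keys (from
    neighbor states), so the effective kernel changes with the state.
    Wq and Wk are hypothetical scalar projection parameters; in an
    evolvable setting they would be part of the genome.
    """
    offsets = [(dy, dx)
               for dy in range(-radius, radius + 1)
               for dx in range(-radius, radius + 1)
               if (dy, dx) != (0, 0)]
    # Stack shifted copies of the grid: one layer per neighbor offset.
    neigh = np.stack([np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                      for dy, dx in offsets])          # (K, H, W)
    q = Wq * grid                                      # queries from center state
    k = Wk * neigh                                     # keys from neighbor states
    scores = q[None] * k                               # (K, H, W) attention logits
    scores -= scores.max(axis=0, keepdims=True)        # stabilize the softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=0, keepdims=True)            # softmax over neighbors
    u = (attn * neigh).sum(axis=0)                     # attention-weighted potential
    return np.clip(grid + dt * growth(u), 0.0, 1.0)
```

A convolutional Lenia would replace `attn` with fixed weights; here the weights are recomputed from the state every step, which is what makes the interaction topology state-dependent.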
Source code
v12_substrate_attention.py — Attention kernel implementation
v12_evolution.py — Evolution loop
v12_run.py — CLI runner