V35: Language Emergence
Status: Complete (10 seeds).
Hypothesis: Language emerges when: (1) partial observability creates information asymmetry (obs_radius=1), (2) discrete channel forces categorical representation (K=8 symbols), (3) cooperative pressure rewards signaling (1.5x bonus for co-consumption), (4) communication range exceeds visual range (comm_radius=5 > obs_radius=1).
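A minimal sketch of how these four conditions could be expressed as a substrate configuration, assuming field names (obs_radius, n_symbols, coop_bonus, comm_radius) chosen for illustration rather than taken from the actual v35_substrate.py API:

```python
from dataclasses import dataclass

@dataclass
class V35Config:
    """Hypothesized conditions for language emergence (field names are illustrative)."""
    obs_radius: int = 1       # (1) partial observability: agents see only adjacent cells
    n_symbols: int = 8        # (2) discrete channel: K=8 categorical symbols
    coop_bonus: float = 1.5   # (3) cooperative pressure: reward multiplier for co-consumption
    comm_radius: int = 5      # (4) communication range exceeds visual range

    def __post_init__(self):
        # The hypothesis requires signals to carry information the receiver cannot see directly.
        assert self.comm_radius > self.obs_radius, "comm range must exceed visual range"
```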
Result: Referential communication emerges in 10/10 seeds (100%). Mean symbol entropy 2.48 ± 0.14 bits (83% of maximum, range 2.18–2.67). Resource MI proxy 0.001–0.005 (all positive). All 8 symbols maintained in active use. This breaks the V20b null where continuous z-gate signals never departed from 0.5. Discrete symbols under partial observability and cooperative pressure produce referential communication as an inevitability, not a rarity.
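The entropy figures follow from symbol-usage counts alone; a sketch under the assumption that emitted symbols are logged as integer IDs (the helper below is hypothetical, not part of v35_evolution.py):

```python
import numpy as np

def symbol_entropy(symbols: np.ndarray, n_symbols: int = 8) -> tuple[float, float]:
    """Return (Shannon entropy in bits, fraction of the log2(K) maximum)."""
    counts = np.bincount(symbols, minlength=n_symbols).astype(float)
    p = counts / counts.sum()
    p = p[p > 0]                        # drop unused symbols to avoid log(0)
    h = -np.sum(p * np.log2(p))         # entropy in bits
    return h, h / np.log2(n_symbols)    # e.g. 2.48 bits is ~83% of the 3-bit maximum

# Example: near-uniform use of all 8 symbols gives entropy close to 3 bits.
emissions = np.random.randint(0, 8, size=10_000)
h, frac = symbol_entropy(emissions)
```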
But communication does NOT lift integration. Mean comm-ablation lift ≈ 0. Late integration sits below the V27 baseline (0.090). Distribution: 0 HIGH / 7 MOD / 3 LOW. The integration-MI correlation is null: language and integration are orthogonal. Communication neither helps nor hurts; it operates on a different axis entirely.
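The comm-ablation lift is the gap in the integration score between a normal rollout and one with the symbol channel silenced; a minimal sketch, assuming an evaluate(seed, mask_comm) function that returns the integration metric (both the name and the flag are assumptions, not the real v35_gpu_run.py interface):

```python
def comm_ablation_lift(evaluate, seeds):
    """Mean drop in integration when the symbol channel is masked.

    `evaluate(seed, mask_comm)` is an assumed interface returning the integration
    score for one evolved population; a lift near zero means communication
    neither helps nor hurts integration.
    """
    lifts = []
    for seed in seeds:
        intact = evaluate(seed, mask_comm=False)   # normal rollout with symbols
        ablated = evaluate(seed, mask_comm=True)   # symbols replaced with a null token
        lifts.append(intact - ablated)
    return sum(lifts) / len(lifts)
```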
Language is cheap. Like affect geometry, referential communication emerges under minimal conditions — partial observability plus cooperative pressure. It sits at rung 4–5 of the emergence ladder. Language does not create dynamics any more than geometry does. The expensive transition remains at rung 8, requiring embodied agency and gradient coupling. Adding communication channels does not help cross it.
Source code
v35_substrate.py — Discrete communication + cooperative dynamics
v35_evolution.py — Evolution loop with communication metrics
v35_gpu_run.py — GPU runner (10 seeds)