AI · 8 min read · April 28, 2026
Neural Networks and ODEs Compute Primitive Recursion via Dynamics, Not Composition
Bournez proves that recurrent ReLU networks, polynomial ODEs, and discrete polynomial maps all express primitive recursive functions through dynamical trajectories rather than symbolic subroutine chaining.
Recurrent ReLU networks, polynomial ODEs, and polynomial maps equivalently compute primitive recursion via bounded iteration and continuous dynamics.
- All three frameworks (RNNs, polynomial ODEs, discrete maps) express primitive recursive functions through fixed dynamical systems.
- Composition emerges from trajectory evolution, not from explicit closure rules or subroutine calls.
- Time bounds are themselves primitive recursive; inputs are raw integer vectors.
- Polynomial ODEs robustly perform rounding and phase selection via continuous flow; fixed polynomial maps cannot.
- ReLU gates enable exact branching; step-size parameters in discrete maps recover continuous-time benefits with discretization trade-offs.
- These models shape dynamical trajectories through clocks and error correction, structurally unlike symbolic programming.
- The framework enables studying subrecursive hierarchies by restricting time, polynomial degree, or discretization resources.
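The "exact branching with ReLU" claim above can be made concrete with a small sketch. This is an illustrative toy, not Bournez's actual construction: the helper names `relu_select` and `mul_via_iteration` are hypothetical, and the saturation constant `BOUND` is an assumed magnitude limit on the state.

```python
def relu(x):
    """Rectified linear unit: max(x, 0)."""
    return max(x, 0.0)

BOUND = 1e6  # assumed bound on |a|, |b|; not from the paper

def relu_select(g, a, b):
    """Exact branch using only affine maps and ReLU:
    returns a when g == 1, b when g == 0 (for |a|, |b| < BOUND)."""
    return (relu(a - BOUND * (1 - g)) - relu(-a - BOUND * (1 - g))
            + relu(b - BOUND * g) - relu(-b - BOUND * g))

def mul_via_iteration(m, n):
    """Compute m * n by iterating one fixed ReLU update n times,
    mimicking primitive recursion as bounded iteration of a map."""
    acc, cnt = 0.0, float(n)
    for _ in range(int(n)):            # the time bound n is itself primitive recursive
        g = relu(cnt) - relu(cnt - 1)  # gate: 1 while cnt >= 1, 0 once cnt hits 0
        acc = acc + relu_select(g, float(m), 0.0)
        cnt = relu(cnt - 1)
    return acc
```

The point of the sketch is that the update rule is a single fixed network; the "program" lives entirely in how long the trajectory is allowed to run.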
Frequently asked
- Can a fixed polynomial ODE compute any primitive recursive function? Yes, according to Bournez's theorem. Any primitive recursive function can be compiled into a fixed polynomial ODE system with bounded iteration. The ODE operates on real-valued states and uses continuous-time flow to perform rounding and phase selection robustly, though the time bound itself must be primitive recursive.
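A basic ingredient in ODE compilations of this kind is a "targeting" equation whose polynomial flow drives a state toward a goal value; gating such terms on and off with a clock yields phase selection and robust rounding. The sketch below is a minimal forward-Euler illustration of that building block only; the function name, gain `c`, step size `dt`, and horizon are assumed for the demo, not values from the paper.

```python
def target_flow(goal, y0=0.0, c=1000.0, dt=1e-4, steps=100_000):
    """Integrate y' = c * (goal - y)**3, a polynomial ODE whose
    flow converges toward `goal` from any initial condition y0."""
    y = y0
    for _ in range(steps):
        y += dt * c * (goal - y) ** 3  # forward Euler step of the polynomial field
    return y
```

The error e = goal - y obeys e' = -c e^3, so |e(t)| = |e0| / sqrt(1 + 2 c e0^2 t): convergence is only polynomial in t, which is one reason the overall time bounds in such constructions must themselves be primitive recursive.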