Hierarchical self-programming in recurrent neural networks
/ Abstract
We study self-programming in recurrent neural networks where both neurons (the ‘processors’) and synaptic interactions (the ‘programme’) evolve in time simultaneously, according to specific coupled stochastic equations. The interactions are divided into a hierarchy of L groups with adiabatically separated and monotonically increasing time-scales, representing sub-routines of the system programme of decreasing volatility. We solve this model in equilibrium, assuming ergodicity at every level, and find as our replica-symmetric solution a formalism whose structure is similar, but not identical, to Parisi’s L-step replica symmetry breaking scheme. Apart from differences in the details of the equations (due to the fact that here interactions, rather than spins, are grouped into clusters with different time-scales), in the present model the block sizes m_i of the emerging ultrametric solution are not restricted to the interval [0, 1], but are independent control parameters, defined in terms of the noise strengths of the various levels in the hierarchy, which can take any value in [0, ∞). This is shown to lead to extremely rich phase diagrams, with an abundance of first-order transitions, especially when the level of stochasticity in the interaction dynamics is chosen to be low.
Journal: Journal of Physics A