Noise dynamically suppresses chaos in random neural networks

Noise is ubiquitous in neural systems due to intrinsic stochasticity or external drive. For deterministic dynamics, randomly coupled neural networks display a transition to chaos at a critical coupling strength. Here, we investigate the effect of additive white noise on the onset of chaos. We develop a dynamical mean-field theory that yields the statistics of the activity and the maximum Lyapunov exponent. An exact condition determines the transition from stable to chaotic dynamics. Noise suppresses chaos by a dynamic mechanism, shifting the transition to significantly larger coupling strengths than predicted by local stability analysis. A regime emerges where expansive dynamics and stable long-term behavior coexist. Furthermore, the time scale of the temporal correlations does not diverge at the transition, but peaks slightly above the critical coupling strength.

Collective phenomena that emerge in systems with many interacting degrees of freedom are a central subject of physics. Large random networks of neuron-like units are a prominent example, as they can exhibit collective chaotic dynamics [1][2][3]. Their information processing capabilities have been a focus in neuroscience [4] and in machine learning [5] and show optimal performance close to the onset of chaos [6][7][8].
For deterministic and autonomous dynamics, networks of randomly coupled rate, i.e., continuous-variable, neurons display a transition from a fixed point to chaotic fluctuations at a critical coupling strength [1]. This is illustrated in Figure 1a. The transition is well understood by a dynamical mean-field theory, originally developed for spin glasses [9,10]. The onset of chaos is equivalent to the emergence of a decaying autocorrelation function, whose decay time diverges at the transition. This equivalence has been used in several subsequent studies [11][12][13]. Furthermore, the transition happens precisely when the fixed point becomes linearly unstable, which identifies the spectral radius of the random connectivity matrix [14,15] as the parameter controlling the transition.
In nature and technology, neural networks often operate in the presence of noise or time-dependent drive, rendering their dynamics stochastic or nonautonomous. In the presence of noise, a decaying autocorrelation function does not indicate chaos, since the noisy fluctuations cause perpetual decorrelation also for stable dynamics (Figure 1b). Therefore, the transition to chaos, if existent at all, must be qualitatively different from the noiseless case.
Whether the instability of deterministic rate dynamics explains a transition to strongly fluctuating firing rates in networks of spiking neurons is a matter of debate [16][17][18]. For the analysis of oscillations [19] and correlations [20,21] in these networks, the irregular spiking activity of the neurons can be approximated by Poisson processes, whereby the realization of the spikes amounts to a source of noise. Recently, Kadmon and Sompolinsky [22] showed that already weak spiking noise smooths the transition to chaos in networks of stochastically spiking neurons. However, the mechanism by which noise of arbitrary strength affects the maximum Lyapunov exponent and the location of the transition is not understood.
In random neural networks with simpler discrete-time dynamics, chaos is suppressed by noise [23]. However, the neuron dynamics does not possess temporal memory, and the corresponding mean-field theory shows that the chaotic fluctuations are temporally uncorrelated even without noise. Therefore, these results cannot be transferred to continuous-time dynamics with memory, where the temporal correlations play an important role.
To investigate the generic influence of noise on the transition to chaos, we add white noise to the seminal model by Sompolinsky et al. [1] and develop the dynamical mean-field theory for the resulting stochastic continuous-time dynamics. The autocorrelation function is given by the motion of a classical particle in a potential, where the noise amounts to the initial kinetic energy. We then determine the maximum Lyapunov exponent [24,25] and derive an exact condition for the transition from stable to chaotic dynamics. Noise suppresses chaos significantly more strongly than expected from its effect on local linear stability, which yields only a necessary condition. This is explained by a dynamic effect: by sharpening the autocorrelation function, noise decreases the maximum Lyapunov exponent. The transition occurs precisely when the curvature of the autocorrelation function at time-lag zero changes its sign from positive to negative. Furthermore, the decay time of the autocorrelation function does not diverge at the transition. Its peak is strongly reduced by the noise and occurs slightly above the critical coupling strength.
We study the continuous-time dynamics of a random network of N neurons, whose states x_i(t) ∈ ℝ evolve according to the system of stochastic differential equations

ẋ_i(t) = −x_i(t) + Σ_{j=1}^N J_ij φ(x_j(t)) + ξ_i(t),  i = 1, . . . , N. (1)

The J_ij are independent and identically Gaussian distributed random coupling weights with zero mean and variance g²/N, where the intensive gain parameter g controls the recurrent coupling strength or, equivalently, the weight heterogeneity of the network. The ξ_i(t) are independent Gaussian white noise processes with autocorrelation function ⟨ξ_i(t)ξ_i(s)⟩ = 2σ² δ(t − s). We choose the sigmoidal transfer function φ(x) = tanh(x), so that without noise, i.e., for σ = 0, the model agrees with the one studied in [1].
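As an illustration, the stochastic dynamics (1) can be integrated with a simple Euler–Maruyama scheme. This is a minimal sketch under the model definition above, not the simulation code used for the figures; all function names and parameter choices are illustrative.

```python
import numpy as np

# Minimal Euler-Maruyama sketch of the network dynamics (1):
#   dx_i = (-x_i + sum_j J_ij * tanh(x_j)) dt + sqrt(2) * sigma * dW_i,
# where the factor sqrt(2)*sigma follows from <xi_i(t) xi_i(s)> = 2 sigma^2 delta(t-s).
# Names and parameters are illustrative, not from the paper.

def simulate_network(N=200, g=1.7, sigma=0.35, dt=0.01, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    # i.i.d. Gaussian couplings with zero mean and variance g^2 / N
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    traj = np.empty((steps, N))
    for t in range(steps):
        drift = -x + J @ np.tanh(x)
        x = x + drift * dt + np.sqrt(2.0 * dt) * sigma * rng.normal(size=N)
        traj[t] = x
    return traj
```

For instance, `simulate_network(N=1000, g=1.7, sigma=0.35)` should qualitatively reproduce noisy super-critical trajectories of the kind shown in Figure 1.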
The dynamical system (1) contains two sources of randomness: quenched disorder due to the random coupling weights and temporally fluctuating noise. A particular realization of the random couplings J_ij defines a fixed network configuration, and its dynamical properties usually vary between different realizations. For large network size N, however, certain quantities are self-averaging, meaning that their values for a typical realization can be obtained by an average over network configurations [26]. An important example is the population-averaged autocorrelation function.
The dynamical mean-field theory describes, in the limit of N → ∞, the statistical properties of the system under the joint distribution of disorder, noise and possibly random initial conditions. The theory can be derived via a heuristic "local chaos" assumption [27] or using a generating functional formulation [10,28], while a rigorous proof employs large deviation techniques [29]. The general idea is that for large network size N the local recurrent input Σ_{j=1}^N J_ij φ(x_j) in (1) approaches a Gaussian process η(t) with self-consistently determined statistics. As a result we obtain the dynamical mean-field equation characterizing the statistics of a typical neuron:

ẋ(t) = −x(t) + η(t) + ξ(t). (2)

Here, ξ(t) is a Gaussian white noise process as in (1), independent of η(t). Because the random couplings J_ij have zero mean, the Gaussian process η(t) is centered, ⟨η(t)⟩ = 0, and thus fully specified by its autocorrelation function

⟨η(t)η(s)⟩ = g² ⟨φ(x(t)) φ(x(s))⟩. (3)

The expectation on the right-hand side is taken with respect to the distribution of x(t).
Our goal is to determine the mean-field autocorrelation function ⟨x(t)x(s)⟩, which, by self-averaging, also describes the population-averaged autocorrelation function.

Figure 1: Solution of the mean-field theory. Noiseless case σ = 0 (left column) and noisy case σ = √0.125 (right column). Upper row: Simulated trajectories of two example neurons for sub-critical g = 0.5 (upper part of vertical axis) and super-critical g = 1.7 (lower part of vertical axis). Middle row: Classical potential (7) with self-consistent variance c_0 following from energy conservation (8) for different coupling strengths g (corresponding legends in lower row); in the noisy case the critical value g_c = 1.48 from (12) is shown in red. Dashed line represents initial kinetic energy E_kin = σ⁴/2. Lower row: Corresponding self-consistent autocorrelation function (solid line) compared to simulations (crosses). The variance c_0 is given by the largest value of c at which the potential (middle row) is defined, indicated for g = 1.7 with gray dotted lines in d and f. Network size in simulations is N = 10000.

Assuming that x(t) is a stationary process, c(τ) = ⟨x(t + τ)x(t)⟩ obeys the differential equation

c̈(τ) = c(τ) − g² f_φ(c(τ), c_0) − 2σ² δ(τ), (4)

with c_0 = c(0). The Dirac-δ inhomogeneity originates from the white-noise autocorrelation function and is absent in [1]. The same inhomogeneity arises from Poisson spiking noise with 2σ² = g² r [22], where r is the population-averaged firing rate. In (4) we write

f_u(c, c_0) = ∫Dz [ ∫Dx u( √(c_0 − c) x + √c z ) ]², (5)

with a function u(x) and the Gaussian integration measure Dz = exp(−z²/2)/√(2π) dz. This representation holds since x(t) is itself a Gaussian process. Note that (5) reduces to one-dimensional integrals for f_u(c_0, c_0) = ⟨u(x)²⟩ and f_u(0, c_0) = ⟨u(x)⟩², where x has zero mean and variance c_0.
We formulate (4) as the one-dimensional motion of a classical particle in a potential:

c̈(τ) = −∂V(c; c_0)/∂c − 2σ² δ(τ), (6)

where we define

V(c; c_0) = −c²/2 + g² f_Φ(c, c_0) − g² f_Φ(0, c_0), (7)

with Φ(x) = ∫₀ˣ φ(y) dy and ∂f_Φ(c, c_0)/∂c = f_φ(c, c_0) following from Price's theorem [30]. The potential (7) depends on the initial value c_0, which has to be determined self-consistently. We obtain c_0 from classical energy conservation, ċ²/2 + V(c) = const. Considering τ ≥ 0 and the symmetry of c(τ), the noise term in (6) amounts to an initial velocity ċ(0+) = −σ² and thus to the kinetic energy ċ²(0+)/2 = σ⁴/2. Since |c(τ)| ≤ c_0, the solution c(τ) and its first derivative must approach zero as τ → ∞. Thus we obtain the self-consistency condition for c_0 as

σ⁴/2 + V(c_0; c_0) = 0. (8)

For the noiseless case, Figure 1c,e shows the resulting potential and the corresponding self-consistent autocorrelation function c(τ) in the chaotic regime. Approaching the transition from above, g → g_c = 1, the amplitude c_0 vanishes and the decay time of c(τ) diverges [1]. This picture breaks down in the noisy case (Figure 1d,f), where c_0 is always nonzero and c(τ) decays and exhibits a kink at τ = 0. The mean-field prediction is in excellent agreement with the population-averaged autocorrelation function obtained from numerical simulations, showing that the self-averaging property is fulfilled.

In the following we derive a condition for the transition from stable to chaotic dynamics in the presence of noise. The maximum Lyapunov exponent quantifies how sensitively the dynamics depend on the initial conditions [24]. It measures the asymptotic growth rate of infinitesimal perturbations. For stochastic dynamics the stability of the solution for a fixed realization of the noise is also characterized by the maximum Lyapunov exponent [33]: If it is negative, trajectories with different initial conditions converge to the same time-dependent solution, i.e., the dynamics is stable.
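Numerically, the self-consistency condition (8) reduces to one-dimensional root finding, because at c = c_0 and c = 0 the potential (7) involves only one-dimensional Gaussian integrals: V(c_0; c_0) = −c_0²/2 + g² (⟨Φ(x)²⟩ − ⟨Φ(x)⟩²) with x ~ N(0, c_0). The sketch below, with illustrative function names, assumes φ = tanh (so Φ(x) = ln cosh x) and solves (8) by bisection using Gauss–Hermite quadrature.

```python
import numpy as np

# Hedged sketch: solve the self-consistency condition (8),
#   sigma^4/2 + V(c0; c0) = 0,
# for the stationary variance c0, with
#   V(c0; c0) = -c0^2/2 + g^2 * (<Phi(x)^2> - <Phi(x)>^2),  x ~ N(0, c0),
# and Phi(x) = log cosh(x). Gaussian averages use probabilists'
# Gauss-Hermite quadrature. Names are illustrative.

_nodes, _weights = np.polynomial.hermite_e.hermegauss(80)
_weights = _weights / np.sqrt(2.0 * np.pi)  # normalize so weights sum to 1

def gauss_avg(f, var):
    """<f(x)> for x ~ N(0, var) by quadrature."""
    return np.sum(_weights * f(np.sqrt(var) * _nodes))

def energy_balance(c0, g, sigma):
    """sigma^4/2 + V(c0; c0); its root in c0 is the self-consistent variance."""
    Phi = lambda x: np.log(np.cosh(x))
    var_Phi = gauss_avg(lambda x: Phi(x)**2, c0) - gauss_avg(Phi, c0)**2
    return sigma**4 / 2.0 - c0**2 / 2.0 + g**2 * var_Phi

def solve_c0(g, sigma, c_hi=50.0):
    lo, hi = 1e-12, c_hi  # balance is > 0 at lo (for sigma > 0) and < 0 at hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if energy_balance(mid, g, sigma) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Consistent with the static effect discussed below, the resulting c_0 grows with the noise intensity σ.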
If it is positive, the distance between two initially arbitrarily close trajectories grows exponentially in time, i.e., the dynamics exhibits sensitive dependence on the initial conditions and is hence chaotic. The linear stability of the dynamical system (1) is analyzed via the variational equation

ẏ_i(t) = −y_i(t) + Σ_{j=1}^N J_ij φ′(x_j(t)) y_j(t),  i = 1, . . . , N, (9)

describing the temporal evolution of an infinitesimal deviation y_i(t) about a reference trajectory x_i(t). The maximum Lyapunov exponent can then be defined as

λ_max = lim_{t→∞} (1/(2t)) ln Σ_{i=1}^N y_i²(t), (10)

if a generic initial perturbation y_i(0) with Σ_{i=1}^N y_i²(0) = 1 is chosen.
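In simulations, (10) can be estimated by integrating (1) together with the variational equation (9) and renormalizing the perturbation at every step while accumulating the log growth factors, in the spirit of the standard algorithm of [24]. A minimal sketch with illustrative names and parameters:

```python
import numpy as np

# Hedged sketch of a Benettin-style estimate of lambda_max:
# co-integrate the network (1) and its variational equation (9),
# renormalize the perturbation y each step, accumulate log growth.
# Parameter choices and names are illustrative, not from the paper.

def max_lyapunov(N=200, g=1.7, sigma=0.0, dt=0.01, steps=20000, seed=1):
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(size=N)
    y = rng.normal(size=N)
    y /= np.linalg.norm(y)          # unit initial perturbation
    log_growth = 0.0
    for _ in range(steps):
        x = x + (-x + J @ np.tanh(x)) * dt \
              + np.sqrt(2.0 * dt) * sigma * rng.normal(size=N)
        # variational step with phi'(x) = 1 - tanh(x)^2
        y = y + (-y + J @ ((1.0 - np.tanh(x)**2) * y)) * dt
        norm = np.linalg.norm(y)
        log_growth += np.log(norm)
        y /= norm
    return log_growth / (steps * dt)
```

For sub-critical coupling (e.g., g = 0.5, σ = 0) the estimate is negative; deep in the chaotic regime it is positive.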
Lyapunov exponents usually cannot be calculated analytically. However, for large network size N it is possible to obtain a mean-field description of the variational equation (9) and introduce a dynamical mean-field variable y(t) analogous to x(t) in (2). Self-averaging suggests that the squared Euclidean norm appearing in the definition of the maximum Lyapunov exponent (10) can be approximated as

Σ_{i=1}^N y_i²(t) ≈ N K(t, t).

Here, K(t, s) = ⟨y(t)y(s)⟩ is the mean-field autocorrelation function, which obeys the linear partial differential equation (∂_t + 1)(∂_s + 1) K(t, s) = g² f_φ′(c(t − s), c_0) K(t, s). A separation ansatz in the coordinates τ = t − s and T = t + s then yields an eigenvalue problem in the form of a time-independent Schrödinger equation [1,22],

(−∂_τ² + W(τ)) ψ(τ) = E ψ(τ), (11)

where τ plays the role of a spatial coordinate. Here, the quantum potential W(τ) = −V″(c(τ)) = 1 − g² f_φ′(c(τ), c_0) is given by the negative second derivative of the classical potential V(c) evaluated along the self-consistent autocorrelation function c(τ). The ground-state energy E_0 of (11) determines the asymptotic growth rate of K(t, t) as t → ∞ and, hence, the maximum Lyapunov exponent via λ_max = −1 + √(1 − E_0). Therefore, the dynamics is predicted to become chaotic if E_0 < 0. The quantum potential is shown together with the ground-state energy and wave function in Figure 2; the latter are obtained as solutions of a finite-difference discretization of (11).
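A finite-difference discretization of (11) amounts to the smallest eigenvalue of a tridiagonal matrix. The sketch below uses illustrative names and is validated on the harmonic oscillator W(τ) = τ², whose ground-state energy is 1, rather than on the model's actual W(τ), which requires the self-consistent c(τ) as input.

```python
import numpy as np

# Hedged sketch: ground-state energy of -psi'' + W(tau) psi = E psi
# via second-order central differences on a uniform grid with
# Dirichlet boundaries. Names are illustrative.

def ground_state_energy(W_values, d_tau):
    n = len(W_values)
    H = (np.diag(2.0 / d_tau**2 + W_values)          # -d^2/dtau^2 diagonal + W
         + np.diag(-np.ones(n - 1) / d_tau**2, 1)    # off-diagonals
         + np.diag(-np.ones(n - 1) / d_tau**2, -1))
    return np.linalg.eigvalsh(H)[0]                  # smallest eigenvalue

# validation on a known case: -psi'' + tau^2 psi has ground-state energy 1
tau = np.linspace(-10.0, 10.0, 801)
E0 = ground_state_energy(tau**2, tau[1] - tau[0])
```

With the model's W(τ) on the grid, the same routine yields the E_0 entering λ_max = −1 + √(1 − E_0).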
In the noiseless case, a decaying autocorrelation function corresponds to a positive maximum Lyapunov exponent [1]. This follows from the observation that for g > 1 the derivative ċ(τ) of the self-consistent autocorrelation function solves the Schrödinger equation with E = 0. Because ċ(τ) is an eigenfunction with a single node, there exists a ground state with energy E_0 < 0. Hence, the dynamics is chaotic and λ_max crosses zero at g = 1 (Figure 3a).
In the presence of noise, the maximum Lyapunov exponent becomes positive at a critical coupling strength g_c > 1; depending on the noise intensity, the transition is shifted to larger values (Figure 3a). The mean-field prediction λ_max = −1 + √(1 − E_0) shows excellent agreement with the maximum Lyapunov exponent obtained in simulations using a standard algorithm [24]. Since the ground-state energy E_0 must be larger than the minimum W(0) = 1 − g² ⟨[φ′(x)]²⟩ of the quantum potential, an upper bound for λ_max is provided by −1 + g √⟨[φ′(x)]²⟩, leading to a necessary condition for chaotic dynamics. However, close to the transition λ_max is clearly smaller than the upper bound, which is a good approximation only for small g (Figure 3a, inset): the actual transition occurs at substantially larger coupling strengths. In contrast, for memoryless discrete-time dynamics the necessary condition found here is also sufficient for the transition to chaos [23, eq. 13].
Interestingly, ρ = g √⟨[φ′(x)]²⟩ is also the radius of the disk formed by the eigenvalues of the Jacobian matrix in (9), as estimated by random matrix theory [14,15]. Therefore, the dynamics is expected to become locally unstable if this radius exceeds unity, as shown in the inset of Figure 3b displaying ρ and the eigenvalues at an arbitrary point in time.
To derive an exact transition condition we determine a ground state with vanishing energy E_0 = 0. As in the noiseless case, ċ(τ) solves (11) for E = 0, except at τ = 0 where it exhibits a jump caused by the noise. However, due to linearity, |ċ(τ)| would be a continuous and symmetric solution with zero nodes. Therefore, if its derivative is continuous as well, requiring c̈(0+) = 0, it constitutes the sought-after ground state. This is in contrast to the noiseless case, where ċ(τ) corresponds to the first excited state. Consequently, with (4) we find the transition condition

c_0 = g² f_φ(c_0, c_0), (12)

in which c_0 is determined by the self-consistency condition (8), resulting in the transition curve (g_c, σ_c) in parameter space (Figure 3b). At the transition the classical self-consistent potential V(c; c_0) has a horizontal tangent at c_0, while in the chaotic regime a minimum emerges (Figure 1d). This implies that the curvature c̈(0+) of the autocorrelation function at zero changes sign from positive to negative (Figure 1f). Furthermore, according to (12) the system becomes chaotic precisely when the variance c_0 = ⟨x²⟩ of a typical neuron equals the variance g² ⟨φ²(x)⟩ of the recurrent input. Close to the transition, a standard perturbative approach shows that λ_max is proportional to g² ⟨φ²(x)⟩ − c_0, indicating a self-stabilizing effect: since both terms grow with g, the growth of their difference is attenuated, explaining why λ_max(g) bends down as the transition is approached (Figure 3a).
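Numerically, the transition curve follows by combining (12) with (8): for given g > 1, first solve c_0 = g² ⟨φ(x)²⟩ (x ~ N(0, c_0)) for c_0, then read off σ_c⁴ = c_0² − 2g² (⟨Φ(x)²⟩ − ⟨Φ(x)⟩²) from the potential (7). The sketch below assumes φ = tanh and uses illustrative names.

```python
import numpy as np

# Hedged sketch of the transition curve (g_c, sigma_c) from (12) and (8).
# For phi = tanh, Phi(x) = log cosh(x); Gaussian averages over x ~ N(0, c0)
# use probabilists' Gauss-Hermite quadrature. Names are illustrative.

_t, _w = np.polynomial.hermite_e.hermegauss(120)
_w = _w / np.sqrt(2.0 * np.pi)  # normalize so weights sum to 1

def gauss_avg(f, var):
    """<f(x)> for x ~ N(0, var) by quadrature."""
    return np.sum(_w * f(np.sqrt(var) * _t))

def transition_sigma(g):
    """For g > 1, the noise strength sigma_c placing (g, sigma_c) on the
    transition line: solve (12), c0 = g^2 <tanh(x)^2>, then use (8)."""
    lo, hi = 1e-9, g**2 + 1.0  # bracket: LHS > 0 at lo, < 0 at hi
    for _ in range(200):
        c0 = 0.5 * (lo + hi)
        if g**2 * gauss_avg(lambda x: np.tanh(x)**2, c0) > c0:
            lo = c0
        else:
            hi = c0
    Phi = lambda x: np.log(np.cosh(x))
    var_Phi = gauss_avg(lambda x: Phi(x)**2, c0) - gauss_avg(Phi, c0)**2
    return (c0**2 - 2.0 * g**2 * var_Phi) ** 0.25  # sigma_c^4 from (8)
```

Consistent with Figure 3b, σ_c increases with g along the transition line: larger noise shifts the onset of chaos to larger coupling.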
The condition (12) predicts the transition at significantly larger coupling strengths than the necessary condition g √⟨[φ′(x)]²⟩ > 1 (Figure 3b), which is explained as follows. For continuous-time dynamics the effect of noise is twofold: First, because φ′(x) is maximal at the origin, noise reduces the averaged squared slope g² ⟨[φ′(x)]²⟩, thereby stabilizing the dynamics. This is an essentially static effect, as it can be fully attributed to the increase of the instantaneous variance c_0 by the noise, which could similarly result from static heterogeneous inputs. Second, noise sharpens the autocorrelation function (Figure 1e,f) and hence the quantum potential (Figure 2). This shifts the ground-state energy to larger values, further decreasing the maximum Lyapunov exponent. Because this effect depends on the temporal correlations, noise suppresses chaos by a dynamic mechanism, yielding stable dynamics even in the presence of local linear instability.
To understand this dynamic mechanism we return to the variational equation (9): its fundamental solution can be regarded as a product of short-time propagator matrices, where each factor has the same stability properties and unstable directions as the local Jacobian matrix at the respective time. Even though the fraction of eigenvalues with positive real part stays approximately constant, the corresponding unstable directions vary in time. The sharpening of the autocorrelation function suggests that noise causes a faster variation such that perturbations asymptotically decay.
Finally, we consider the effect of noise on the asymptotic decay time τ_∞ = 1/√(1 − g² ⟨φ′(x)⟩²) of the autocorrelation function (Figure 3c). For weak noise, the decay time peaks at the transition, reflecting the diverging time scale in the noiseless case. For larger noise intensities, the peak is strongly reduced and the maximum decay time is attained above the transition. In networks of spiking neurons the correlation time does not peak at the point where the transition is predicted by the instability of the corresponding deterministic rate dynamics [17,18]. This could be attributed to the effect of spiking noise: even for the simple rate dynamics considered here, noise leads to the absence of a diverging time scale and shifts the transition to larger coupling strengths than those predicted by local linear stability analysis.
The theoretical analysis of stochastic and hence nonautonomous dynamics lays the foundation for a better understanding of the computational properties of random neural networks: the maximum Lyapunov exponent for nonautonomous dynamics indicates the echo state property, i.e., the reliability, of the network [31]. Since noise shapes the temporal correlations, we further expect that it significantly affects information processing capabilities such as short-term memory [8,32]. Thus our results facilitate the design and control of functional networks. More generally, the coexistence of local instability and hence expansive dynamics with stable long-term behavior could serve as a computationally powerful substrate. In conclusion, by presenting exact results for a prototypical and solvable model, this work contributes to the understanding of chaos in large, noisy and non-equilibrium systems.