Synaptic unreliability facilitates information transmission in balanced cortical populations

Leon A. Gatys,1,2,* Alexander S. Ecker,1,2,3,4 Tatjana Tchumatchenko,5 and Matthias Bethge1,2,3
1Werner Reichardt Centre for Integrative Neuroscience and Institute of Theoretical Physics, University of Tübingen, Germany
2Bernstein Center for Computational Neuroscience, Tübingen, Germany
3Max Planck Institute for Biological Cybernetics, Tübingen, Germany
4Department of Neuroscience, Baylor College of Medicine, Houston, Texas, USA
5Max Planck Institute for Brain Research, Frankfurt, Germany
(Received 13 February 2014; revised manuscript received 8 January 2015; published 11 June 2015)


I. INTRODUCTION
Synaptic transmission in cortex is remarkably unreliable. On average, most synapses respond to fewer than half of the presynaptic spikes [1], and when they do respond, the amplitude of the postsynaptic current varies [2]. This high degree of unreliability has been puzzling, as it impairs information transmission in excitatory neural networks, where a presynaptic signal is represented in the mean input to a postsynaptic neuron [3].
However, information transmission in the cortex is unlikely to be purely excitatory. In fact, the high degree of irregularity in cortical interspike-interval statistics is inconsistent with such an architecture [4-9]. In contrast, if excitatory and inhibitory inputs to a cortical neuron are balanced, the response variability of cortical neurons can be accounted for [10,11]. In such a balanced state [12-15], the mean input to a neuron (averaged over repeated trials) effectively vanishes, and information about the firing rate of the presynaptic population is encoded in the variance of the postsynaptic current [16]. An increase in presynaptic firing leads to a higher variance of the input current to the postsynaptic neuron. Moreover, the membrane potential of the postsynaptic neuron can be modelled as a random walk with steps drawn from the distribution of input currents [17]. As a consequence, threshold crossings of the membrane potential will be more frequent if the variance of the input current is higher. Hence, even though the mean input to the postsynaptic neuron is zero, the presynaptic firing rate still modulates the postsynaptic firing rate, and previous experiments have shown that this variance coding enables cortical neurons to track fast-varying input stimuli [16].
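The random-walk intuition above can be checked in a few lines of code. The sketch below is illustrative only, not the paper's simulation: a membrane potential accumulates zero-mean Gaussian steps (the balanced, mean-free input) and is reset at a fixed threshold; the step sizes, threshold, and step count are arbitrary values chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_crossings(step_std, n_steps=100_000, threshold=10.0):
    """Count resets of a random-walk membrane potential.

    The potential accumulates zero-mean Gaussian steps and is reset to
    zero after each threshold crossing, so the crossing count plays the
    role of the postsynaptic spike count.
    """
    v, crossings = 0.0, 0
    for step in rng.normal(0.0, step_std, n_steps):
        v += step
        if v >= threshold:
            crossings += 1
            v = 0.0
    return crossings

# The mean input is zero in both conditions; only the variance differs.
low = threshold_crossings(step_std=0.5)
high = threshold_crossings(step_std=2.0)
assert high > low  # higher input variance -> higher output firing rate
```

Even with zero mean drive, the walk with larger step variance reaches threshold more often, which is exactly the variance-coding mechanism described in the text.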
In this paper, we report a surprising finding that has so far gone unnoticed: in the balanced regime, synaptic unreliability generically improves information transmission between neural populations and enhances the ability of postsynaptic neurons to accurately track fast-varying signals in the presynaptic firing rate.

II. RESULTS

A. Simulation using a conductance-based neuron model shows improved information transmission for unreliable synapses
We consider a presynaptic population with balanced excitation and inhibition projecting onto a single postsynaptic neuron (Fig. 1A). The presynaptic population is driven by a common firing rate signal, which determines the instantaneous spike probability of each presynaptic neuron. The postsynaptic neuron is described by a conductance-based leaky integrate-and-fire neuron model that was previously shown to reproduce key properties of in vivo cortical activity [18,19]. The accumulated spike trains of the whole presynaptic population constitute the input to the postsynaptic neuron and drive the dynamics of the postsynaptic membrane potential. Every time the postsynaptic potential crosses a certain threshold, it is reset and the neuron fires a spike (see Appendix A). For
a given presynaptic rate signal, we sample many presynaptic input spike trains and simulate the corresponding postsynaptic output spike train. Averaging over all trials, we obtain a peri-stimulus time histogram (PSTH) for the postsynaptic neuron in response to the presynaptic rate signal. To evaluate information transmission, we compare how well the postsynaptic firing rate tracks the firing rate of the presynaptic population.
We distinguish between two conditions: reliable and unreliable synaptic transmission. For reliable synapses, the spike train of an individual presynaptic neuron follows a Bernoulli process with spike probability given by the presynaptic rate signal. Here, all postsynaptic responses to a presynaptic spike have the same amplitude, which means that a presynaptic spike is reliably transmitted over the synapse and always influences the postsynaptic conductance at a given postsynaptic potential in the same way. In contrast, in real neural circuits synaptic transmission is unreliable. Due to synaptic failures, not every spike changes the postsynaptic conductance. In addition, the amplitude of the effect on the postsynaptic
conductance can vary considerably, and this variability depends on a number of factors, such as the number and size of released neurotransmitter vesicles or the number of active receptors [2,20-22]. In the unreliable synaptic transmission condition, we summarise this variability by additionally drawing the amplitude of a presynaptic spike from a Poisson distribution, which mimics the probabilistic release of neurotransmitter quanta (vesicles). Thus the presynaptic spike train is a Bernoulli process where the amplitude of the successes is drawn from a Poisson distribution (Fig. 1A). For a principled comparison, the mean of the Poisson distribution is fixed to be one, such that the average amount of used neurotransmitter is the same in both conditions. Remarkably, we find that synaptic noise increases the accuracy with which the postsynaptic firing rate tracks the presynaptic rate signal (Fig. 2A,B). Adopting the stimulus from a previous experimental study [16], we simulated 10 s during which the presynaptic neurons were driven by a common firing rate signal, which switched randomly every millisecond to be either 0 or 10 spikes/s (Fig. 2A). The postsynaptic firing rate tracked the presynaptic firing rate signal substantially better if synaptic transmission was unreliable (correlation coefficient 0.89 vs. 0.74, mutual information 0.90 vs. 0.57 bits/ms) (Fig. 2A,B), while operating in a biologically realistic regime (Fig. 2C). Importantly, this effect holds for any fast-varying presynaptic rate signal chosen. For example, we find the same result when using a signal that is uniformly drawn every ms from between 0 and 10 spikes/s (correlation coefficient 0.82 vs.
0.67). For slowly-varying stimuli the effect size decreases, but it is still qualitatively apparent (e.g. binary and uniform signals as above, but with random switches only every 20 ms: binary: correlation coefficient 0.95 vs. 0.93, mutual information 0.92 vs. 0.88 bits/ms; uniform: correlation coefficient 0.94 vs. 0.92). In summary, synaptic unreliability appears to generically improve signal transmission in balanced cortical populations and, in particular, to facilitate the tracking of fast-varying input signals.
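The two transmission conditions are easy to state generatively. The sketch below is an illustration, not the paper's simulation code: per-bin synaptic currents are drawn as a Bernoulli process, optionally with mean-one Poisson amplitudes, and we check that the mean drive is matched between conditions while the unreliable condition carries extra variance.

```python
import numpy as np

rng = np.random.default_rng(1)

def presynaptic_currents(rate_hz, n_samples, dt=1e-3, unreliable=False):
    """Per-bin postsynaptic current amplitudes j_n for one synapse.

    Reliable synapses: Bernoulli spikes of unit amplitude.
    Unreliable synapses: each success additionally carries a Poisson
    amplitude with mean one, mimicking quantal vesicle release.
    """
    p = rate_hz * dt                    # spike probability per time bin
    spikes = rng.random(n_samples) < p  # Bernoulli process
    if not unreliable:
        return spikes.astype(float)
    return spikes * rng.poisson(1.0, n_samples)

# Pool many neurons/time bins at 10 spikes/s for the statistics.
rel = presynaptic_currents(10.0, 2_000_000)
unrel = presynaptic_currents(10.0, 2_000_000, unreliable=True)

# The mean synaptic drive (average transmitter use) is matched ...
assert abs(rel.mean() - unrel.mean()) < 1e-3
# ... but the unreliable synapse adds spike-triggered variance.
assert unrel.var() > 1.5 * rel.var()
```

The equal means implement the "same average amount of used neurotransmitter" constraint; the extra variance is what the next section shows to be beneficial.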

B. Synaptic noise improves variance coding in the balanced state
To understand this surprising enhancement of information transmission analytically, we simplify the above analysis. Instead of explicitly modelling the temporal evolution of the pre- and postsynaptic neurons, we consider instantaneous information transmission, which corresponds to the case of a presynaptic signal that changes rapidly in time. Furthermore, we do not assume a particular spiking mechanism for the postsynaptic neuron but calculate information transmission between the presynaptic firing rate and the postsynaptic current, instead of comparing pre- and postsynaptic firing rates (Fig. 3A).
We use the simplest model of a balanced input: the presynaptic population consists of N independent and identical neurons, half of which are excitatory and half of which are inhibitory, all forming equally strong synapses with the postsynaptic neuron (Fig. 3A). Thus, all presynaptic action potentials cause postsynaptic currents of equal mean amplitude, which we assume to be one for simplicity. We consider a small time interval dt such that presynaptic neural responses are binary and occur with probability R. Then the instantaneous postsynaptic current, j_n, caused by an individual presynaptic neuron is either Bernoulli distributed for reliable synapses, or Bernoulli distributed with the amplitude of a success additionally drawn from a mean-one distribution, P_s(j_n), with variance σ_s^2 in the case of unreliable synapses (Fig. 1B). For simplicity, we further assume that the total postsynaptic current J can be described as the sum of the individual excitatory and inhibitory inputs j_n, corrupted by a small amount of zero-mean Gaussian noise with variance σ^2. Consider, for example, a binary presynaptic rate signal that is either off (R/dt = 0 spikes/s) or on (R/dt = 1 spikes/s), as shown in Fig. 3A. If the presynaptic signal is off, none of the presynaptic neurons spike and the postsynaptic current is drawn only from the small Gaussian noise distribution (Fig. 3A, red distribution of postsynaptic current). But if the presynaptic signal is 1 spikes/s, each presynaptic neuron causes a postsynaptic current, j_n, with small probability. As the individual currents are independent, the total postsynaptic current is approximately Gaussian distributed, and since the presynaptic population is balanced, the mean postsynaptic current is zero. Therefore, the information about the presynaptic firing rate is encoded in the variance of the postsynaptic current, and the postsynaptic current is drawn from a wider distribution (Fig. 3A, blue distribution of postsynaptic current).
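The on/off picture can be reproduced numerically. A minimal sketch, with illustrative parameter values (N, R, σ are ours, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative numbers: N identical neurons, half excitatory and half
# inhibitory, unit-mean synapses, plus small additive Gaussian noise.
N, R, sigma, trials = 500, 0.05, 0.5, 20_000
signs = np.concatenate([np.ones(N // 2), -np.ones(N // 2)])  # E/I balance

def total_current(rate, unreliable):
    """Total postsynaptic current J for `trials` independent time bins."""
    spikes = rng.random((trials, N)) < rate          # Bernoulli spikes
    amp = rng.poisson(1.0, (trials, N)) if unreliable else 1.0
    noise = rng.normal(0.0, sigma, trials)           # additive channel noise
    return (spikes * amp) @ signs + noise

J_off  = total_current(0.0, unreliable=False)   # presynaptic signal off
J_on   = total_current(R,   unreliable=False)   # on, reliable synapses
J_on_u = total_current(R,   unreliable=True)    # on, unreliable synapses

assert abs(J_on.mean()) < 0.2      # balance: the conditional mean is ~zero
assert J_on.var() > J_off.var()    # the rate is encoded in the variance
assert J_on_u.var() > J_on.var()   # synaptic noise amplifies the modulation
```

The three assertions are exactly the three claims of this section: zero conditional mean, variance coding of the rate, and amplification of the variance modulation by unreliable synapses.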
As we explain in the following, the important effect of synaptic noise is to amplify the modulation of the variance of the postsynaptic current by the presynaptic firing rate and therefore to increase the information about the presynaptic signal carried by the input to the postsynaptic neuron.
To illustrate this, we calculate the variance of the postsynaptic current given a presynaptic firing rate. With reliable synaptic transmission, the current j_n from each presynaptic neuron is simply a Bernoulli random variable. As all presynaptic neurons are independent for a given firing rate, the variance of the total postsynaptic current is just the sum of the variances of the individual currents plus the variance of the independent Gaussian noise.
Hence, for reliable synapses the distribution of the postsynaptic current is either mean-zero Gaussian with variance σ^2, if the presynaptic signal is off (red distribution, Fig. 3B), or mean-zero Gaussian with variance N R(1 − R) + σ^2, if the presynaptic signal is on (blue distribution, Fig. 3B). With unreliable synapses, the current caused by a single presynaptic neuron arises from a sequential process. First, there is again the Bernoulli decision whether the neuron spikes or not. Second, if the neuron spikes, the amplitude of the postsynaptic current is drawn from another probability distribution P_s(j_n) with mean one and variance σ_s^2 (Fig. 1B). Conditioning on the Bernoulli decision, we can use the law of total variance to calculate the variance of the postsynaptic current for a given presynaptic firing rate R (see Appendix B).
So apart from the Bernoulli and additive Gaussian variance, there is an additional term caused by the synaptic

FIG. 3. (Color online)
A, Simplified population model. In a small time bin, the presynaptic firing rate signal is drawn from a binary distribution and drives inhibitory and excitatory inputs to a postsynaptic neuron. The total postsynaptic current is the sum over all presynaptic inputs, corrupted by small Gaussian noise. For a given presynaptic firing rate it is approximately mean-zero Gaussian, with a variance depending on the presynaptic firing rate. Here the red (dark grey) and blue (light grey) distributions of the postsynaptic current indicate the conditional distribution of the postsynaptic current given that the presynaptic signal was on (blue) or off (red). B, Distributions of the postsynaptic current for a given presynaptic firing rate for reliable synapses. If the presynaptic signal is 0 spikes/s, the postsynaptic current is drawn only from the small Gaussian noise distribution (red distribution). If the presynaptic signal is 1 spikes/s, all presynaptic neurons fire with small probability and increase the variance of the postsynaptic current (blue distribution). C, Unreliable synapses amplify the modulation of the postsynaptic variance by the presynaptic firing rate. The red distribution is unchanged, but the variance of the blue distribution is increased. Therefore the ratio between the variances of the postsynaptic current given the two different presynaptic firing rates is increased. D, The mutual information between the binary presynaptic firing rate R and the postsynaptic current J is an increasing function of the ratio between the conditional variances of the postsynaptic current. Hence the amplification of the variance modulation leads to an increase in mutual information between the presynaptic rate signal and the total postsynaptic current.
noise that scales with the presynaptic activity R and thus amplifies the modulation of the postsynaptic current variance by the presynaptic activity. While the postsynaptic variance is still σ^2 if the presynaptic signal is off (red distribution, Fig. 3C), it is considerably larger if the presynaptic signal is on (compare the blue distributions in Figs. 3B and 3C).
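For concreteness, the law-of-total-variance calculation referenced above can be written out (spike indicator Bernoulli-distributed with probability R; success amplitude with mean one and variance σ_s^2); setting σ_s^2 = 0 recovers the reliable-synapse variance quoted earlier:

```latex
\begin{aligned}
\operatorname{Var}[j_n \mid R]
  &= \mathbb{E}\bigl[\operatorname{Var}[j_n \mid \text{spike}]\bigr]
   + \operatorname{Var}\bigl[\mathbb{E}[j_n \mid \text{spike}]\bigr]
   = R\,\sigma_s^2 + R(1-R), \\[4pt]
\operatorname{Var}[J \mid R]
  &= N R(1-R) + N R\,\sigma_s^2 + \sigma^2 .
\end{aligned}
```

The term N R σ_s^2 is the additional, spike-triggered contribution: it grows with R and is therefore modulated by the presynaptic signal.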
As the conditional postsynaptic currents are approximately Gaussian distributed, the mutual information I[R : J] between a binary presynaptic signal R with possible states R_1 and R_2 (R_2 > R_1, without loss of generality) and the postsynaptic current J is an increasing function of the ratio between the variances of the conditional postsynaptic currents, Var[J|R_2]/Var[J|R_1] (Fig. 3D). While it is trivial to see that this ratio is increased by unreliable synapses for an on/off binary signal with R_1 = 0, we can show that unreliable synapses in fact increase this ratio for arbitrary binary signals (see Appendix C). Therefore, unreliable synapses generically improve the encoding of any two presynaptic firing rate states in the input current to a postsynaptic neuron.
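The monotonic dependence of I[R : J] on the variance ratio can be verified numerically under the Gaussian approximation. The sketch below is our own numeric-integration implementation with illustrative variances, not the paper's code:

```python
import numpy as np

def mutual_information(var1, var2, p=0.5):
    """I[R:J] in bits for binary R with J | R ~ N(0, var_R).

    Computed by numerically integrating the mixture entropy H[J] and
    subtracting the closed-form conditional Gaussian entropies.
    """
    x = np.linspace(-40.0, 40.0, 80_001)
    dx = x[1] - x[0]

    def gauss(v):
        return np.exp(-x ** 2 / (2 * v)) / np.sqrt(2 * np.pi * v)

    mix = p * gauss(var1) + (1 - p) * gauss(var2)        # marginal P(J)
    h_mix = -np.sum(mix * np.log2(mix + 1e-300)) * dx    # H[J]
    h_cond = (p * 0.5 * np.log2(2 * np.pi * np.e * var1)
              + (1 - p) * 0.5 * np.log2(2 * np.pi * np.e * var2))
    return h_mix - h_cond

mi_small = mutual_information(1.0, 4.0)   # variance ratio 4
mi_large = mutual_information(1.0, 9.0)   # variance ratio 9
assert mi_large > mi_small   # a larger variance ratio carries more information
# The information depends only on the ratio, not on the absolute variances:
assert abs(mutual_information(1.0, 4.0) - mutual_information(2.0, 8.0)) < 1e-3
```

The scale-invariance check makes explicit why the variance *ratio* is the relevant quantity in Fig. 3D.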

C. Synaptic noise increases the channel capacity between presynaptic activity and postsynaptic current
To show that the improvement in information transmission is a general property of balanced input populations, independent of the particular choice of presynaptic signal, we need to compare the channel capacity for reliable and unreliable synaptic transmission. The channel capacity is the amount of information that can be transmitted using the optimal presynaptic signal and is defined as the maximal mutual information between the channel input and the channel output.
Since we choose the distribution of the presynaptic signal P(R), and the distribution of the conditional postsynaptic current P(J|R) is known, the mutual information I[R : J] can be computed and optimised over presynaptic signal distributions P(R). Because optimisation over all possible signal distributions is computationally intractable, we constrain the optimisation to all discrete signal distributions with probability mass on up to five different states in the interval [0, 0.1] (corresponding to presynaptic firing rates between 0 and 100 spikes/s). We optimise the mutual information using gradient descent jointly over the five parameters R_1, R_2, ..., R_5 for the available signal states (Fig. 4A) and the four parameters P(R_1), ..., P(R_4) (Fig. 4B) determining the probability mass on the five states (see Appendix D).
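A sketch of this constrained optimisation is given below. The parameters (N, σ_s^2, σ^2) are illustrative placeholders, and we use a derivative-free Nelder-Mead search rather than the paper's gradient descent; the result is therefore only a lower bound on the capacity under these assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative parameters (ours, not the paper's). Conditional currents are
# zero-mean Gaussian with Var[J|R] = N(R(1-R) + R*sigma_s2) + sigma2.
N, sigma_s2, sigma2, K = 5000, 1.0, 1.0, 5

def cond_var(R):
    return N * (R * (1.0 - R) + R * sigma_s2) + sigma2

def mutual_information(rates, probs):
    """I[R:J] in bits for a discrete rate signal, by numeric integration."""
    x = np.linspace(-400.0, 400.0, 20_001)
    dx = x[1] - x[0]
    v = cond_var(np.asarray(rates))
    cond = (np.exp(-x[None, :] ** 2 / (2 * v[:, None]))
            / np.sqrt(2 * np.pi * v[:, None]))
    mix = probs @ cond                                      # marginal P(J)
    h_mix = -np.sum(mix * np.log2(mix + 1e-300)) * dx       # H[J]
    h_cond = probs @ (0.5 * np.log2(2 * np.pi * np.e * v))  # H[J|R]
    return h_mix - h_cond

def neg_mi(theta):
    rates = np.clip(theta[:K], 0.0, 0.1)           # five states in [0, 0.1]
    logits = np.append(theta[K:], 0.0)
    probs = np.exp(logits) / np.exp(logits).sum()  # four free prob. parameters
    return -mutual_information(rates, probs)

theta0 = np.concatenate([np.linspace(0.0, 0.1, K), np.zeros(K - 1)])
res = minimize(neg_mi, theta0, method="Nelder-Mead", options={"maxiter": 500})
capacity = -res.fun   # lower bound on the channel capacity (bits)
assert 0.0 < capacity <= np.log2(K)
```

The softmax over four logits mirrors the paper's parameterisation of the probability mass on five states; the capacity can never exceed log2(5) bits, the entropy of a five-state input.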
We found that such a balanced population channel favours low-entropy signals and that the complexity of the optimal input distribution increases with population size (Fig. 4A,B). However, this increase is modest: for populations of 5,000 neurons the optimal signal distribution explores only three different states, and even for 50,000 neurons not all five available states are explored (Fig. 4C). Based on this observation we conclude that it is unlikely that signals with higher entropy would improve the channel capacity substantially. Similar conclusions were reached by previous studies on neural information channels [23-25]. This conclusion holds for both unreliable and reliable synapses (the results in Fig. 4A,B and C were obtained with unreliable synapses, but the results do not differ qualitatively for reliable synapses).
Moreover, we find that unreliable synapses lead to a small but consistent increase in the channel capacity

FIG. 4. (Color online)
A, The optimal firing rates R_1, R_2, R_3, R_4 as a function of presynaptic population size. Signal complexity increases with population size: for small populations binary signals maximise information transmission, and for very large populations a quaternary signal achieves channel capacity. Note that R_5 is not explicitly shown because, up to 50,000 presynaptic neurons, no quinary signal was optimal. B, The optimal probability masses P(R_1), P(R_2), P(R_3), P(R_4) on the firing rate states shown in A, as a function of presynaptic population size. The colour shows the correspondence between R_i and P(R_i). C, The optimal signal distributions for 5,000 (i) and 50,000 (ii) presynaptic neurons. The results in A, B and C were obtained with unreliable synapses; results for reliable synapses do not differ qualitatively. D, Channel capacity as a function of presynaptic population size. Unreliable synapses robustly improve information transmission for all population sizes. E, The improvement is explained by the increase in the maximum variance of the postsynaptic current due to unreliable synaptic transmission. The channel capacity as a function of maximum postsynaptic variance is almost equal for unreliable and reliable synapses (deviations for small population sizes are likely due to a breakdown of the Gaussian assumption for the conditional postsynaptic currents, such that the postsynaptic variance does not capture all information about the presynaptic activity).
compared to the reliable case for all population sizes (Fig. 4D), demonstrating that our earlier results are indeed general and do not depend on a particular choice of input signal favouring synaptic unreliability. This increase in channel capacity is fully explained by the described amplification of the modulation of the postsynaptic current variance by the presynaptic firing rate (Fig. 3). If we match the maximum variance of the postsynaptic current by allowing more presynaptic neurons for reliable synaptic transmission, there is almost no difference in channel capacity between unreliable and reliable synapses (Fig. 4E). Hence synaptic unreliability serves as a general mechanism to facilitate information transmission in balanced cortical populations.

III. DISCUSSION
In conclusion, in a balanced code synaptic noise does not impair information transmission. Instead, it actually increases the amount of transmitted information: the firing rate of a postsynaptic neuron tracks a presynaptic signal more accurately if synaptic transmission is unreliable. Intriguingly, this effect can be explained by a very simple model of neural communication. It is the variance coding of a balanced population that implies a surprising feature: extra variability can increase the encoding performance of a balanced neural population, as long as this variability is modulated by the presynaptic population activity. As a striking example of this mechanism, we have shown that the experimentally observed unreliability of synaptic transmission can act as such beneficial noise.
While the mechanisms underlying synaptic noise in the cortex are still debated, the detailed biological mechanism is irrelevant for our analysis. The mere fact that a single presynaptic spike leads to a postsynaptic current of variable amplitude results in a higher information rate if excitation and inhibition are balanced. This effect is opposite to the case of purely excitatory input populations, in which reliable synaptic transmission is optimal [3]. As information transmission is typically considered in the context of an additive noise channel, where changes in the mean activity carry the signal, the unreliability of synaptic transmission has puzzled many researchers. If, however, cortical circuits indeed use a balanced code for information transmission, this unreliability is not as surprising. On the contrary, the code would actually be somewhat improved by unreliable synapses, suggesting that there may not have been any evolutionary pressure on cortical neurons to make synaptic communication more reliable.
Importantly, our theoretical result is open to experimental verification: one could selectively stimulate the presynaptic inputs to a postsynaptic neuron (e.g. by retrograde expression of light-gated ion channels such as ChR2 using adeno-associated viral vectors [26]) while recording the postsynaptic membrane potential intracellularly. We predict that increasing the variability of the postsynaptic current evoked by a presynaptic spike (e.g. by manipulating vesicle release through changing the extracellular Mg2+ concentration [22,27]) should improve the ability to decode the presynaptic signal from the postsynaptic membrane potential.
More broadly, in this paper we identified a parsimonious mechanism that is responsible for a beneficial role of noise in signal transmission. Due to its simplicity, it may be relevant for any system with inherent noise that is tuned to keep the mean output at a constant level. In contrast to stochastic resonance [28,29], the presented mechanism does not rely on a threshold nonlinearity and therefore does not reach an optimum at a finite noise level. In fact, in the balanced regime, any source of spike-triggered variability improves information transmission. In other words, for a balanced population channel, we can simply say: the noisier, the better.
Appendix A: Simulation of information transmission with a conductance-based leaky integrate-and-fire neuron.
We model a postsynaptic neuron that receives input from 4,000 excitatory and 1,000 inhibitory neurons. The presynaptic population is driven by a common firing rate signal that determines the instantaneous spike probability of each presynaptic neuron. In all simulations we assume a time discretisation of dt = 1 ms, i.e. if the presynaptic rate signal is 10 spikes/s, every presynaptic neuron spikes with probability 0.01 in each time bin. The postsynaptic neuron is described by a conductance-based leaky integrate-and-fire neuron model, adapted from [18]. In this model the membrane potential of the neuron evolves as

dV/dt = −(V − V_L)/τ_L − a_e J_e(t)(V − V_e) − a_i J_i(t)(V − V_i),

where τ_L is the leak time constant and V_L, V_e and V_i are the resting, excitatory and inhibitory reversal potentials, respectively. The excitatory and inhibitory inputs J_e(t), J_i(t) are the sums over the individual spike trains, j_n(t), of the 4,000 excitatory and 1,000 inhibitory presynaptic neurons, respectively: J_e(t) = Σ_excit. j_n(t), J_i(t) = Σ_inhib. j_n(t). The parameters a_e, a_i are dimensionless quantities that represent the integrated conductances over the time course of the synaptic event divided by the neural capacitance. They are chosen such that total excitation and inhibition are approximately equally strong. Small additive noise is added directly to the membrane potential to account for any variability independent of the signal. When the membrane potential crosses a threshold V_th, it is reset to the reset potential V_r, where it remains for a refractory period τ_r. All parameters are chosen to match the analysis described in [18]; only the voltage threshold was slightly varied to match the output firing rates of the postsynaptic neuron for unreliable (V_th = −55.95 mV) and reliable (V_th = −55.10 mV) synapses. The membrane potential noise was drawn from a zero-mean normal distribution with standard deviation 2 mV.
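A minimal sketch of such a simulation loop is given below. All numerical values are placeholders in the spirit of the text, not the paper's exact parameters, and the per-bin Euler update with instantaneous conductance jumps is our simplification:

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder parameters (illustrative, not the paper's exact values).
dt, tau_L = 1e-3, 20e-3                  # 1 ms time bins, leak time constant
V_L, V_e, V_i = -70.0, 0.0, -80.0        # resting / reversal potentials (mV)
V_th, V_r, tau_r = -55.0, -70.0, 2e-3    # threshold, reset, refractory period
a_e, a_i = 0.001, 0.012                  # so E and I roughly cancel near -60 mV
N_e, N_i = 4000, 1000

def simulate(rate_signal, unreliable=False):
    """One trial: postsynaptic spike times (bin indices) for a rate signal."""
    V, refrac, spikes = V_L, 0.0, []
    for t, rate in enumerate(rate_signal):
        p = rate * dt
        J_e = rng.binomial(N_e, p)       # summed excitatory spike count
        J_i = rng.binomial(N_i, p)       # summed inhibitory spike count
        if unreliable:                   # mean-one Poisson quantal amplitudes
            J_e = rng.poisson(J_e)
            J_i = rng.poisson(J_i)
        if refrac > 0.0:
            refrac -= dt
            continue
        V += (-(V - V_L) * dt / tau_L    # leak toward rest
              + a_e * J_e * (V_e - V)    # excitatory conductance jump
              + a_i * J_i * (V_i - V)    # inhibitory conductance jump
              + rng.normal(0.0, 2.0))    # 2 mV membrane-potential noise
        if V >= V_th:
            spikes.append(t)
            V, refrac = V_r, tau_r
    return spikes

# Example: 5 s of a constant 10 spikes/s presynaptic drive.
spikes = simulate([10.0] * 5000, unreliable=True)
```

Averaging such spike trains over many trials of the same rate signal yields the PSTH used in the main text.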
To compute the postsynaptic PSTH we average over the postsynaptic spike trains in response to 10,000 samples of presynaptic input spike trains for a given presynaptic rate signal.
For the binary signal we computed the mutual information as I[X : Y] = H[X] − H[X|Y], where H[X] is the entropy of the postsynaptic firing rate and H[X|Y] is the conditional entropy of the postsynaptic firing rate given the presynaptic rate signal. We estimate P(X) and P(X|Y) non-parametrically by computing the histograms of the marginal and conditional PSTH responses using a sufficiently small bin size (conditional histograms for reliable and unreliable synapses are shown in Fig. 2B; the marginal histogram is then just the sum of the two conditional histograms with a weighting factor of 1/2). For the uniform presynaptic signal we did not attempt to compute mutual information, due to possible problems with limited-sampling bias for the joint probability distribution of two continuous random variables.
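The plug-in estimator described here can be sketched as follows. This is our own implementation, and the toy signal at the bottom is an illustrative check with a known answer, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(4)

def mutual_information_bits(x, y, bins=30):
    """Plug-in estimate of I[X:Y] = H[X] - H[X|Y] for a binary signal y.

    x: continuous response (e.g. the PSTH value in each time bin),
    y: the 0/1 presynaptic rate signal in the same bins.
    """
    edges = np.histogram_bin_edges(x, bins=bins)

    def entropy(samples):
        counts, _ = np.histogram(samples, bins=edges)
        p = counts / counts.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    h_x = entropy(x)                                        # H[X]
    h_x_given_y = sum((y == v).mean() * entropy(x[y == v])  # H[X|Y]
                      for v in (0, 1))
    return h_x - h_x_given_y

# Toy check: a nearly noiseless binary channel should give ~1 bit.
y = rng.integers(0, 2, 100_000)
x = y + rng.normal(0.0, 0.05, y.size)
mi = mutual_information_bits(x, y)
assert 0.9 < mi <= 1.0
```

The estimate is bounded by the empirical entropy of the binary signal (at most 1 bit), which is why the bin size must be small enough to separate the conditional response distributions but not so small that the limited-sampling bias mentioned above dominates.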

FIG. 1. (Color online) A, Population model. A common firing rate drives inhibitory and excitatory inputs to a postsynaptic neuron. The postsynaptic neuron is modelled as a conductance-based leaky integrate-and-fire neuron (LIF). The PSTH is estimated from the postsynaptic spike train, averaged over repeated trials of the same presynaptic rate signal. B, Unreliable synaptic transmission. The amplitude of a postsynaptic current in response to a presynaptic spike, j_n, is a random process with mean one and variance σ_s^2. If no presynaptic spike is present, the postsynaptic current is zero. In the simulations we model the amplitude given a presynaptic spike as Poisson distributed, but this choice is of no relevance for the conclusions.