Scientific Understanding of Consciousness
Consciousness as an Emergent Property of Thalamocortical Activity

Network Inhibitory Plasticity

Science, 16 December 2011: Vol. 334, No. 6062, pp. 1569–1573

Inhibitory Plasticity Balances Excitation and Inhibition in Sensory Pathways and Memory Networks

T. P. Vogels, H. Sprekeler, F. Zenke, C. Clopath, W. Gerstner

1. School of Computer and Communication Sciences and Brain-Mind Institute, École Polytechnique Fédérale de Lausanne, 1015 Lausanne EPFL, Switzerland.

2. CNRS, UMR 8119, Université Paris Descartes, 45 Rue des Saints Pères, 75270 Paris Cedex 06, France.

[paraphrase]

Cortical neurons receive balanced excitatory and inhibitory synaptic currents. Such a balance could be established and maintained in an experience-dependent manner by synaptic plasticity at inhibitory synapses. We show that this mechanism provides an explanation for the sparse firing patterns observed in response to natural stimuli and fits well with a recently observed interaction of excitatory and inhibitory receptive field plasticity. The introduction of inhibitory plasticity in suitable recurrent networks provides a homeostatic mechanism that leads to asynchronous irregular network states. Further, it can accommodate synaptic memories with activity patterns that become indiscernible from the background state but can be reactivated by external stimuli. Our results suggest an essential role of inhibitory plasticity in the formation and maintenance of functional cortical circuitry.

The balance of excitatory and inhibitory membrane currents that a neuron experiences during stimulated and ongoing activity has been the topic of many studies. This balance, first defined as equal average amounts of de- and hyperpolarizing membrane currents (from here on referred to as “global balance”), is essential for maintaining stability of cortical networks. Balanced networks display asynchronous irregular (AI) dynamics that mimic activity patterns observed in cortical neurons. Such asynchronous network states facilitate rapid responses to small changes in the input, providing an ideal substrate for cortical signal processing.

Moreover, input currents to cortical neurons are not merely globally balanced but also coupled in time and cotuned for different stimulus features. The tight coupling of excitation and inhibition suggests a more precise, detailed balance, in which each excitatory input arrives at the cell together with an inhibitory counterpart, permitting sensory inputs to be transiently or persistently turned on by targeted disruptions of the balance.

Although the excitatory-inhibitory balance plays an important role in the stability and information processing of cortical networks, it is not understood by which mechanisms this balance is established and maintained during ongoing sensory experience. Inspired by recent experimental results, we investigated the hypothesis that synaptic plasticity at inhibitory synapses plays a central role in balancing the excitatory and inhibitory inputs a cell receives.

We simulated a single postsynaptic integrate-and-fire neuron receiving correlated excitatory and inhibitory input signals. The cell received input through 1000 synapses, divided into eight independent groups of 100 excitatory and 25 inhibitory synapses each. All excitatory and inhibitory synapses within a group followed the same temporally modulated rate signal (time constant τ ∼ 50 ms) to mimic ongoing sensory activity. Spikes were generated from independent Poisson processes, yielding 125 different spike trains per signal. This architecture allowed each signal to reach the cell simultaneously through both excitatory and inhibitory synapses. To mimic glutamatergic and γ-aminobutyric acid (GABAergic) transmission, the synapses were conductance-based, with reversal potentials VE = 0 mV and VI = −80 mV and time constants τE = 5 ms and τI = 10 ms for excitation and inhibition, respectively. The inhibitory synapses were initially weak but could change according to a spike-timing–dependent plasticity rule, in which near-coincident pre- and postsynaptic spikes induce potentiation of the synapse, while every presynaptic spike additionally leads to synaptic depression. This learning rule can be summarized as

Δw = η(pre × post – ρ0 × pre)

where:

Δw denotes the change in the synaptic efficacy w,

pre and post are the pre- and postsynaptic activity,

η is the learning rate, and

ρ0 is a constant that acts as a target rate for the postsynaptic neuron.
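
A simplified, rate-based reading of the rule (the paper's own analysis is spike-based) shows why ρ0 acts as a target rate. If pre and post are treated as stationary firing rates, the expected weight change is

⟨Δw⟩ = η × pre × (post − ρ0),

which is positive whenever the postsynaptic rate exceeds ρ0 (inhibition grows and suppresses firing) and negative whenever it falls below ρ0 (inhibition decays and firing recovers). On average, the inhibitory weights therefore stop changing only when the postsynaptic rate settles at ρ0, independently of the presynaptic rates.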

 

Whereas inhibitory synapses were plastic, the efficacies of the excitatory model synapses were fixed at the beginning of a simulation and left unchanged unless otherwise noted. Analogous to frequency- or orientation-tuned sensory neurons, excitatory synapses were tuned to have a preferred signal. Because all excitatory synapses were set to nonzero strengths, the postsynaptic neuron fired at high rates when the inhibitory synapses were weak at the beginning of a simulation. The resulting high number of pairs of pre- and postsynaptic spikes led to relatively indiscriminate strengthening of all inhibitory synapses until excitatory and inhibitory membrane currents became approximately balanced and the postsynaptic firing rate was dramatically reduced. In this globally balanced state, only unbalanced excitatory signals led to coincident pairs of pre- and postsynaptic spikes, consequently strengthening underpowered inhibitory synapses. Inhibitory synapses that were stronger than their excitatory counterparts kept the postsynaptic neuron unresponsive and were therefore weakened (by presynaptic firing alone) until they allowed postsynaptic spiking again. Over time, this led to a precise, detailed balance of excitatory and inhibitory synaptic weights for each channel. In agreement with the mathematical analysis, the postsynaptic firing rate was determined mainly by the depression factor ρ0, not by the average input firing rate to the postsynaptic neuron. The mechanism was robust to plausible delays of several milliseconds. However, because detailed balance requires a correlation between excitatory and inhibitory synaptic inputs, the balance deteriorated when the delay between excitation and inhibition increased beyond the autocorrelation time of the input signals and the coincidence time of the Hebbian learning rule, although global balance still persisted.
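
The minimal sketch below (Python/NumPy; not the authors' code) illustrates the setup just described: a conductance-based integrate-and-fire neuron driven by eight signal groups of 100 excitatory and 25 inhibitory Poisson afferents, with fixed, tuned excitatory weights and inhibitory weights updated by a trace-based realization of the learning rule. The reversal potentials, synaptic time constants, group sizes, and the form of the rule come from the text; the membrane constants, learning rate, trace time constant, input rate statistics, and weight scales are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- network layout: 8 signal groups of 100 excitatory + 25 inhibitory afferents ---
n_groups, n_exc_per, n_inh_per = 8, 100, 25
n_exc, n_inh = n_groups * n_exc_per, n_groups * n_inh_per
exc_group = np.repeat(np.arange(n_groups), n_exc_per)   # group index of each synapse
inh_group = np.repeat(np.arange(n_groups), n_inh_per)

# --- integration and membrane parameters (resting/threshold values are assumptions) ---
dt, T = 1e-4, 60.0                       # time step and simulated duration (s)
V_rest, V_thresh, V_reset = -60e-3, -50e-3, -60e-3
tau_m, g_leak = 20e-3, 10e-9             # membrane time constant (s), leak conductance (S)

# --- synapse parameters from the text ---
V_E, V_I = 0.0, -80e-3                   # reversal potentials (V)
tau_E, tau_I = 5e-3, 10e-3               # conductance decay time constants (s)

# --- inhibitory plasticity: trace-based reading of dw = eta*(pre*post - rho0*pre) ---
eta, tau_stdp, rho0 = 1e-3, 20e-3, 5.0   # learning rate, coincidence window (s), target rate (Hz)
alpha = 2 * rho0 * tau_stdp              # depression bias applied at every presynaptic spike
w_unit = 1e-9                            # conductance per unit weight (S); assumption

# --- input signals: one temporally modulated rate per group, tau ~ 50 ms ---
tau_sig, rate_mean, rate_sd = 50e-3, 10.0, 5.0
group_rates = np.full(n_groups, rate_mean)

# --- weights: fixed, tuned excitation; weak, plastic inhibition ---
tuning = 0.5 + 1.5 * np.exp(-0.5 * ((np.arange(n_groups) - 3.5) / 1.5) ** 2)
w_exc = np.repeat(tuning, n_exc_per) * w_unit   # nonzero everywhere, peaked at the preferred signal
w_inh = np.full(n_inh, 0.01) * w_unit           # initially weak

V, g_E, g_I = V_rest, 0.0, 0.0
x_pre, x_post = np.zeros(n_inh), 0.0            # synaptic traces used by the plasticity rule

for step in range(int(T / dt)):
    # Ornstein-Uhlenbeck-like modulation of each group's firing rate
    group_rates += (rate_mean - group_rates) * dt / tau_sig \
                   + rate_sd * np.sqrt(2 * dt / tau_sig) * rng.standard_normal(n_groups)
    group_rates = np.clip(group_rates, 0.0, None)

    # excitatory and inhibitory afferents of a group share that group's rate (Poisson spikes)
    exc_spk = rng.random(n_exc) < group_rates[exc_group] * dt
    inh_spk = rng.random(n_inh) < group_rates[inh_group] * dt

    # conductance-based synaptic input with exponential decay
    g_E += w_exc[exc_spk].sum() - dt * g_E / tau_E
    g_I += w_inh[inh_spk].sum() - dt * g_I / tau_I

    # leaky integrate-and-fire membrane
    V += dt * (g_leak * (V_rest - V) + g_E * (V_E - V) + g_I * (V_I - V)) / (g_leak * tau_m)
    spiked = V >= V_thresh
    if spiked:
        V = V_reset

    # low-pass filtered spike trains (traces) with the coincidence window of the rule
    x_pre -= dt * x_pre / tau_stdp
    x_post -= dt * x_post / tau_stdp
    x_pre[inh_spk] += 1.0
    if spiked:
        x_post += 1.0

    # inhibitory plasticity: potentiation for near-coincident pre/post activity,
    # depression (the -alpha term) for every presynaptic spike
    w_inh[inh_spk] += eta * (x_post - alpha) * w_unit
    if spiked:
        w_inh += eta * x_pre * w_unit
    np.clip(w_inh, 0.0, None, out=w_inh)
```

In this sketch the traces x_pre and x_post stand in for the "pre" and "post" factors of the rule; under the assumed parameters the inhibitory weights grow rapidly at first and then settle once the postsynaptic rate approaches ρ0, which is the behavior described above.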

To investigate how the state of the balance affects the neuron’s response properties, we presented a fixed stimulus sequence to the neuron and compared the spiking response over 50 trials to the input rates of each signal. In the globally balanced state, in which inhibitory synapses were distributed so that excitation and inhibition were balanced only on average across all channels, the peristimulus time histogram (PSTH) faithfully reproduced the firing rates of the preferred signals. The other, nonpreferred input signals evoked more inhibition than excitation and thus had no impact on the cell’s firing behavior. An additional steplike input rate protocol, in which 100-ms-long pulses of various step sizes were presented to one channel at a time, revealed that spiking responses were largely insensitive to stimulus intensity and indeed narrowly tuned to the preferred stimulus, giving rise to an all-or-none response.
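
A short sketch of the read-out step, for concreteness: responses from repeated trials with an identical stimulus are pooled into a PSTH. Here run_trial is a hypothetical helper that would wrap the simulation loop above with plasticity switched off and return the postsynaptic spike times of one trial; the bin width is an assumption.

```python
import numpy as np

def psth(spike_times_per_trial, t_max, bin_width=5e-3):
    """Pool spike times (s) from repeated trials into a trial-averaged firing rate (Hz)."""
    edges = np.arange(0.0, t_max + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for spikes in spike_times_per_trial:
        counts += np.histogram(spikes, bins=edges)[0]
    return edges[:-1], counts / (len(spike_times_per_trial) * bin_width)

# usage sketch (run_trial and stimulus_sequence are hypothetical):
# trials = [run_trial(stimulus_sequence, plastic=False) for _ in range(50)]
# t, rate = psth(trials, t_max=stimulus_duration)
```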

Our results offer an explanation for how long-term memories can be stably embedded into networks as quiescent and overlapping Hebbian assemblies. Unlike previous studies, our network does not behave as an attractor network, in which activated cell assemblies compete with each other and the winning pattern often exhibits persistent elevated activity. Instead, the network remains quiet unless the balance of one or more assemblies is modulated in favor of excitation, and it returns to the background state when the modulation is turned off. We have shown this effect here by driving a subset of cells with an external stimulus, but there are several conceivable ways to modulate the balance of excitation and inhibition. The possibility of activating several patterns simultaneously allows the analog combination of patterns into larger composite memories. The capacity of storable and retrievable patterns is likely to depend on complex interactions between the dynamics, size, and connectivity of the assemblies and the host network, as well as on several other parameters.

We show that a simple Hebbian plasticity rule at inhibitory synapses leads to a robust, self-organized balance of excitation and inhibition that requires virtually no fine-tuning and captures an unexpected number of recent experimental findings. The precision of the learned balance depends on the degree of correlation between the excitatory and inhibitory inputs to the cell, ranging from a global balance in the absence of correlated inputs to a detailed balance for strong correlations. The phenomenon is robust to the shape of the learning rule, as long as the rule obeys two fundamental requirements: postsynaptic activity must potentiate activated inhibitory synapses, whereas in the absence of postsynaptic firing inhibitory synapses must decay. Because the balance is self-organized, inhibitory plasticity will most likely maintain it even in the presence of excitatory plasticity, as long as excitation changes more slowly than inhibition or excitatory plasticity events are rare.

The mammalian brain hosts a wide variety of inhibitory cell types with different synaptic time scales, response patterns, and morphological target regions. Presumably, these cell types serve different functions, and consequently their synapses may obey several different plasticity rules. In our simplified model, the dynamics of inhibitory plasticity powerfully contributes to the functional state of cortical architectures and may have a strong impact on cortical coding schemes.

[end of paraphrase]