Scientific Understanding of Consciousness
Neural Network — Recent Research
Science 4 July 2008: Vol. 321, no. 5885, pp. 48-50
Transient Dynamics for Neural Processing
Misha Rabinovich [1], Ramon Huerta [1], Gilles Laurent [2]
[1] Institute for Nonlinear Science, University of California at San Diego, La Jolla, CA 92093, USA. [2] Division of Biology, California Institute of Technology, Pasadena, CA 91125, USA.

(paraphrase)

Neural networks are complicated dynamical entities whose properties are understood only in the simplest cases. When the complex biophysical properties of neurons and their connections (synapses) are combined with realistic connectivity rules and scales, network dynamics are usually difficult to predict. Yet experimental neuroscience is often based on the implicit premise that the neural mechanisms underlying sensation, perception, and cognition are well approximated by steady-state measurements of neuron activity, or by models in which the behavior of the network is simple (steady state or periodic). Transient states--ones in which no stable equilibrium is reached--may sometimes better describe neural network behavior.

Computing with "attractors" is a concept familiar to the neural networks community. Upon some input signal, a model neural network gradually changes its pattern of activated nodes (neurons) until it settles into one pattern--an attractor state. The input--a voice, an odor, or something more abstract--is thus associated with properties of the entire network in a particular attractor state. Such patterns of neural activity might be established, learned, and recalled during perception, memorization, and retrieval, respectively.

The repertoire of neural network dynamics also includes the non-classical, transient dynamics of "liquid-state machines," in which computation is carried out over time without any need for a classical attractor state. Because neural phenomena often occur on very short time scales, classical attractor states--fixed points or limit cycles--cannot realistically be reached. An alternative theoretical framework may explain some forms of neural network dynamics that are consistent both with experiments and with transient dynamics. Because transients and fixed points represent states of neuronal populations, and because these states are themselves read out or "decoded" by yet other neuronal populations, stimulus identification by such decoders should be more reliable with transient than with fixed-point states.

To understand such transient dynamics, a mathematical image is needed that is consistent with existing results, and its underlying model(s) must be used to generate testable predictions. One possible image is a stable heteroclinic channel, defined by a sequence of successive metastable ("saddle") states. Such dynamical objects are rare in low-dimensional systems but common in complex ones. Although many connection statistics probably support stable heteroclinic-type dynamics, it is likely that connectivity within biological networks is, to some extent at least, the result of optimization by evolution and synaptic plasticity.

The idea behind a liquid-state machine is based on the proposals that the cerebral cortex is a nonequilibrium system and that brain computations can be thought of as unique patterns of transient activity, controlled by incoming input. The results of these computations must be reproducible, robust against noise, and easily decoded.
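For contrast with the transient picture, the classical "computing with attractors" idea described above can be illustrated with a minimal sketch, assuming a small Hopfield-style network with Hebbian weights; the network size, stored patterns, and update rule here are illustrative choices, not taken from the paper.

```python
# Minimal sketch of "computing with attractors": a small Hopfield network
# stores binary patterns and relaxes a noisy cue toward the nearest stored
# (attractor) state. All sizes and patterns are illustrative.
import numpy as np

rng = np.random.default_rng(0)

N = 64                                        # number of model neurons
patterns = rng.choice([-1, 1], size=(3, N))   # three stored activity patterns

# Hebbian weights: W = (1/N) * sum_p x_p x_p^T, with zero diagonal
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def settle(state, sweeps=50):
    """Asynchronous updates until the network sits in a fixed point (attractor)."""
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Start from a corrupted version of pattern 0 (the "input") and let it settle.
cue = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
cue[flip] *= -1

recalled = settle(cue)
print("overlap with stored pattern:", recalled @ patterns[0] / N)  # ~1.0 if recalled
```

In this scheme only the final fixed point matters; the trajectory on the way there carries no information, which is the contrast with transient-dynamics computation drawn above.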
Because a stable heteroclinic channel is possibly the only dynamical object that satisfies all of these conditions (reproducibility, robustness against noise, and easy decoding), it is plausible that "liquid-state machines" are dynamical systems with stable heteroclinic channels, based on the principle of winnerless competition. Thus, with appropriately asymmetric inhibition, the space of possible states of a large neural system can be restricted to a set of connected saddle points, forming stable heteroclinic channels. These channels can be thought of as underlying reliable transient brain dynamics. (end of paraphrase)
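A minimal sketch of winnerless competition of the kind described here can be written with a generalized Lotka-Volterra system, in which asymmetric inhibition turns the single-winner fixed points into saddles linked in sequence; the rate constants and inhibition values below are illustrative assumptions, not parameters from the paper.

```python
# Hedged sketch of winnerless competition: three units with asymmetric
# (May-Leonard style) inhibition obey the generalized Lotka-Volterra equation
#   dx_i/dt = x_i * (sigma_i - sum_j rho_ij * x_j).
# Rather than settling into a single attractor, the trajectory moves along a
# sequence of saddle states (a stable heteroclinic channel). Parameter values
# are illustrative.
import numpy as np

sigma = np.array([1.0, 1.0, 1.0])          # intrinsic growth rates
alpha, beta = 0.5, 2.0                     # asymmetric inhibition: alpha < 1 < beta, alpha + beta > 2
rho = np.array([[1.0, alpha, beta],
                [beta, 1.0, alpha],
                [alpha, beta, 1.0]])

def simulate(x0, dt=0.01, steps=20000):
    """Forward-Euler integration of the winnerless-competition dynamics."""
    x = np.array(x0, dtype=float)
    trace = np.empty((steps, len(x)))
    for t in range(steps):
        x += dt * x * (sigma - rho @ x)
        x = np.clip(x, 1e-9, None)         # keep activities positive
        trace[t] = x
    return trace

trace = simulate([0.6, 0.3, 0.1])
# Each unit transiently "wins" in turn; in the winnerless-competition picture the
# sequence of winners, not any final state, carries the information.
print("winner at a few sample times:", trace[::4000].argmax(axis=1))
```

With symmetric inhibition the same equations settle into a single winner (an attractor); making the inhibition asymmetric is what converts those winners into saddle states visited in sequence, i.e. a stable heteroclinic channel.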
Nature 452, 436-441 (27 March 2008)
Compartmentalized dendritic plasticity and input feature storage in neurons
Attila Losonczy [1,2], Judit K. Makara [1,2] & Jeffrey C. Magee [1]
[1] Howard Hughes Medical Institute, Janelia Farm Research Campus, 19700 Helix Drive, Ashburn, Virginia 20147, USA

(paraphrase)

Although information storage in the central nervous system is thought to be primarily mediated by various forms of synaptic plasticity, other mechanisms, such as modifications in membrane excitability, are available. Local dendritic spikes are nonlinear voltage events that are initiated within dendritic branches by spatially clustered and temporally synchronous synaptic input. Because local spikes respond selectively to appropriately correlated input, they can function as input feature detectors and potentially as powerful information storage mechanisms. However, it is currently unknown whether any effective form of local dendritic spike plasticity exists. Here we show that the coupling between local dendritic spikes and the soma of rat hippocampal CA1 pyramidal neurons can be modified in a branch-specific manner through an N-methyl-D-aspartate receptor (NMDAR)-dependent regulation of dendritic Kv4.2 potassium channels. These data suggest that compartmentalized changes in branch excitability could store multiple complex features of synaptic input, such as their spatio-temporal correlation. We propose that this 'branch strength potentiation' represents a previously unknown form of information storage that is distinct from that produced by changes in synaptic efficacy, both at the mechanistic level and in the type of information stored. (end of paraphrase)
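The branch-specific storage idea can be caricatured in a few lines: treat each dendritic branch as having its own spike-to-soma coupling, and strengthen that coupling only when a local branch spike coincides with an NMDAR-like signal. This is a toy sketch of the concept, not the authors' model; the branch count, spike threshold, and learning rate are invented for illustration.

```python
# Toy sketch of branch-specific "branch strength potentiation": coupling grows
# only on branches that receive clustered, synchronous input paired with an
# NMDAR-like coincidence signal. Names and values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

N_BRANCHES = 8
coupling = np.full(N_BRANCHES, 0.3)        # initially weak spike-to-soma propagation
SPIKE_THRESHOLD = 5.0                      # synchronous drive needed for a local spike
LEARNING_RATE = 0.1

def stimulate(branch_input, nmdar_active):
    """One trial: branch_input[b] = summed synchronous drive to branch b."""
    local_spikes = branch_input > SPIKE_THRESHOLD
    # Pairing a local spike with NMDAR activation strengthens that branch's
    # coupling (standing in for Kv4.2 downregulation), and only that branch's.
    potentiate = local_spikes & nmdar_active
    coupling[potentiate] = np.minimum(coupling[potentiate] + LEARNING_RATE, 1.0)
    # Somatic response reflects which branches spiked and how well coupled they are.
    return np.sum(local_spikes * coupling)

# Repeatedly pair clustered input on branch 2 with NMDAR activation.
for _ in range(5):
    drive = rng.uniform(0, 2, N_BRANCHES)
    drive[2] = 8.0                          # clustered, synchronous input on branch 2
    nmdar = np.zeros(N_BRANCHES, dtype=bool)
    nmdar[2] = True
    soma_response = stimulate(drive, nmdar)

print("soma response on last trial:", round(float(soma_response), 2))
print("per-branch coupling after pairing:", np.round(coupling, 2))
# Only branch 2 is strengthened: in this caricature the stored "feature" lives in
# branch excitability rather than in synaptic weights.
```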
Science 5 December 2008: Vol. 322, no. 5907, pp. 1551-1555
Astroglial Metabolic Networks Sustain Hippocampal Synaptic Transmission
Nathalie Rouach [1], Annette Koulakoff [1], Veronica Abudara [1,2], Klaus Willecke [3], Christian Giaume [1]
[1] INSERM U840, Collège de France, 11 place Marcelin Berthelot, 75005 Paris, France.

(paraphrase)

Astrocytes provide metabolic substrates to neurons in an activity-dependent manner. However, the molecular mechanisms involved in this function, as well as its role in synaptic transmission, remain unclear. Here, we show that the gap-junction subunit proteins connexin 43 and connexin 30 allow intercellular trafficking of glucose and its metabolites through astroglial networks. This trafficking is regulated by glutamatergic synaptic activity mediated by AMPA receptors. In the absence of extracellular glucose, the delivery of glucose or lactate to astrocytes sustains glutamatergic synaptic transmission and epileptiform activity only when the astrocytes are connected by gap junctions. These results indicate that astroglial gap junctions provide an activity-dependent intercellular pathway for the delivery of energetic metabolites from blood vessels to distal neurons.

Glucose, transported by the blood, is the major source of energy used by the brain for neuronal activity. It has been proposed that neurons obtain most of their energy from extracellular lactate, a glucose metabolite produced by astrocytes. Indeed, astrocytes provide, through their perivascular endfeet and processes, a physical link between the vasculature and synaptic terminals, supporting the concept of metabolic coupling between glia and neurons. Moreover, a typical feature of astrocytes is their network organization, which results from extensive intercellular communication through gap-junction channels formed by connexins (Cxs). The aim of this work was to determine whether and how the connectivity of local perivascular astroglial networks contributes to their metabolic support of neurons.

Our findings identify a previously unknown role for gap-junction channels in hippocampal astrocytes. The Cxs constitute the molecular basis for perivascular astroglial metabolic networks, allowing activity-dependent intercellular trafficking of energetic metabolites used to sustain glutamatergic synaptic activity. These data extend the classical model of astroglial energy metabolism in brain function. By including gap-junction-mediated metabolic networks of astrocytes, we propose that the supply of energetic metabolites involves groups of connected astrocytes, allowing it to reach distant sites of high neuronal demand more efficiently. Gap junctions are thus directly involved in the metabolic supportive function of astrocytes, providing an activity-dependent intercellular pathway for glucose delivery from blood vessels to distal neurons. (end of paraphrase)
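The proposed delivery route can be sketched as a toy diffusion model: astrocytes form a chain from a perivascular "source" cell to a cell next to an active synapse, and metabolite moves between neighbours only when gap junctions are open. The chain length, rates, and consumption rule below are illustrative assumptions, not measurements from the paper.

```python
# Hedged toy sketch of metabolite trafficking through an astroglial network:
# glucose enters at a perivascular node and diffuses to neighbours only through
# open gap junctions, while an "active synapse" node consumes it.
import numpy as np

N_ASTROCYTES = 6
SOURCE = 0            # perivascular endfoot in contact with a blood vessel
SINK = 5              # astrocyte next to the active glutamatergic synapse

def run(gap_junctions_open, steps=200, influx=1.0, transfer=0.2, consumption=0.3):
    glucose = np.zeros(N_ASTROCYTES)
    delivered = 0.0
    for _ in range(steps):
        glucose[SOURCE] += influx                  # uptake from the blood vessel
        if gap_junctions_open:
            # Nearest-neighbour exchange through gap-junction channels.
            flux = transfer * (glucose[:-1] - glucose[1:])
            glucose[:-1] -= flux
            glucose[1:] += flux
        used = consumption * glucose[SINK]         # fuels synaptic transmission
        glucose[SINK] -= used
        delivered += used
    return delivered

print("metabolite delivered with gap junctions:   ", round(run(True), 1))
print("metabolite delivered without gap junctions:", round(run(False), 1))
```

Closing the junctions in this toy mimics the connexin-deficient condition: glucose accumulates at the perivascular cell and the distal "synaptic" cell receives nothing, in line with the finding that delivery sustains transmission only in connected networks.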