Scientific Understanding of Consciousness
Consciousness as an Emergent Property of Thalamocortical Activity

Supercomputer Simulation of Brain Functionality

Nature, Vol. 482, p. 456 (23 February 2012)

Human Brain Project (HBP) proposal

[paraphrase]

Proposal to build a supercomputer simulation that integrates everything known about the human brain, from the structures of ion channels in neural cell membranes up to mechanisms behind conscious decision-making.

Brain researchers are generating 60,000 papers per year. They are all beautiful, fantastic studies — but each is focused on its own little corner: this molecule, this brain region, this function, this map. The HBP would integrate these discoveries and create models to explore how neural circuits are organized, and how they give rise to behaviour and cognition — among the deepest mysteries in neuroscience.

Neuroscientists would have to pool their discoveries systematically. Every experiment at least tacitly involves a model, whether it is the molecular structure of an ion channel or the dynamics of a cortical circuit. With computers you could encode all of those models explicitly and get them to work together. That would help researchers to find the gaps and contradictions in their knowledge and identify the experiments needed to resolve them.

Scientists have been devising mathematical models of neural activity since the early twentieth century, and using computers for the task since the 1950s. Instead of modelling each neuron as a point-like node in a larger neural network, the HBP proposes to model them in all their multi-branching detail — down to their myriad ion channels.
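The "point-like node" baseline that the HBP proposes to go beyond can be sketched as a leaky integrate-and-fire neuron, in which the entire cell is collapsed into a single membrane voltage. This is an illustrative sketch only; the parameter values below are textbook-style assumptions, not figures from the article, and a detailed HBP-style model would instead track thousands of compartments and ion channels per cell.

```python
# Minimal leaky integrate-and-fire (LIF) point-neuron sketch.
# The whole neuron is one voltage variable that decays toward rest
# and is driven by an external input current.

def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Return spike times (ms) for a LIF neuron given a current trace.

    input_current: sequence of input values, one per time step of dt ms.
    All parameters (time constant tau, resting/threshold/reset voltages,
    membrane resistance r_m) are illustrative defaults.
    """
    v = v_rest
    spikes = []
    for step, i_ext in enumerate(input_current):
        # Voltage relaxes toward v_rest and is pushed up by the input.
        dv = (-(v - v_rest) + r_m * i_ext) * (dt / tau)
        v += dv
        if v >= v_thresh:          # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset the membrane after spiking
    return spikes

# A constant suprathreshold drive for 100 ms produces regular spiking.
spike_times = simulate_lif([2.0] * 1000)
```

The contrast with the HBP approach is one of state: this model has a single scalar per neuron, whereas a morphologically detailed model carries separate voltages and ion-channel states for every branch of the dendritic tree.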

Many neuroscientists feel deep unease about the HBP proposal. They consider it ill-conceived, not least because its idiosyncratic approach to brain simulation strikes them as grotesquely cumbersome and over-detailed.

The computer power required to run such a grand unified theory of the brain would be roughly an exaflop, or 10^18 operations per second — hopeless in the 1990s. But because available computer power doubles roughly every 18 months, exascale computers could be available by the 2020s.
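The extrapolation behind that timeline is simple arithmetic. The teraflop-class starting point below is an illustrative assumption (not a figure from the article), standing in for a late-1990s supercomputer:

```python
import math

# How long until an exaflop machine, assuming performance doubles
# every 18 months (a Moore's-law-style extrapolation)?
start_flops = 1e12      # assumed ~teraflop-class machine, late 1990s
target_flops = 1e18     # exaflop: 10**18 operations per second
doubling_years = 1.5    # one doubling every 18 months

doublings = math.log2(target_flops / start_flops)   # ~19.9 doublings
years_needed = doublings * doubling_years           # ~30 years
print(round(years_needed, 1))
```

Roughly 30 years from a late-1990s teraflop machine lands the exascale threshold in the 2020s, consistent with the article's projection.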

Blue Gene, an IBM supercomputer optimized for large-scale simulations, has been used since 2005 in the Blue Brain Project, an experiment in integrative neuroscience and, in retrospect, a prototype for the HBP.

Starting with a data set on the rat cortex, including results from some 20,000 experiments in many labs and data on nearly every available cell type, the computer simulation included each neuron's morphology (its reconstruction in three dimensions), its electrical properties, and its synaptic communication: where the synapses are located, how the synapses behave, and even genetic data about which genes are expressed.

By the end of 2005, the researchers had integrated all the relevant portions of this data set into a single-neuron model. By 2008, they had linked about 10,000 such models into a simulation of a tube-shaped piece of cortex known as a cortical column. Now, using a more advanced version of Blue Gene, they have simulated 100 interconnected columns.

Unifying models can serve as repositories for data on cortical structure and function. Most of the effort has gone into creating “the huge ecosystem of infrastructure and software” required to make Blue Brain useful to every neuroscientist. This includes automatic tools for turning data into simulations, and informatics tools such as a user-editable website that automatically collates structural data on ion channels from publications in the PubMed database, and currently incorporates some 180,000 abstracts.

One of the key goals of the computer simulation will be to make it highly collaborative and internet-accessible, open to researchers from around the world. The project consortium already comprises some 150 principal investigators and 70 institutions in 22 countries.

[end of paraphrase]
