NEWTON, Ask A Scientist!

If your mind were a computer
Name: fath
Status: N/A
Age: N/A
Location: N/A
Country: N/A
Date: Around 1993

In terms of computer memory how much information can the human brain store? Similarly, what is the processing speed and architecture of the brain?

Yours is a difficult question to answer, mainly because the human brain is the most complicated object in the known universe. Consider that a human brain contains some 100 billion neurons, and likely 500 billion or more supporting cells, and each neuron is extremely complicated electrically. Each neuron has millions of channels in its membrane, and each of those channels is a digital gate that can allow electrical current to pass. The wiring diagram of a single neuron would be hopelessly complex.

The simple answer to your question is that nobody knows. Information storage is easily defined in digital computers because one knows exactly how and where information is stored. Not so with the brain. We know something about how and where, but not enough to estimate how much information is up there in your three-pound universe. Consider that if you know how to play baseball well, the entire rule book is stored in your brain. Beyond that, the ability to hit a baseball with a bat is also stored in your brain (most likely in the cerebellum, although some simple information is stored in the spinal cord). Think of all the faces you can recognize; recognizing a single face likely requires thousands of bits of information. Our brain's capacity is vast. Perhaps someone has tried to estimate it, but I would guess not, since we are so ignorant.

Processing speed is also difficult to answer, because the tasks the brain processes are hard to quantify. Adding the digits 1 and 1 requires not only the ability to add, but also the ability to recognize shapes and contrasts and to remember that certain shapes represent numbers. Some people are capable of doing mental calculations with the speed of some computers. Although these people are rare, it is possible that all people are capable of such feats with practice. You might recall the movie Rain Man, which portrayed a man who could remember cards so well that he often won at gambling.
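To see why even a crude estimate is hard, here is some back-of-envelope arithmetic using only the counts mentioned above (100 billion neurons, each receiving on the order of a thousand inputs); treating each connection as storing roughly one byte is a loud simplifying assumption, not an established fact:

```python
# Rough, illustrative arithmetic only: as the answer stresses, nobody
# actually knows the brain's capacity. These figures just combine the
# neuron and connection counts mentioned in the text.
neurons = 100e9          # ~100 billion neurons (from the text)
inputs_per_neuron = 1e3  # on the order of a thousand inputs each (assumed)
synapses = neurons * inputs_per_neuron
print(f"{synapses:.0e} synapses")            # prints 1e+14 synapses

# If each synapse held on the order of one byte (a loud assumption):
print(f"~{synapses / 1e12:.0f} terabytes")   # prints ~100 terabytes
```

The point of the exercise is not the number itself but how sensitive it is to the assumptions: change "one byte per synapse" and the answer swings by orders of magnitude, which is exactly why the honest answer remains "nobody knows."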
As for architecture, here we know something, but certainly not enough. The human brain is both serial and massively parallel in its processing algorithms and in its structure. Each neuron sends hundreds of outputs to many different neurons, and each neuron receives thousands of inputs. These inputs are integrated in a complicated analog process involving synapses and dendritic trees. Once integrated, that information is relayed to a "spike initiation zone," where it is decided whether the neuron will send a digital signal or not. The temporal patterns of those digital signals travel over axons and cross long distances (inches or feet) in the nervous system. In general, information flows from sensors into integrating systems of neurons, then out to motor neurons, and finally to effectors like muscles. I could try to answer a more detailed question if you like, but as I said, mostly we do not know much about the brain.
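The integrate-then-threshold behavior described above can be sketched as a toy "leaky integrate-and-fire" model: analog summation of inputs, then an all-or-nothing digital spike. All constants here are illustrative, not physiological measurements:

```python
# Toy leaky integrate-and-fire neuron: analog integration of many
# inputs followed by an all-or-nothing digital "spike" at a threshold.
# Parameters are illustrative only, not physiological values.

def simulate_neuron(inputs, threshold=1.0, leak=0.9):
    """inputs: list of per-timestep summed synaptic currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # analog integration with leak
        if potential >= threshold:              # the "spike initiation zone"
            spikes.append(1)                    # digital, all-or-nothing output
            potential = 0.0                     # reset after firing
        else:
            spikes.append(0)
    return spikes

print(simulate_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.9]))
# → [0, 0, 1, 0, 0, 1]
```

Note how the analog membrane potential is hidden from the output: downstream neurons see only the temporal pattern of digital spikes, which matches the serial-digital-over-analog picture in the answer.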

James Murray

It is also interesting that most people may use less than 10% of their neural capacity. This figure was arrived at when head-trauma patients required the removal of nearly 90% of their cerebrum, yet recovered and continued to lead normal, successful lives. Of course, this may partly be a result of the parallel capabilities of the brain, with overlapping regions "making up" for what was lost.

The closest computer approximation of a brain is the neural network: a rough model of what happens when neurons communicating downstream process inputs. It seems like an exploding field with a lot of potential to answer questions like capacity and processing speed. If one uses neural networks as a model, there are some things one can infer about a brain. The brain probably does not store information as directly and accurately as a computer. The more neurons one trains to recognize an input, the more accurately they can recognize it (up to a limit). The brain also has a very robust and flexible input system; this means the brain must do much more with any one input than a computer usually needs to, since any one input is far more complicated than a byte. Most everyday stimuli are fairly complex, and the brain uses a fair amount of capacity to decipher them accurately. In contrast, computer input is relatively discrete and simple, so a computer can afford the luxury of responding with precisely accurate output.
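A minimal sketch of the neural-network idea above: a single artificial neuron (a perceptron) trained on the logical AND function. It illustrates the storage point made in the answer: after training, the "memory" of AND is spread across the weights rather than stored at a discrete address. Everything here is a textbook toy, not a model of any real brain circuit:

```python
# Minimal perceptron (one artificial neuron) trained on logical AND.
# After training, the learned behavior lives in the weights w and bias b,
# distributed rather than stored at an addressable location.

def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1   # nudge each weight toward the target
            w[1] += lr * err * x2
            b += lr * err
    return w, b

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
for (x1, x2), _ in AND:
    print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
# prints 0 for (0,0), (0,1), (1,0) and 1 for (1,1)
```

Nowhere in the trained network is "AND" written down; the rule is recoverable only by running inputs through the weights, which is the sense in which a brain does not store information "as directly and accurately as a computer."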



NEWTON is an electronic community for Science, Math, and Computer Science K-12 Educators, sponsored and operated by Argonne National Laboratory's Educational Programs, Andrew Skipor, Ph.D., Head of Educational Programs.

For assistance with NEWTON, contact a System Operator or Argonne's Educational Programs.

Educational Programs
Building 360
9700 S. Cass Ave.
Argonne, Illinois
60439-4845, USA
Update: June 2012