Is the Brain a Computer?

[Image: a human brain in a jar (Photo credit: Wikipedia)]

I’ve just read an interesting article by Robert Epstein which tries to debunk the idea that the brain is a computer. His main thrust seems to be that the brain-as-computer idea is just a metaphor, which it is. Metaphors, however, are extremely useful devices: they use similarities between two systems to help us understand the less well understood of the pair.

Epstein points out that, depending on the state of human knowledge at the time, we have used several metaphors to try to understand the mind and the brain (the hydraulic metaphor, for instance). This is true, but each metaphor has been more accurate than the last, and the computer model may well be the most accurate yet.

[Image: cork in a hydraulic ram (Photo credit: Wikipedia)]

The computer model may well be all that we need to explain the operation of the brain and mind to very high accuracy. Research into the brain and mind may, in turn, eventually inform computing and information technology.

It is evident that Epstein bases his exposition on a partial understanding of computing – for instance, he appears to think that data in a computer is always stored in a more or less permanent fashion. He says:

The idea, advanced by several scientists, that specific memories are somehow stored in individual neurons is preposterous; if anything, that assertion just pushes the problem of memory to an even more challenging level: how and where, after all, is the memory stored in the cell?

This describes only one particular method of storing data, one which roughly equates with the way data is stored on a hard disk. On a disk, a tiny magnetic region is flipped into a particular orientation, which persists even when the power is off. In the working memory of a computer, the RAM, however, the data is not permanent and disappears when the computer is switched off. Indeed, in dynamic RAM the data must be refreshed continually, thousands of times a second, or it simply leaks away. RAM is therefore called volatile memory.
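
To make the difference concrete, here is a minimal Python sketch. The file name counter.txt is purely illustrative: a value held only in a variable plays the part of RAM and vanishes when the process ends, while a value written to a file plays the part of the disk and survives a “power cycle” (the next run).

```python
import os

STORE = "counter.txt"  # illustrative file standing in for the hard disk

# "RAM": an ordinary in-memory variable, lost the moment the process exits.
counter_in_ram = 0

# "Disk": recover the persisted value, if one survives from a previous run.
if os.path.exists(STORE):
    with open(STORE) as f:
        counter_on_disk = int(f.read())
else:
    counter_on_disk = 0

counter_in_ram += 1
counter_on_disk += 1

# Writing the file is the analogue of flipping the magnetic regions:
# the value will still be there on the next run.
with open(STORE, "w") as f:
    f.write(str(counter_on_disk))

print(counter_in_ram, counter_on_disk)  # the RAM count restarts at 1 every run
```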

[Image: several PATA hard disk drives (Photo credit: Wikipedia)]

In the early days of computing, data was stored in “delay line memory”. This is a type of memory which must be continually refreshed to preserve the information it contains: data is read out of one end of a pipeline while simultaneously being fed back in at the other, and it is this endless recirculation that maintains the memory.
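
Here is a toy Python simulation of the idea. It is nothing like the engineering of a real mercury or magnetostrictive delay line, but it shows the recirculation principle; the noisy_refresh variant is my own embellishment, hinting at how an imperfect refresh could make a stored pattern degrade.

```python
import random
from collections import deque

# The stored word circulates endlessly: each cycle, one bit emerges from
# the end of the "pipe" and is immediately re-injected at the start.
line = deque([1, 0, 1, 1, 0, 0, 1, 0])

def refresh(line):
    """One perfect refresh cycle: read a bit out, feed it straight back in."""
    bit = line.popleft()
    line.append(bit)
    return bit

def noisy_refresh(line, error_rate=0.01):
    """An imperfect refresh: occasionally a bit is corrupted on the way
    round, so the stored pattern slowly degrades over many cycles."""
    bit = line.popleft()
    if random.random() < error_rate:
        bit ^= 1  # flip the bit
    line.append(bit)
    return bit

# Reading the memory is just watching one full circulation go past.
print([refresh(line) for _ in range(len(line))])  # the stored word, intact
```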

I expect that something similar may be happening in the brain when we remember something. It would mean that a memory may well be distributed throughout the brain at any one time. There is evidence that memories fade over time, and this could be related to an imperfect refresh process.

[Image: schematic diagram of a delay-locked loop (DLL) (Photo credit: Wikipedia)]

Epstein also has issues with the imperfect recall that we have of real-life objects (and presumably events). He cites the recall of a dollar bill as an example: the version of the bill that people drew from memory was greatly simplified compared to the version they drew while copying the real thing.

All that this really demonstrates is that when we remember things, much of the information about the object is never stored and is simply lost. Similarly, when an image of the dollar bill is stored in a computer, information is lost. When it is displayed again on a computer screen it is not exactly the same as the thing that was imaged, nor even the same as the image as stored in the computer.

[Image: Newfoundland 2 dollar bill (Photo credit: Wikipedia)]

It’s worth noting that the image file in a computer is not the same as the real thing it depicts: it is merely a digitisation of the real thing, as captured by the camera that created it.

The image on the screen is not the same as either the original or the image in the computer – but the same is true of the image that the mind sees. The original scene is digitised by the eye’s rods and cones and converted to an image in the brain.

[Image: stylised idea of the communication between the eye and the brain (Photo credit: Wikipedia)]

This digitised copy is what is recalled to the mind’s eye when we remember or recall the original. The remembered copy is thus an interpretation of a digitised version of the original, and information has inevitably been lost along the way.

Just as the memory in our minds is imperfect, so is the image in the computer. Firstly, the image in the computer is digital while the original object is continuous. Secondly, the computer image has a fixed resolution, say 1024 x 768, so some details of the original object are inevitably lost – and the lower the resolution, the more detail is lost.

[Image: simulated computer monitor screen image (Photo credit: Wikipedia)]

In addition, the resolution of the image stored in the computer may not match the capabilities of the screen on which it is displayed, so it may need to be interpolated, which introduces yet another error. In the example of the dollar bill, the “resolution” in the mind is remarkably low and the “interpolation” onto the whiteboard is very imperfect.
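
The whole lossy pipeline can be sketched in a few lines of Python. The one-dimensional “scene” and all the numbers below are invented for illustration; the point is only that sampling discards detail, and interpolation then invents values that were never measured.

```python
import math

def scene(x):
    """Stand-in for the continuous original: brightness along one scan
    line, including some fine detail (the 47-cycle component)."""
    return math.sin(10 * x) + 0.3 * math.sin(47 * x)

def sample(n):
    """'Photograph' the scene at n pixels. Detail finer than the sampling
    grid is lost for good at this step."""
    return [scene(i / (n - 1)) for i in range(n)]

def upscale(pixels, m):
    """Linearly interpolate up to m display pixels. This invents values
    that were never measured - a second source of error."""
    n = len(pixels)
    out = []
    for j in range(m):
        t = j * (n - 1) / (m - 1)
        i = min(int(t), n - 2)
        frac = t - i
        out.append(pixels[i] * (1 - frac) + pixels[i + 1] * frac)
    return out

stored = sample(32)               # the low-resolution image "in the computer"
displayed = upscale(stored, 256)  # interpolated up to fit the "screen"
truth = [scene(j / 255) for j in range(256)]

error = max(abs(d - t) for d, t in zip(displayed, truth))
print(f"worst-case display error: {error:.3f}")  # nonzero: detail lost twice
```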

Epstein also assumes a particular computer architecture, one which may well be superseded before long. In a conventional computer there is a single timing circuit, the clock, on which all other parts of the machine rely. It is so important that the speed of a computer is usually quoted as the speed of this clock.

[Image: clock signal with legend (Photo credit: Wikipedia)]

It may be that the brain operates more like a network, where each part keeps its own time and synchronisation is performed by a message-based scheme. Or the parts of the brain may cooperate by some means that we don’t currently understand. I’m sure that the parts of the brain do cooperate, and that we will eventually discover how they do it.
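
As a purely illustrative sketch of what a message-based scheme looks like, the toy Python below is loosely modelled on the Berkeley clock-synchronisation algorithm from distributed computing. No claim is intended that neurons do anything of the sort.

```python
import random

class Node:
    """One part of the network, keeping its own imperfect clock."""

    def __init__(self):
        self.clock = random.uniform(0.0, 5.0)  # starts at its own time...
        self.drift = random.uniform(0.9, 1.1)  # ...and runs at its own rate

    def tick(self):
        self.clock += self.drift

def synchronise(nodes):
    """One round of message passing: every node reports its time, and each
    then applies the correction needed to match the agreed average."""
    mean = sum(n.clock for n in nodes) / len(nodes)
    for n in nodes:
        n.clock += mean - n.clock

nodes = [Node() for _ in range(4)]
for _ in range(10):
    for n in nodes:
        n.tick()         # the clocks drift apart...
    synchronise(nodes)   # ...and messages pull them back together

print([round(n.clock, 2) for n in nodes])  # all four clocks now agree
```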

Epstein points out that babies appear to come with built-in abilities: to recognise faces, to exercise certain reflexes and so on. He doesn’t appear to realise that computers also come with certain basic abilities built in, without which they would be useless hunks of silicon and metal.

[Image: an American Megatrends BIOS reporting an “Intel CPU uCode Error” during POST (Photo credit: Wikipedia)]

When you switch on a computer, all it can do is read a disk and write data into RAM. That is all. When it has done this it gives control to the program now in RAM which, as a second stage, loads more information from the disk.

At this stage it may seek more information from the world around it, writing to the screen and reading input from the keyboard or mouse using programs loaded in the second stage. Finally it hands control to the user via those same programs. This process is called “bootstrapping”, and it relies on the simple hard-coded abilities of the computer.
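
The toy Python below compresses the whole process into a few lines. The “disk” is just a dictionary, and names like boot_block and stage2 are invented for illustration – this is the shape of bootstrapping, not how a real BIOS or GRUB behaves.

```python
# The "disk" holds named programs; the only hard-wired ability the
# machine starts with is reading one named block and running it.
DISK = {
    "boot_block": "load('stage2'); run('stage2')",   # stage 1: fetch stage 2
    "stage2": "load('shell'); run('shell')",         # stage 2: start the shell
    "shell": "print('ready for input')",             # the user-facing program
}

RAM = {}

def load(name):
    """Read a 'program' from disk into RAM - the one built-in ability."""
    RAM[name] = DISK[name]

def run(name):
    """Hand control to a program already sitting in RAM."""
    exec(RAM[name], {"load": load, "run": run, "print": print})

load("boot_block")  # power-on: the hard-coded first step
run("boot_block")   # each stage then loads and starts the next
```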

[Image: GRUB boot menu (Photo credit: Wikipedia)]

But humans learn and computers don’t. Isn’t that right? No, not exactly. A human brain learns by changing itself depending on what happens in the world outside itself. So do computers!

Say we have a bug in a computer program. A report of the failure is fed back to the outside world; eventually the bug gets fixed, the fix is downloaded and installed, manually or automatically, and the computer “learns” to avoid the bug.
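
Compressed into a single toy Python program, with the bug, the report and the fix all invented for illustration, the loop looks like this:

```python
def buggy_divide(a, b):
    return a / b                      # crashes when b == 0

def patched_divide(a, b):
    return a / b if b != 0 else 0.0   # the "downloaded" fix

current = buggy_divide
try:
    current(1, 0)
except ZeroDivisionError as report:   # the bug shows up in the real world
    print(f"bug report filed: {report}")
    current = patched_divide          # the update is installed

print(current(1, 0))  # next time round, the system avoids the error
```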

[Image: learning organism (Photo credit: Wikipedia)]

It may be possible in the future for malfunctioning computer programs to update themselves automatically when the user makes them aware of the issue – just as a baby learns that poking Mum in the eye is an error when Mum says “Ouch!” and backs away a little.

All in all, I believe that the computer analogy is a very good one and there is no good reason to toss it aside, especially when, as in Epstein’s article, no concrete replacement is suggested. On the contrary, as our knowledge of the brain grows, I expect us to find more and more ways in which the brain resembles a computer – and, possibly as a result, computers will become more and more like brains.

[Image: Brain 1 (Photo credit: Wikipedia)]
