We can begin to investigate the question of computer intelligence by again looking up a definition. The word “compute” literally means to reckon, or think, with. Early computers such as counting sticks, the abacus, and the adding machine are obviously something man thinks with. Even though we may know the multiplication tables, we find it easier and safer to use a mechanical device to remember and even to perform operations for us.
These homely devices do not possess sufficient “intelligence” to raise any fears in our minds. The abacus, for example, displays only what we might charitably call the property of memory. It has a certain number of rows, each row with a fixed number of beads. While it is not fallible, as is the human who uses it, it is far more limited in scope. All it can ever do is help us to add or subtract, and if we are clever, to multiply, divide, do square roots, and so on. If we are looking for purposive behavior in computing machines, it is only when we get to the adding machine that a glimmer appears. When a problem is set in and the proper button pushed, this device is compelled to go through whatever gear-whirring is required to return it to a state of equilibrium, with its problem solved.
So far we might facetiously describe the difference in the goal-seeking characteristics of man and machine by recalling that man seeks lofty goals like climbing mountains simply because they are there, while the computer seeks its goal much like the steel ball in the pinball machine, impelled by gravity and the built-in springs and chutes of the device. When we come to a more advanced computer, however, we begin to have difficulty in assessing characteristics. For the JOHNNIAC, built by Rand and named for John von Neumann, can prove the propositions in the Principia Mathematica of Whitehead and Russell. It can also “learn” to play a mediocre game of chess.
If we investigate the workings of a digital computer, we find much to remind us of the human brain. First is the obvious similarity of on-off, yes-no operation. This implies a power source, usually electrical, and a number of two-position switches. The over-all configuration of the classic computer resembles, in principle if not in physical appearance, that of the human brain and its accessories.
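The on-off, yes-no operation described above can be sketched in a few lines. This is our own illustration, not anything from the text: a number held as a row of two-position “switches,” each either on (1) or off (0), and the helper names are invented for the sketch.

```python
def to_switches(n, width=8):
    """Return the on/off positions of `width` switches encoding integer n."""
    return [(n >> i) & 1 for i in reversed(range(width))]

def from_switches(bits):
    """Recover the integer from a list of switch positions."""
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

print(to_switches(42))                           # [0, 0, 1, 0, 1, 0, 1, 0]
print(from_switches([0, 0, 1, 0, 1, 0, 1, 0]))   # 42
```

Eight such switches suffice for any count from 0 to 255; every added switch doubles the range, which is why a machine of modest parts can hold large numbers.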
As we have learned, the electronic computer has an input section, a control, an arithmetic (or logic) section, a memory, and an output. Looking into the arithmetic and memory sections, we find a number of comparisons with the brain. The computer uses far more power than the brain. A single transistor, which performs only part of the work of a neuron, may use a tenth of a watt; on this score the brain is ahead by a factor of millions to one.
Electronic switches have an advantage over the neuron in that they are much faster acting. So fast have they become that engineers have had to coin new terms like nanosecond and picosecond, for a billionth and a trillionth of a second. Thus, the computer’s individual elements are perhaps 100,000 times faster than those of the brain.
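The rough arithmetic behind that factor of 100,000 can be checked directly. The switching times below are our assumptions, not figures from the text: a neuron needing about a millisecond to fire and recover, against an electronic switch acting in about ten nanoseconds.

```python
neuron_time = 1e-3   # seconds per neural "switching" cycle (assumed)
gate_time = 10e-9    # seconds per electronic switch (assumed)

# Ratio of switching speeds: how many times faster the electronic element is.
print(round(neuron_time / gate_time))  # 100000
```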
There is no computer in existence with the equivalent of 10 billion neurons. One ambitious system of computers does use half a million transistors, plus many other parts, but even these relatively few would not fit under a size 7-1/2 hat. One advanced technique, using a “2-D” metal film circuitry immersed in liquid helium for supercooling, hopefully will yield a packaging density of about 3-1/2 million parts per cubic foot in comparison with the brain’s trillion-part density.
We have mentioned the “delay line” computer memory, reminiscent of the “chain circuit” in the brain. Electrical impulses were converted to acoustic signals in mercury, traversed the mercury, and were reconverted to electrical impulses. Early memory storage systems were “serial” in nature, like those stored on a tape reel. To find one bit of information required searching the whole reel. Now random-access methods are being used, with memory core storage systems so wired that any one bit of information can be reached in about the same amount of time. This magnetic core memory stores information as a magnetic field, again analogous to a memory theory for the human brain, except that the neuron is thought to undergo a chemical rather than magnetic change.
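The difference between the two storage schemes can be sketched as follows. This is our own illustration under assumed names: a serial “tape” that must be searched from the start, against a random-access “core” store reached directly by address.

```python
# Serial store: (address, value) pairs laid out in order, like a tape reel.
tape = [("addr%d" % i, i * i) for i in range(1000)]

# Random-access store: any address reached directly, like wired cores.
core = dict(tape)

def serial_read(key):
    """Search the tape from the beginning, as an early reel memory would."""
    steps = 0
    for k, v in tape:
        steps += 1
        if k == key:
            return v, steps
    return None, steps

value, steps = serial_read("addr900")
print(value, steps)      # 810000 901  -- 901 positions passed to reach it
print(core["addr900"])   # 810000     -- reached in one step, wherever it sits
```

The tape's cost grows with the distance of the item from the start of the reel; the core store's cost is about the same for every address, which is the property the text describes.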
General Electric Co., Computer Dept.
Tiny ferrite cores like these make up the memory of some computers. Each core stores one “bit” of information.