The first wave of ridiculous predictions has run its course and been followed by loud refutations. Now there is a third period of calmer, more sensible appraisal. A growing proportion of scientists take a middle-of-the-road attitude, weighing both sides of the case for the computer, yet some of their forecasts still read like science fiction.

Cyberneticist Norbert Wiener, more scientist than fictioneer, professes to foresee computerized robots taking over from their masters, much as a Greek slave once did. Mathematician John Williams of the Rand Corporation thinks that computers can, and possibly will, become more intelligent than men.

Equally reputable scientists take the opposite view. Neuro-physiologist Gerhard Werner of Cornell Medical College doubts that computers can ever match the creativity of man. He seems to share the majority view today, though many who agree will add, tongue in cheek, that perhaps we’d better keep one hand on the wall plug just in case.

Thinking Defined

The first step in deciding whether or not the computer thinks is to define thinking. Far from being simple, this task turns out to be a slippery one. In fact, if the computer has done no more than demand this sort of reappraisal of the human brain’s workings, it has justified its existence. Webster lists meanings for “think” under two headings, for the transitive and intransitive forms of the verb. These meanings, respectively, start out with “To form in the mind,” and “To exercise the powers of judgment ... to reflect for the purpose of reaching a conclusion.”

Even a fairly simple computer would seem to qualify as a thinker by these yardsticks. The storing of data in a computer memory may be analogous to forming in the mind, and manipulating numbers to find a square root certainly calls for some sort of judgment. Learning is a part of thinking, and computers are proving that they can learn—or at least be taught. Recall of this learning from the memory to solve problems is also a part of the thinking process, and again the computer demonstrates this capability.
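
To make the point concrete, here is a minimal sketch, in modern Python (which of course postdates the machines under discussion), of the kind of square-root routine meant above. The function name and the tolerance are our own illustrative choices, not any particular machine’s program; the “judgment” lies in the repeated decision of whether the current estimate is yet good enough.

    def sqrt_newton(n, tolerance=1e-10):
        """Approximate the square root of a positive n by Newton's method."""
        guess = n if n >= 1 else 1.0            # any positive starting guess will do
        while True:
            better = (guess + n / guess) / 2.0  # average the guess with n/guess
            # The machine's "judgment": decide whether the estimate is
            # close enough to stop, or whether another pass is needed.
            if abs(better - guess) < tolerance:
                return better
            guess = better

    print(sqrt_newton(2.0))   # prints roughly 1.4142135624

Each pass through the loop tests a condition and chooses one of two courses of action; whether that deserves the word judgment is exactly the question at issue.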

One early psychological approach to the man-versus-machine debate was that of classifying living and nonliving things. In Outline of Psychology, the Englishman William McDougall lists seven attributes of life. Six of these describe “goal-seeking” qualities; the seventh refers to the ability to learn. In general, psychologist McDougall felt that purposive behavior was the key to the living organism. Thus any computer that is purposive—and any commercial model had better be!—is alive, in McDougall’s view. A restating of the division between man and machine is obviously in order.

Dr. W. Ross Ashby, a British scientist now working at the University of Illinois, defines intelligence as “appropriate selection” and goal-seeking as the intelligent process par excellence, whether the selecting is done by a human being or by a machine. Ashby does split off the “non-goal-seeking” processes occurring in the human brain as a distinct class: “natural” processes neither good nor bad in themselves, resulting from man’s environment and his evolution.

Intelligence, to Ashby, who long ago demonstrated a mechanical “homeostat” that showed purposive behavior, is the utilization of information by highly efficient processing to achieve a high intensity of appropriate selection. Intelligent is as intelligent does, with no distinction made between man and machine. “Humanoid” and “artificial” would thus be meaningless words for describing a computer. Ashby makes another important point: the intelligence of a brain or a machine cannot exceed what has been put into it, unless we admit the workings of magic. Ashby’s beliefs are echoed in a way by scientist Oliver Selfridge of Lincoln Laboratory. Asked if a machine can think, Selfridge says, “Certainly; although the machine’s intelligence has an elusive, unnatural quality.”
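
The homeostat makes “appropriate selection” vivid, and its strategy is easy to sketch. The following toy model, again in modern Python, is our own illustration rather than Ashby’s actual four-unit circuit, and all names and bounds in it are assumptions: whenever the essential variable fails to settle, the machine blindly re-selects its feedback coefficient at random and tries again.

    import random

    def homeostat_sketch(max_attempts=1000, steps=100):
        """Toy homeostat: re-select a random feedback coefficient until
        the disturbed system settles back toward rest."""
        random.seed(0)                     # fixed seed so the run repeats
        for attempt in range(max_attempts):
            a = random.uniform(-2.0, 2.0)  # blindly chosen coefficient
            x = 1.0                        # system disturbed from rest
            for _ in range(steps):
                x = a * x                  # one pass around the feedback loop
            if abs(x) < 1e-3:              # essential variable back near zero
                return attempt, a          # an "appropriate selection"
        return None

    print(homeostat_sketch())  # reports how many random trials were needed

The point is Ashby’s: the machine arrives at an appropriate configuration not by foresight but by discarding inappropriate ones, and by his definition it is the selecting, not the selector, that counts.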

“Think, Hell, COMPUTE!” reads the sign on the wall of a computer laboratory. But much of our thinking, perhaps some of the “natural” processes of our brains, doesn’t seem to fit into computational patterns. That part of our thinking, the part that includes looking at pretty girls, for example, will probably remain peculiar to the human brain.