The Digital Counter
The digital computer was first on the scene, and it appears now that it will outnumber and perhaps outlive its analog relative. A simple computer of this type is as old as man himself, though it is doubtful that it has been in use quite that long. Proof of its claim to pioneering status lies in the words digit and calculi, for finger and pebbles, respectively. We counted “how many” before we measured “how large,” and the old Romans tallied on fingers until they ran out and then supplemented with pebbles.
Perhaps the first computations more complex than the simple counting of wives or flocks came about when some wag found that he could ascertain the number of sheep by counting legs and dividing by four. When it was learned that the thing worked both ways, and that the number of pickled pigs’ feet was four times the number of pigs processed, arithmetic was born. The important difference between analog and digital, of course, is that the latter is a means of counting, a dealing with discrete numbers rather than a measuring of quantities.
This kind of computation was sorely taxed when such things as fractions and relationships like pi came along, but even then man managed to continue dealing with numbers themselves rather than with quantities. Just as the slide rule is a handy symbol for the analog computer, the abacus serves us nicely to illustrate the digital type, and some schools make a practice of teaching simple arithmetic to youngsters in this manner.
Our chapter on the history of the computer touched on early efforts in the digital field, though no stress was laid on the distinction between types. We might review a bit, and pick out which of the mechanical calculating devices were actually digital. The first obviously was the abacus. It was also the only one for a long time. Having discovered the principle of analogy, man leaned in that direction for many centuries, and clocks, celestial simulators, and other devices were analog in nature. Purists point out that even the counting machines of Pascal and Leibnitz were analog computers, since they dealt with the turning of shafts and gears rather than the manipulation of digits. The same reasoning has caused some debate about Babbage’s great machines in the 1800’s, although they are generally considered a digital approach to problem-solving. Perhaps logicians had as much as anyone to do with the increasing popularity of the digital trend when they pointed out the advantages of a binary or two-valued system.
With the completion in 1946 by Eckert and Mauchly of the electronic marvel they dubbed ENIAC, the modern digital computer had arrived and the floodgates were opened for the thousands of descendants that have followed. For every analog computer now being built there are dozens or perhaps hundreds of digital types. Such popularity must be deserved, so let us examine the creature in an attempt to find the reason.
The computer family tree. Its remarkable growth began with government-supported research, continued in the universities, and the current generation was developed primarily in private industry. (Courtesy of the National Science Foundation)
We said that by its nature the analog device tended to be a special-purpose computer. The digital computer, perhaps because its basic operation is so childishly simple, is best suited for general-purpose work. It is simple, consisting essentially of switches that are either on or off. Yet Leibnitz found beauty in that simplicity, and even an explanation of the universe. Proper interconnection of sufficient on-off switches makes possible the most flexible of all computers—man’s brain. By the same token, man-made computers of the digital type can do a wider variety of jobs than can the analog, which seemingly is more sophisticated.
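To see how far such simple on-off elements can be pushed, here is a minimal sketch in modern terms (Python is used purely as an illustration; the helper names are our own, and no early machine ran anything of the kind). A row of switches stands for an ordinary number, and two such rows are added using nothing but on-off logic:

    # A row of "switches" is a list of on/off states, least significant first.
    def to_switches(n, width=8):
        return [(n >> i) & 1 == 1 for i in range(width)]

    def from_switches(bits):
        return sum(1 << i for i, on in enumerate(bits) if on)

    # Addition built from on/off logic alone: a chain of full adders.
    def add_switches(a, b):
        out, carry = [], False
        for x, y in zip(a, b):
            out.append(x ^ y ^ carry)                 # sum bit for this position
            carry = (x and y) or (carry and (x ^ y))  # carry passed to the next switch
        return out

    print(from_switches(add_switches(to_switches(19), to_switches(23))))  # prints 42

The same handful of on-off rules, repeated and interconnected on a grand scale, is essentially all the arithmetic a digital machine ever performs.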
A second great virtue of the digital machine is its accuracy. Even a trial machine of Babbage’s had 5-place accuracy, an error of only one part in ten thousand and a precision achievable in the analog only at great expense. That was of course only a preliminary model; the English inventor planned 20-place accuracy in his dream computer. Present electronic digital computers offer 10-place accuracy as a matter of course, a precision impossible to achieve in the analog.
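To put rough figures on that comparison, the short sketch below (again a present-day illustration, with pi chosen simply as a convenient test value) keeps five and then ten significant figures of a number and reports the error that remains; carrying more places costs the digital machine nothing but a longer register, which is the heart of its advantage:

    import math

    def error_at(places, value=math.pi):
        # Keep only the stated number of significant figures, then compare.
        rounded = float(f"{value:.{places}g}")
        return abs(rounded - value) / value

    for places in (5, 10):
        print(f"{places:2d} places: about 1 part in {1 / error_at(places):,.0f}")

The guaranteed worst case for five figures is on the order of the one part in ten thousand quoted above; a particular value such as pi happens to fare rather better, since the error depends on where the last retained figure falls.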