Mr. Keynes holds that an induction may be rendered more probable by number of instances, not because of their mere number, but because of the probability, if the instances are very numerous, that they will have nothing in common except the characteristics in question. We want, let us suppose, to find out whether some quality A is always associated with some quality B. We find instances in which this is the case; but it may happen that in all our instances some quality C is also present, and that it is C that is associated with B. If we can so choose our instances that they have nothing in common except the qualities A and B, then we have better grounds for holding that A is always associated with B. If our instances are very numerous, then, even if we do not know that they have no other common quality, it may become quite likely that this is the case. This, according to Mr. Keynes, is the sole value of many instances.
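The arithmetic behind this claim is easily illustrated. Suppose, for the sake of a figure (the supposition is mine, not Mr. Keynes's), that each instance possesses a given irrelevant quality C independently with probability p; then the chance that all n instances happen to share C is p to the power n, which dwindles rapidly as n grows. A minimal sketch:

```python
# Sketch (my illustration, not Keynes's apparatus): if each instance
# possesses an irrelevant quality C independently with probability p,
# the chance that all n instances happen to share C is p**n.

def prob_all_share(p: float, n: int) -> float:
    """Probability that every one of n independent instances has quality C."""
    return p ** n

for n in (1, 5, 10, 50):
    print(n, prob_all_share(0.5, n))
# 1 0.5
# 5 0.03125
# 10 0.0009765625
# 50 8.881784197001252e-16
```

With many instances, then, it becomes quite likely that no quality beyond the two in question is common to them all.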

A few technical terms are useful. Suppose we want to establish inductively that there is some probability in favour of the generalisation: “Everything that has the property F also has the property f”. We will call this generalisation g. Suppose we have observed a number of instances in which F and f go together, and no instances to the contrary. These instances may have other common properties as well; the sum-total of their common properties is called the total positive analogy, and the sum-total of their known common qualities is called the known positive analogy. The properties belonging to some but not to all of the instances in question are called the negative analogy: all of them constitute the total negative analogy; all those that are known constitute the known negative analogy. To strengthen an induction, we want to diminish the positive analogy to the utmost possible extent; this, according to Mr. Keynes, is why numerous instances are useful.
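These definitions admit a simple set-theoretic restatement: the known positive analogy is the intersection of the instances' known property-sets, and the known negative analogy is the remainder of their union. A minimal sketch, with the instances and their properties invented for illustration (the total analogy, of course, cannot be computed, since it includes properties we have not observed):

```python
# Sketch of the "analogy" terminology, with instances and properties
# invented for illustration. Only the *known* analogy is computable;
# the total analogy includes properties we have not observed.

instances = [
    {"F", "f", "C"},        # hypothetical instance 1
    {"F", "f", "C", "D"},   # hypothetical instance 2
    {"F", "f", "E"},        # hypothetical instance 3
]

# Known positive analogy: qualities common to every instance.
known_positive = set.intersection(*instances)   # {'F', 'f'}

# Known negative analogy: qualities belonging to some but not all.
known_negative = set.union(*instances) - known_positive   # {'C', 'D', 'E'}

print(known_positive, known_negative)
```

In this example the induction is as strong as the instances allow: nothing beyond F and f is common to them all.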

On “pure” induction, where we rely solely upon the number of instances, without knowing how they affect the analogy, Mr. Keynes concludes (A Treatise on Probability, p. 236):

“We have shown that if each of the instances necessarily follows from the generalisation, then each additional instance increases the probability of the generalisation, so long as the new instance could not have been predicted with certainty from a knowledge of the former instances.... The common notion, that each successive verification of a doubtful principle strengthens it, is formally proved, therefore, without any appeal to conceptions of law or of causality. But we have not proved that this probability approaches certainty as a limit, or even that our conclusion becomes more likely than not, as the number of verifications or instances is indefinitely increased.”
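The result quoted can be reconstructed as a short Bayesian argument in modern notation (a reconstruction, not Keynes's own symbolism): let g be the generalisation and e_1, ..., e_{n+1} the instances, each of which follows necessarily from g, so that P(e_{n+1} | g, e_1 ... e_n) = 1.

```latex
% Bayes' theorem, with P(e_{n+1} | g, e_1 ... e_n) = 1 because each
% instance follows necessarily from the generalisation g:
\[
P(g \mid e_1 \ldots e_{n+1})
  = \frac{P(e_{n+1} \mid g,\, e_1 \ldots e_n)\; P(g \mid e_1 \ldots e_n)}
         {P(e_{n+1} \mid e_1 \ldots e_n)}
  = \frac{P(g \mid e_1 \ldots e_n)}{P(e_{n+1} \mid e_1 \ldots e_n)} .
\]
% The probability of g therefore strictly increases exactly when
% P(e_{n+1} | e_1 ... e_n) < 1, i.e. when the new instance could not
% have been predicted with certainty from the former instances.
```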

It is obvious that induction is not much use unless, with suitable care, its conclusions can be rendered more likely to be true than false. This problem therefore necessarily occupies Mr. Keynes.

It is found that an induction will approach certainty as a limit if two conditions are fulfilled:

(1) If the generalisation is false, the probability of its being true in a new instance when it has been found to be true in a certain number of instances, however great that number may be, falls short of certainty by a finite amount.

(2) There is a finite a priori probability in favour of our generalisation.
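In the same modern notation (again a reconstruction, with epsilon and eta standing for the two finite amounts), condition (1) says that P(e_{n+1} | not-g, e_1 ... e_n) never exceeds 1 − epsilon, and condition (2) that P(g) is at least eta > 0; the limiting argument may then be sketched thus:

```latex
% Since g entails every instance, P(e_1 ... e_n | g) = 1, while by
% condition (1), P(e_1 ... e_n | not-g) <= (1 - epsilon)^n. Hence
\[
P(g \mid e_1 \ldots e_n)
  = \frac{P(g)}{P(g) + P(e_1 \ldots e_n \mid \lnot g)\, P(\lnot g)}
  \ge \frac{\eta}{\eta + (1 - \varepsilon)^n}
  \;\longrightarrow\; 1 \qquad (n \to \infty).
\]
```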

Mr. Keynes uses “finite” here in a special sense. He holds that not all probabilities are numerically measurable; a “finite” probability is one which exceeds some numerically measurable probability, however small. For example, our generalisation has a finite a priori probability if it is less unlikely than throwing heads a billion times running.
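In symbols, the definition amounts to this (a modern restatement of Keynes's usage, the figure in the example being merely illustrative):

```latex
% P(g) is "finite" in Keynes's sense when it exceeds some numerically
% measurable probability, however small:
\[
\exists\, \varepsilon > 0 \ \text{(numerically measurable)}:\quad
P(g) > \varepsilon,
\qquad\text{e.g.}\quad P(g) > \left(\tfrac{1}{2}\right)^{10^{9}} .
\]
```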

The difficulty is, however, that there is no easily discoverable way of estimating the a priori probability of a generalisation. In examining this question, Mr. Keynes is led to a very interesting postulate which, if true, will, he thinks, give the required finite a priori probability. His postulate, as he states it, is not quite correct; I shall give his form first, and then the necessary modification.