It turns out that the foregoing procedure provides not just one metrization but many, especially when the dimensionality of the space is greater than one. Indeed, this lack of uniqueness is what makes the procedure exceedingly difficult: only by imposing additional conditions that guarantee the existence of a unique solution does the problem become tractable.

We choose to impose the additional condition that the resulting metric space be a Euclidean geometry with a rectangular coordinate system.

Even this does not always yield uniqueness, but after the necessary language is developed we will present the additional restriction that guarantees it. Since all metrizations of a given metrizable topology are isomorphic, the orthogonal Euclidean geometry serves as a convenient representative of the unique element of the quotient class corresponding to a given metrizable topology.

Furthermore, the same comment applies to the use of a Gaussian distribution as the probability distribution on this orthogonal Euclidean geometry. Namely, the Gaussian distribution on an orthogonal Euclidean geometry is a convenient representative member of the equivalence class that maps into a single element (stochastic space) of the quotient class.

Information Theory

Now, we will show that Information Theory provides the language necessary to describe the metrization procedure in detail.

It is possible to introduce Information Theory axiomatically by a suitable generalization of the axioms[16] in Feinstein.[17] But to simplify the discussion here, we will use the less elegant but equivalent method of defining certain definite integrals. The probability density distribution p is defined from the cumulative probability distribution P by

P(X′) = ∫_{X′} p(x) dx,  for measurable X′ ⊂ X.  (1)
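As a minimal numerical sketch of Eq. (1), assuming for illustration that p is a standard Gaussian density (a choice of ours, not fixed by the text), the cumulative probability of a measurable subset X′ = [a, b] can be approximated by a midpoint Riemann sum:

```python
import math

def p(x, sigma=1.0):
    # Illustrative density: standard Gaussian with standard deviation sigma
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def P(a, b, n=100000):
    # Eq. (1): P(X') = integral of p over the measurable subset X' = [a, b],
    # approximated here by a midpoint Riemann sum with n panels
    h = (b - a) / n
    return sum(p(a + (i + 0.5) * h) for i in range(n)) * h

# For a standard Gaussian, P([-1, 1]) should be close to 0.6827
```

For the standard Gaussian the exact value of P([−1, 1]) is erf(1/√2) ≈ 0.6827, which the sum reproduces to several digits.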

Then the information rate H is defined as

H(X) = −∫_X p(x) ln[κ p(x)] dx.  (2)
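Equation (2) can likewise be sketched numerically. The example below, again assuming a standard Gaussian density (our illustrative choice) and κ = 1, approximates H(X) by a midpoint Riemann sum; with κ = 1 this reduces to the differential entropy, which for a Gaussian equals ½ ln(2πeσ²):

```python
import math

def gaussian_p(x, sigma=1.0):
    # Illustrative density: standard Gaussian with standard deviation sigma
    return math.exp(-x * x / (2.0 * sigma * sigma)) / (sigma * math.sqrt(2.0 * math.pi))

def H(p, kappa=1.0, lo=-20.0, hi=20.0, n=400000):
    # Eq. (2): H(X) = -integral of p(x) * ln(kappa * p(x)) dx over X,
    # approximated by a midpoint Riemann sum on the interval [lo, hi]
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        px = p(x)
        if px > 0.0:  # skip points where the density underflows to zero
            total += px * math.log(kappa * px)
    return -total * h

# With kappa = 1 and sigma = 1, H should approach
# 0.5 * ln(2 * pi * e) ≈ 1.4189 nats.
```

The constant κ fixes the units in which p is measured inside the logarithm; changing κ shifts H by ln κ, since ∫ p(x) dx = 1.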