Figure 3—Adaption of the memory at -6 db SNR: (a) Blank initial memory; (b) Memory after first dump; (c) Memory after 12 dumps; (d) Memory after 40 dumps; (e) Perfect “checkerboard” memory for comparison

As the machine memory adapts to this noisy input signal, it progresses as shown in [Figure 3]. The signs of the 10³ memory components are displayed in a raster pattern in this figure. [Figure 3a] shows the memory in its blank initial state at the start of the adaption process. [Figure 3b] shows the memory after its first adaption. This first “dump” occurred after the threshold had decayed to the point where an energy measurement produced an acceptance decision. [Figure 3c] and [Figure 3d] show the memory after 12 and 40 adaptions, respectively. These later dumps, of course, are based on both energy and cross-correlation measurements. As can be seen, the adapted memory after 40 dumps is already quite close to the perfect memory shown by the “checkerboard” pattern of [Figure 3e].
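The dump process described above — a threshold that decays until an input is accepted, a first dump gated by energy alone, and later dumps gated by both energy and cross-correlation — can be sketched as follows. This is a minimal illustration only: the decay rule, the mixing update, and all parameter values are assumptions, not the actual machine design.

```python
import numpy as np

def adapt_memory(inputs, init_threshold=10.0, decay=0.99, mix=0.1):
    """Sketch of dump-style memory adaption. The decaying acceptance
    threshold, the energy and cross-correlation tests, and all
    parameter values are illustrative assumptions."""
    memory = np.zeros(inputs.shape[1])          # blank initial memory (Fig. 3a)
    threshold = init_threshold
    dumps = 0
    for x in inputs:
        energy = float(np.dot(x, x))            # energy measurement
        corr = float(np.dot(memory, x))         # cross-correlation with memory
        # The first dump rests on energy alone; later dumps also require
        # positive correlation with the stored pattern.
        accepted = energy > threshold and (dumps == 0 or corr > 0.0)
        if accepted:
            memory = (1.0 - mix) * memory + mix * x   # "dump" into memory
            dumps += 1
            threshold = init_threshold          # re-arm the threshold
        else:
            threshold *= decay                  # threshold decays until acceptance
    return memory, dumps
```

Run on a repeated ±1 pattern buried in noise, such a loop drives the signs of the memory components toward the pattern, much as the raster displays in Figure 3 progress toward the checkerboard.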

The detailed analysis of the performance of this type of machine vs. signal-to-noise ratio, average signal repetition rate, signal duration, and machine parameters is extremely complex. Therefore, it is not appropriate here to detail the results of the analytical and experimental work on the performance of this machine. However, several conclusions of a general nature can be stated.

(a) Because the machine memory is always adapting, there is a relatively high penalty for “false alarms.” False alarms can destroy a perfect memory. Hence, the threshold level needs to be set appropriately high for the memory adaption. If one wishes to detect signal occurrences with more tolerance to false alarms, a separate comparator and threshold level should be used.

(b) The present machine structure, which allows for slowly varying changes in the signal waveshape, exhibits a marked threshold effect in steady-state performance at an input signal-to-noise ratio (peak signal power to average noise power) of about -12 db. Below this signal level, the time required for convergence increases very rapidly with decreasing signal level. At higher SNR, convergence to noise-like signals having good auto-correlation properties occurs at a satisfactory rate.
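The separate-comparator arrangement suggested in conclusion (a) can be sketched as two independent thresholds applied to one decision statistic. The normalized-correlation statistic and both threshold values here are illustrative assumptions, not the machine's actual comparator design.

```python
import numpy as np

def comparator(x, memory, adapt_threshold=0.9, detect_threshold=0.5):
    """Sketch of a separate comparator: a high threshold gates memory
    adaption (false alarms can destroy a perfect memory), while a
    lower, independent threshold reports signal detections. The
    statistic and threshold values are illustrative assumptions."""
    stat = float(np.dot(memory, x)) / (
        np.linalg.norm(memory) * np.linalg.norm(x) + 1e-12)
    return {
        "adapt": stat > adapt_threshold,    # conservative: protects the memory
        "detect": stat > detect_threshold,  # tolerant: flags signal occurrences
    }
```

A noisy input can then fall between the two thresholds: it is reported as a detection but is not allowed to alter the memory.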

A more detailed discussion of performance has been published in the report cited in footnote reference 1.

Conceptual Design of Self-Organizing Machines

P. A. Kleyn

Northrop Nortronics
Systems Support Department
Anaheim, California

Self-organization is defined and several examples which motivate this definition are presented. The significance of this definition is explored by comparison with the metrization problem discussed in the companion paper [(1)] and it is seen that self-organization requires decomposing the space representing the environment. In the absence of a priori knowledge of the environment, the self-organizing machine must resort to a sequence of projections on unit spheres to effect this decomposition. Such a sequence of projections can be provided by repeated use of a nilpotent projection operator (NPO). An analog computer mechanization of one such NPO is discussed and the signal processing behavior of the NPO is presented in detail using the Euclidean geometrical representation of the metrizable topology provided in the companion paper. Self-organizing systems using multiple NPO’s are discussed and current areas of research are identified.
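The NPO itself is defined in the companion paper and is not reproduced here; purely as a geometric illustration of the unit-sphere projections the abstract mentions, radial projection of a nonzero vector onto the unit sphere is normalization:

```python
import numpy as np

def unit_sphere_projection(x, eps=1e-12):
    """Radial projection of a nonzero vector onto the unit sphere.
    A geometric illustration only of the unit-sphere projections
    mentioned in the abstract; the nilpotent projection operator
    itself is defined in the cited companion paper."""
    n = np.linalg.norm(x)
    if n < eps:
        raise ValueError("the zero vector has no unit-sphere projection")
    return x / n
```

Applying the map a second time leaves the vector fixed, so repeated projections of this kind are idempotent on nonzero vectors.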