THE ADAPTIVE DETECTION MACHINE

The purpose of this note is to describe very briefly a machine[2] which has been implemented to recover a noise-perturbed binary waveform. A simplified block diagram of the machine is shown in [Figure 1]. The experimental machine has been designed to operate on signals of up to 10³ samples in duration.

Each analog input sample enters the machine at left and may contain either a signal sample plus noise or noise alone. In order to permit digital operation in the machine, the samples are quantized in a symmetrical three-level quantizer. The samples are then converted to vector form; i.e., the previous 10³ samples form the vector components. A new input vector, Y⁽ⁱ⁾, is formed at each sample instant.
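As an illustrative sketch of these two steps (the dead-zone threshold, function names, and example values are assumptions, not the machine's actual parameters), the quantization and vector-forming operations might look like:

```python
import numpy as np

def quantize3(x, t):
    """Symmetric three-level quantizer: +1, 0, or -1, with dead zone |x| <= t."""
    return np.where(x > t, 1, np.where(x < -t, -1, 0))

def input_vector(history, n):
    """Form the observation vector Y from the n most recent quantized samples."""
    return np.asarray(history[-n:])

# Example: quantize four analog samples with an assumed dead-zone threshold of 0.5
samples = quantize3(np.array([0.9, -0.2, -1.3, 0.4]), 0.5)
print(samples.tolist())  # [1, 0, -1, 0]
```

In the experimental machine n would be 10³, so each new sample shifts the window forward by one position.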

Define the signal sample values as s₁, s₂, ..., sₙ. The observed vector Y⁽ⁱ⁾ is then either (a) perfectly centered signal plus noise, (b) shifted signal plus noise, or (c) noise alone.

(Y⁽ⁱ⁾)ᵗ = (s₁, s₂, ..., sₙ) + (n₁, n₂, ..., nₙ)               (a)
(Y⁽ⁱ⁾)ᵗ = (0, ..., 0, s₁, s₂, ..., sₙ₋ⱼ) + (n₁, n₂, ..., nₙ)  (b)
(Y⁽ⁱ⁾)ᵗ = (0, ..., 0) + (n₁, n₂, ..., nₙ)                     (c)

At each sample instant, two measurements are made on the input vector: an energy measurement ‖Y⁽ⁱ⁾‖² and a polarity-coincidence cross-correlation with the present estimate of the signal vector stored in memory. If the weighted sum of the energy and cross-correlation measurements exceeds the present threshold value Γᵢ, the input vector is accepted as containing the signal (properly shifted in time) and is added to the memory. The adaptive memory has 2^Q levels: 2^(Q−1) positive levels, 1 zero level, and 2^(Q−1)−1 negative levels. New contributions are made to the memory by normal vector addition, except that saturation occurs when a component value is at the maximum or minimum level.
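A minimal sketch of the acceptance test and the saturating memory update might look like the following, assuming Q = 4 bits per memory component; the function names and the weighting α are illustrative, not the machine's actual parameters:

```python
import numpy as np

Q = 4                            # assumed bits per memory component
MAX_LEVEL = 2**(Q - 1)           # top of the 2^(Q-1) positive levels
MIN_LEVEL = -(2**(Q - 1) - 1)    # bottom of the 2^(Q-1)-1 negative levels

def decide(y, m, alpha, threshold):
    """Weighted sum of polarity-coincidence correlation and energy vs. threshold."""
    energy = float(np.dot(y, y))                # ||Y||^2 of the quantized samples
    correlation = float(np.dot(y, np.sign(m)))  # polarity-coincidence cross-correlation
    return correlation + alpha * energy >= threshold

def dump(memory, y):
    """Add an accepted input vector to the memory, saturating at the end levels."""
    return np.clip(memory + y, MIN_LEVEL, MAX_LEVEL)
```

The clipping in `dump` models the saturation behavior: a component already at the maximum or minimum level is left there rather than wrapping around.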

The acceptance or rejection of a given input vector is based on a hypersphere decision boundary. The input vector is accepted if the weighted sum γᵢ equals or exceeds the threshold Γᵢ:

γᵢ = Y⁽ⁱ⁾∙M⁽ⁱ⁾ + α‖Y⁽ⁱ⁾‖² ⩾ Γᵢ.

Figure 1—Block diagram of the adaptive binary waveform detector

Geometrically, we see that the input vector is accepted if it falls on or outside of a hypersphere centered at

C⁽ⁱ⁾ = −M⁽ⁱ⁾/2α

having radius squared

[r⁽ⁱ⁾]² = Γᵢ/α + ‖M⁽ⁱ⁾‖²/(2α)².

Both the center and radius of this hypersphere change as the machine adapts. The performance and optimality of hypersphere-type decision boundaries have been discussed in related work by Glaser[3] and Cooper.[4]
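The equivalence of the weighted-sum test and the hypersphere boundary follows by completing the square: Y·M + α‖Y‖² ≥ Γ if and only if ‖Y − (−M/2α)‖² ≥ Γ/α + ‖M‖²/(2α)². This can be checked numerically, as a sketch with arbitrary illustrative values of α, Γᵢ, and M⁽ⁱ⁾:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, Gamma = 0.5, 3.0                        # illustrative machine parameters
M = rng.integers(-3, 4, size=8).astype(float)  # stored memory vector

C = -M / (2 * alpha)                           # hypersphere center
r2 = Gamma / alpha + (M @ M) / (2 * alpha)**2  # radius squared

for _ in range(100):
    Y = rng.integers(-1, 2, size=8).astype(float)  # three-level input vector
    weighted_sum_test = Y @ M + alpha * (Y @ Y) >= Gamma
    hypersphere_test = np.dot(Y - C, Y - C) >= r2  # on or outside the sphere
    assert weighted_sum_test == hypersphere_test
```

The two tests agree for every trial vector, confirming that the machine's scalar comparison implements a hypersphere decision boundary.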

The threshold value, Γᵢ, is adapted so that it increases as the memory becomes a better replica of the signal, with the result that γᵢ also increases. On the other hand, if the memory is a poor replica of the signal (for example, if it contains noise alone), it is necessary that the threshold decay with time to the point where additional acceptances can modify the memory structure.
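The note does not specify the adaptation rule in detail. One plausible sketch, in which the decay factor and step size are assumptions rather than the machine's actual values, is:

```python
def update_threshold(gamma_threshold, accepted, gamma, decay=0.999, step=0.1):
    """On an acceptance, raise the threshold toward the achieved weighted sum
    gamma; otherwise let it decay slowly, so that a poor (or blank) memory
    eventually admits new acceptances that can restructure it."""
    if accepted:
        return gamma_threshold + step * (gamma - gamma_threshold)
    return decay * gamma_threshold
```

Any rule with these two properties, growth on good matches and decay between acceptances, reproduces the qualitative behavior described above.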

The experimental machine is entirely digital in operation and, as stated above, is capable of recovering waveforms of up to 10³ samples in duration. In a typical experiment, one might attempt to recover an unknown noise-perturbed, pseudo-random waveform of up to 10³ bits duration which occurs at random intervals. If no information is available as to the signal waveshape, the adaptive memory is blank at the start of the experiment.

In order to illustrate the operation of the machine most clearly, let us consider a repetitive binary waveform which is composed of 10³ bits of alternate “zeros” and “ones.” A portion of this waveform is shown in [Figure 2a]. The waveform actually observed is a noise-perturbed version of this waveform, shown in [Figure 2b] at a -6 db signal-to-noise ratio. The exact sign of each of the signal bits obviously could not be accurately determined by direct observation of [Figure 2b].
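A test input of this kind can be sketched as follows, assuming unit-amplitude bits and white Gaussian noise, and taking the -6 db figure as the ratio of peak signal power to average noise power (the definition used later in this note); the function name and seed are illustrative:

```python
import numpy as np

def noisy_alternating(n_bits=1000, snr_db=-6.0, seed=0):
    """Alternating +1/-1 bit pattern plus white Gaussian noise, scaled so the
    peak-signal-power-to-average-noise-power ratio equals snr_db (in db)."""
    signal = np.where(np.arange(n_bits) % 2 == 0, 1.0, -1.0)
    noise_power = 1.0 / 10**(snr_db / 10.0)   # peak signal power is 1
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, np.sqrt(noise_power), n_bits)
    return signal, signal + noise
```

At -6 db the noise variance is about 4, so individual bit signs are largely obscured in the observed waveform, as in [Figure 2b].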

Figure 2—Binary signal with additive noise at -6 db SNR: (a) binary signal; (b) binary signal plus noise


Figure 3—Adaption of the memory at -6 db SNR: (a) Blank initial memory; (b) Memory after first dump; (c) Memory after 12 dumps; (d) Memory after 40 dumps; (e) Perfect “checkerboard” memory for comparison

As the machine memory adapts to this noisy input signal, it progresses as shown in [Figure 3]. The signs of the 10³ memory components are displayed in a raster pattern in this figure. [Figure 3a] shows the memory in its blank initial state at the start of the adaption process. [Figure 3b] shows the memory after the first adaption. This first “dump” occurred after the threshold had decayed to the point where an energy measurement alone produced an acceptance decision. [Figure 3c] and [Figure 3d] show the memory after 12 and 40 adaptions, respectively. These dumps, of course, are based on both energy and cross-correlation measurements. As can be seen, the adapted memory after 40 dumps is already quite close to the perfect memory shown by the “checkerboard” pattern of [Figure 3e].

The detailed analysis of the performance of this type of machine vs. signal-to-noise ratio, average signal repetition rate, signal duration, and machine parameters is extremely complex. Therefore, it is not appropriate here to detail the results of the analytical and experimental work on the performance of this machine. However, several conclusions of a general nature can be stated.

(a) Because the machine memory is always adapting, there is a relatively high penalty for “false alarms.” False alarms can destroy a perfect memory. Hence, the threshold level needs to be set appropriately high for the memory adaption. If one wishes to detect signal occurrences with more tolerance to false alarms, a separate comparator and threshold level should be used.

(b) The present machine structure, which allows for slowly varying changes in the signal waveshape, exhibits a marked threshold effect in steady-state performance at an input signal-to-noise ratio (peak signal power-to-average noise power ratio) of about -12 db. Below this signal level, the time required for convergence increases very rapidly with decreasing signal level. At higher SNR, convergence for noise-like signals having good auto-correlation properties occurs at a satisfactory rate.

A more detailed discussion of performance has been published in the report cited in footnote reference 1.