SOME CONTEMPORARY CONCEPTS
Since we are attempting to duplicate processes other than chemical, per se, we will forgo any reference to the extensive literature of neurochemistry. It should not be surprising, though, if it proves possible to imitate some learning mechanisms with grossly less complex molecular structures, even while neglecting the fundamental biological processes of growth, reproduction and metabolism. There is also much talk of chemical versus electrical theories and mechanisms in neurophysiology. The distinction, when it can be made, seems to hinge on the scale of the significant interactions: “chemical” interactions presumably take place at molecular distances, possibly as a result of, or subsequent to, a certain amount of thermal diffusion, whereas “electrical” interactions are generally understood to imply longer range, larger scale macroscopic fields.
1. Cellular Structure
The human brain contains approximately 10¹⁰ neurons, to which the neuron theory assigns the primary role in central nervous activity. These cells occupy, however, a relatively small fraction of the total volume; there are, for example, approximately 10 times that number of neuroglia, cells of relatively indeterminate function. Each neuron (consisting of cell body, dendrites and, sometimes, an axon) comes into close contact with the dendrites of other neurons at some thousands of places, these synapses and “ephapses” being spaced approximately 5μ apart [(1)]. The total number of such apparent junctions is therefore of the order of 10¹³. In spite of infinite fine-structure variations, the cellular structure of the brain, when viewed with slightly blurred vision, is remarkably homogeneous. In the cortex, at least, the extensions of most cells are relatively short, and when the cortex is at rest, the large EEG alpha-rhythms suggest that large numbers of cells beat together in unison. Quoting again from Sperry, “In short, current brain theory encourages us to try to correlate our subjective psychic experience with the activity of relatively homogeneous nerve cell units conducting essentially homogeneous impulses, through roughly homogeneous cerebral tissue.”
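The junction estimate above is a one-line order-of-magnitude calculation; a minimal sketch (taking “some thousands” of contacts per neuron as 10³):

```python
# Order-of-magnitude check of the junction count quoted above.
neurons = 1e10             # approximate neuron count in the human brain
contacts_per_neuron = 1e3  # "some thousands" of synaptic contacts per neuron
junctions = neurons * contacts_per_neuron
print(f"{junctions:.0e}")  # of the order of 10^13 apparent junctions
```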
2. Short-Term Memory
A train of impulses travelling on a long fiber may, for example, be regarded as a short-term memory, much as a delay line acts as a transient memory in a computer. A similar but slightly longer term memory may also be thought to exist in the form of waves circulating in closed loops [(23)]. In fact, it is almost universally held today that most significant memory occurs in two basic interrelated ways: first, a short-term circulating, reverberatory or regenerative memory, which could not conceivably persist through such things as coma, anesthesia, concussion, extreme cold, deep sleep and convulsive seizures; and second, a long-term memory trace which must somehow reside in a semipermanent fine-structural change. As [Hebb (9)] stated, “A reverberatory trace might cooperate with a structural change and carry the memory until the growth change is made.”
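The delay-line analogy can be made concrete with a toy sketch (illustrative only; the class name and loop length are arbitrary assumptions): a pulse pattern persists only so long as it circulates, and any interruption of circulation erases it, just as the reverberatory trace could not survive coma or concussion.

```python
from collections import deque

class CirculatingMemory:
    """Toy model of a closed-loop (reverberatory) short-term memory."""

    def __init__(self, length):
        self.loop = deque([0] * length, maxlen=length)

    def write(self, pattern):
        for pulse in pattern:
            self.loop.append(pulse)  # inject pulses into the loop

    def step(self):
        self.loop.rotate(1)  # pulses travel once around the closed loop

    def interrupt(self):
        # Coma, anesthesia, concussion, etc.: circulation stops and the
        # transient trace is lost; only structural change could survive.
        self.loop = deque([0] * self.loop.maxlen, maxlen=self.loop.maxlen)

mem = CirculatingMemory(8)
mem.write([1, 0, 1])
for _ in range(8):       # a full circulation returns the same pattern
    mem.step()
print(list(mem.loop))    # the trace persists while circulation continues
mem.interrupt()
print(list(mem.loop))    # all zeros: the short-term trace is gone
```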
3. The Synapse
The current most highly regarded specific conception of the synapse is largely due to, and has been best described by, [Eccles (5)]: “ ... the synaptic connections between nerve cells are the only functional connections of any significance. These synapses are of two types, excitatory and inhibitory, the former type tending to make nerve cells discharge impulses, the other to suppress the discharge. There is now convincing evidence that in vertebrate synapses each type operates through specific chemical transmitter substances ...”. In response to a presentation by [Hebb (10)], Eccles was quoted as saying, “One final point, and that is if there is electrical interaction, and we have seen from Dr. Estable’s work the complexity of connections, and we now know from the electron microscopists that there is no free space, only 200 Å clefts, everywhere in the central nervous system, then everything should be electrically interacted with everything else. I think this is only electrical background noise and that, when we lift specific chemical connections above that noise, we get a significant operational system. I would say that there is electrical interaction but it is just a noise, a nuisance.” Eccles’ conclusions are primarily based on data obtained in the peripheral nervous system and the spinal cord, but there is overwhelming reason to expect that cellular interactions in the brain are an entirely different affair. For example, “The highest centres in the octopus, as in vertebrates and arthropods, contain many small neurons. This finding is such a commonplace that we have perhaps failed in the past to make the fullest inquiry into its implications. Many of these small cells possess numerous processes, but no axon. It is difficult to see, therefore, that their function can be conductive in the ordinary sense.
Most of our ideas about nervous functioning are based on the assumption that each neuron acts essentially as a link in some chain of conduction, but there is really no warrant for this in the case of cells with many short branches. Until we know more of the relations of these processes to each other in the neuropile it would be unwise to say more. It is possible that the effective part of the discharge of such cells is not as it is in conduction in long pathways, the internal circuit that returns through the same fiber, but the external circuit that enters other processes, ...” [(3)].
4. Inhibition
The inhibitory chemical transmitter substance postulated by Eccles has never been detected, in spite of numerous efforts to do so. The mechanism (or mechanisms) of inhibition is perhaps the key to the question of cellular interaction and, in one form or another, must be accounted for in any adequate theory.
Other rather specific forms of excitation and inhibition interaction have been proposed at one time or another. Perhaps the best example is the polar neuron of [Gesell (8)] and, more recently, [Retzlaff (18)]. In such a concept, excitatory and inhibitory couplings differ basically because of a macroscopic structural difference at the cellular level; that is, various arrangements or orientations of intimate cellular structures give rise to either excitation or inhibition.
5. Long-Term Memory
Most modern theories of semipermanent structural change (or engrams, as they are sometimes called) look either to the molecular level or to the cellular level. Various specific locales for the engram have been suggested, including (1) modifications of RNA molecular structure, (2) changes of cell size, synapse area or dendrite extensions, (3) neuropile modification, and (4) local changes in the cell membrane. There is, in fact, rather direct evidence of the growth of neurons or their dendrites with use and the diminution or atrophy of dendrites with disuse. The apical dendrite of pyramidal neurons becomes thicker and more twisted with continuing activity; nerve fibers swell when active, sprout additional branches (at least in the spinal cord) and presumably increase the size and number of their terminal knobs. As pointed out by [Konorski (11)], the morphological conception of plasticity, according to which plastic changes would be related to the formation and multiplication of new synaptic junctions, goes back at least as far as Ramon y Cajal in 1904. Whatever the substrate of the memory trace, it is, at least in adults, remarkably immune to extensive brain damage, and as [Young (24)] has said: “ ... this question of the nature of the memory trace is one of the most obscure and disputed in the whole of biology.”
6. Field Effects and Learning
First, from [Boycott and Young (3)], “The current conception, on which most discussions of learning still concentrate, is that the nervous system consists essentially of an aggregate of chains of conductors, linked at key points by synapses. This reflex conception, springing probably from Cartesian theory and method, has no doubt proved of outstanding value in helping us to analyse the actions of the spinal cord, but it can be argued that it has actually obstructed the development of understanding of cerebral function.”
Most observable evidence of learning and memory is extremely complex, and its interpretation is full of traps. Learning in its broadest sense might be detected as a semipermanent change of behavior pattern brought about as a result of experience. Within that kind of definition we can surely identify several distinctly different types of learning, presumably with distinctly different kinds of mechanisms associated with each. But if we stick by semipermanent change of behavior as the criterion for learning, then we may also be misled into considering the development of a neurosis, for example, as learning, or even a deep coma as learning.
When we come to consider field effects, current theories tend to get fairly obscure, but there seems to be an almost universal recognition of the fact that such fields are significant. For example, [Morrell (16)] says in his review of electrophysiological contributions to the neural basis of learning, “A growing body of knowledge (see reviews by Purpura, Grundfest, and Bishop) suggests that the most significant integrative work of the central nervous system is carried on in graded response elements—elements in which the degree of reaction depends upon stimulus intensity and is not all-or-none, which have no refractory period and in which continuously varying potential changes of either sign occur and mix and algebraically sum.” [Gerard (7)] also makes a number of general comments along these lines: “These attributes of a given cell are, in turn, normally controlled by impulses arising from other regions, by fields surrounding them, both electric and chemical. Electric and chemical fields can strongly influence the interaction of neurones. This has been amply expounded in the case of the electric fields.”
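The contrast Morrell draws can be put in a few lines of code (an illustration only, not a model from the source; the function names and threshold value are arbitrary assumptions):

```python
def all_or_none(inputs, threshold=1.0):
    """Classical impulse logic: the unit fires fully or not at all."""
    return 1 if sum(inputs) >= threshold else 0

def graded_response(inputs):
    """Graded element: continuously varying potentials of either sign
    mix and sum algebraically; no threshold, no refractory period."""
    return sum(inputs)

signals = [0.5, -0.25, 0.25]     # excitatory (+) and inhibitory (-) inputs
print(all_or_none(signals))      # below threshold: no output at all
print(graded_response(signals))  # the intermediate intensity is preserved
```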
Learning situations involving “punishment” and “reward” or, subjectively, “pain” and “pleasure” may very likely be associated with transient but structurally widespread field effects. States of distress and of success seem to exert a lasting influence on behavior only in relation to simultaneous sensory events or, better yet, sensory events immediately preceding in time; the “anticipatory” nature of a conditioned reflex, for example, has been widely noted [(21)]. From a structural point of view, it is as if recently active sites, regardless of location or function, were especially sensitive to extensive fields. There is a known inherent electrical property of both the nerve membrane and the passive iron surface that could hold the answer to this mechanism of spatially diffuse temporal association: the surface resistance drops to less than 1 per cent of its resting value during the refractory period which immediately follows activation.
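The suggested gating mechanism, in which low surface resistance during the refractory period lets a widespread field act selectively on recently active sites, can be sketched as follows; all names and numerical values here are illustrative assumptions, not from the source:

```python
REFRACTORY_WINDOW = 2   # time steps of low resistance after activation
LOW_RESISTANCE = 0.01   # resistance drops below 1% of its resting value
RESTING_RESISTANCE = 1.0

class Site:
    """A structural site whose sensitivity to a diffuse field depends on
    how recently it was itself active."""

    def __init__(self):
        self.last_active = None  # time of most recent activation
        self.trace = 0.0         # accumulated semipermanent change

    def activate(self, now):
        self.last_active = now

    def resistance(self, now):
        recently_active = (self.last_active is not None
                           and now - self.last_active <= REFRACTORY_WINDOW)
        return LOW_RESISTANCE if recently_active else RESTING_RESISTANCE

def apply_field(sites, now, strength):
    # A transient, structurally widespread "reward/punishment" field drives
    # the largest currents through recently active (low-resistance) sites,
    # regardless of their location or function.
    for site in sites:
        site.trace += strength / site.resistance(now)

recent, idle = Site(), Site()
recent.activate(now=9)           # active just before the field event
apply_field([recent, idle], now=10, strength=1.0)
print(recent.trace, idle.trace)  # the recently active site changes far more
```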