We can accept that this world of enormously diversified forms of human practice (corresponding to the diversity of human beings) requires more than one type of literacy. But this is not yet a sufficient condition for changing the current premise of education if the avenues of gaining knowledge are not developed. The assumption that language is a higher-level system of signs is probably correct, but not necessarily significant for the inference that in order to function in a society, each member has to master this language. To free ourselves of this inference will take more than the argument founded on the efficiency of illiterate and aliterate individuals who constitute their identity in realms where literacy does not dominate, or has ceased to be entirely necessary.

Learning from the experience of interface

The exciting adventure of artificially replicating human characteristics and functions is probably as old as the awareness of self and others. Harnessing tools and machines in order to maximize the efficiency of praxis was always an experience in language use and craftsmanship. So far, the most challenging experience has been the use of computers to replicate the ability to calculate, process words and images, control production lines, interpret very complex data, and even to simulate aspects of human thinking.

Programming languages serve as mediating entities. Using a limited vocabulary and very precise logic, they translate sequences of operations that programmers assume need to be executed in order to successfully compute numbers, process words, operate on images, and even carry out the logical operations for playing chess and beating a human opponent at the game. A programming language is a translation of a goal into a description of the logical processes through which the goal can be achieved. Computer users do not deal with the programming language; they address the computer through the language of interface: words in plain English (or any other language for which the interface is designed), or images standing for desired goals or operations. The machine itself does not speak or understand an interface's high-level language. The user's interaction with the machine is translated by interface programs into whatever the machine can process. Providing efficient interfaces is probably as important as designing high-level abstract programming languages and writing programs in those languages. Without such interfaces, only a limited number of people could involve themselves in computing.

The experience of interface design can help us understand the direction of change to which the new pragmatics commits us. At the end of the road, the computer should physically disappear from our desks. All that will be needed is access to digital processing, not to the digital engine. The same was true of electricity: once generated at the homes or workplaces of the people who needed it, it is now made available through distribution networks.
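The layering described above, a goal-level vocabulary for the user and a translation into machine-level operations underneath, can be sketched in a few lines of Python. All names here are hypothetical, invented purely for illustration, not taken from any actual system:

```python
# Hypothetical sketch of interface layering: the user addresses the
# machine through a small vocabulary of plain words; an interface layer
# translates each word into the low-level operations the machine
# actually executes.

def _open_buffer():
    # low-level operation: what "the machine can process"
    return []

def _append(buffer, data):
    # low-level operation: store one piece of data
    buffer.append(data)

def _flush(buffer):
    # low-level operation: commit the stored data
    return "".join(buffer)

def save(text):
    # the interface translates the plain-English goal "save" into a
    # sequence of machine-level operations the user never sees
    buffer = _open_buffer()
    _append(buffer, text)
    return _flush(buffer)

# The user's vocabulary: words standing for goals, not for processes.
INTERFACE = {"save": save}

result = INTERFACE["save"]("a note")
print(result)  # prints "a note"
```

The user utters only "save"; the decomposition into opening, appending, and flushing stays hidden, which is precisely what allows people who are not programmers to involve themselves in computing.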

Natural language accomplished the function of interface long before the notion came into existence. Literacy was to be the permanent interface of human practical experiences, a unifying factor in the relation between the individual and society. Ideally, an interface should not affect the way people constitute themselves; that is, it should be neutral with respect to their identity. This means that people can change and tasks can vary. The interface would account for the change and would accommodate new goals. Even in their wildest dreams, computer scientists and researchers in cognitive science and artificial intelligence, who work with intelligent interfaces, do not anticipate such a living interface. Interfaces affect the nature of practical experiences in computing. As these become more complex, a breakdown occurs because interfaces do not scale up. Instead of supporting better interactions, an interface can hamper them and affect the outcome of computing. Language has performed quite well under the pressure of scaling up. It grows with each new human practical experience and can adapt to a variety of tasks because the people constituted in language adapt. In the intimate relation between humans and their language, language limits new experiences by subjecting them to expectations of coherence. Language's expressive and communicative potential reaches its climax as the pragmatics that made it possible and necessary exhausts its own potential for efficiency. Literate language no longer enhances human abilities in practical experiences outside its pragmatic domain. Literacy ends up limiting the scope of experience to its own, and thus limits human growth.

Many impressive human accomplishments, probably the majority of them, are testimony to the powerful interface that literate language is. But these accomplishments are equal testimony to what occurs when the interface constitutes its own domain of motivations, or is applied as an instrument for pursuing goals that result in a forced uniformity of experiences. If literacy had been a neutral mediating entity, it would have scaled up to the new scale of humankind and the corresponding efficiency expectations, once the threshold was reached. Successive forms of religious, scientific, ideological, political, and economic domination are examples of powerful interface mechanisms. To understand this predicament, we can compare the sequence of interfaces connected to the experience of religion to the sequence of computer-user interfaces. Notwithstanding the fundamental differences between these two domains of practical experience, a striking similarity has to be acknowledged. Both start as limited experiences, open to the initiated few, and expand from a reduced sign system of interaction to very rich multimedia environments. From a limited secretive domain to the wide opening afforded by a trivial vocabulary, both evolve as double-headed entities: the language of the initiated individuals interfaced with the language of the individuals progressively integrated in the experience. No one should misconstrue this comparison, which is meant only to illustrate the constitutive nature of the experience of interfacing. We could just as well focus on the experiences of economics, politics, ideology, science, fashion, or, even better, art.

The experience of literacy resulted in some consistency, but also in lost variety. Every language of interaction (interface) that disappeared took with it into oblivion experiences impossible to resuscitate. The relation between the individual and community, once very rich at various levels, grew weaker the more literacy took over. Literacy normalizes this relation, shaping it into a multiple-choice quiz. Information processing techniques applied to literacy-controlled forms of social interaction require even further standardization in order to be efficient. As a result, the individual is rationalized away, and the community becomes a locus for data management instead of a place for human interaction. The process exemplifies what happens when interface takes over and interacts with itself.

The various concerns raised so far only reiterate how important it is to understand the nature of interface processes. But experience gained in computational research on knowledge points to other aspects critical to the relation between the individual and society. Humans constitute themselves in a variety of practical experiences that require alternatives to language. Powerful mathematical notations, diagrams, visualization techniques, acoustics, holography, and virtual space are such alternative means. Non-linear association and cognitive paths, until now embodied in the hypertext structures that we experience on the World Wide Web, belong to this category, too. Processing language is not equivalent to integrating these alternative means.

Cognitive requirements put severe restrictions on experiences grounded in means different from language, on account of the intensity and nature of the cognitive processes involved, as well as the demands on memory. The genetic endowment formed in language-based practical experiences of self-constitution is not necessarily adapted to fundamentally different means of expression. Communication requires a shared substratum, which is established in an acculturation process that takes many generations. Enhanced by the new media, communication does not become more precise. Programs are conceived to enable the understanding of language. Everything ever written is scanned and stored for character recognition. Images are translated into short descriptions. A semantic component is attached to everything people compute. Hopes are high for using such means on a routine basis, though the compass might be set on some elusive direction. Even when machines understand what we ask them to do, that is, when speech and handwriting recognition functions are integrated into the operating system, we will still have to articulate our goals. A technology capable of automating many operations that human beings still perform will increase output, and thus the efficiency of the effort applied. But the real challenge is to figure out ways to optimize the relation between what is possible and what is necessary. Procedures that associate output with the many criteria by which humans or machines determine how meaningful that output is matter more than raw technological performance. Until now, literacy has not proven to be the suitable instrument for this goal.

People and language change together. Individuals are formed in language; their practical experiences reshape language and lead to the need for new languages. If we cannot uncouple language and the human being, especially in view of the parallel evolution of genetic endowment and linguistic ability, we will continue to move in the vicious cycle of expression and representation. The issue is not language per se, but the claim that representation is the dominant, one might say exclusive, paradigm of human activity. Neither science nor philosophy has produced an alternative to representation.