CHAPTER 3
REPRESENTING OUR WORLDS:
DIGITAL TRANSLATION
Current digital computer research and innovation is aimed at the front end of the human-computer equation. To ease the human interface, Artificially Intelligent machines have to "perceive" inputs from our point of view by possessing such humanly "intelligent" behaviours as natural language and speech understanding and computer vision. Our consumption of computer technology will move towards their simulation of our cognitive capacities.
As the value of information is assigned by consumers, AI research has to represent the real world in a formal language that reflects human classification of the external material world, and has the capacity to process human values. Interactive technologies within Stratum 2.1 seem to be progressing quite rapidly in developing systems with speech recognition and the flexible use of natural language. The understanding and capability to process human values will have to be left to the endeavours of Hard AI technologies, as reflected in Stratum 2.2, where sophisticated machines are expected to be taught to recognise those values.
Computer programming languages are hierarchies of rules or sub-routines for task performance. Beginning from the 1/0 binary, sub-routines interpret the series of binary codes, their output is processed by further sub-routines, and the layering continues according to the sophistication of the program. Interchangeability in formal language systems within the logical paradigm of digital computers may hold the key to our understanding of our own cognitive capabilities, and to capitalism's exploitation of our potential through digital technology.
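A minimal sketch of this layering, written in Python rather than in any language the chapter names, and with routine names and the eight-bit character grouping assumed purely for illustration, might run as follows: raw 1/0 codes are read by one sub-routine, whose output is handed to a higher-level sub-routine, and so on up the hierarchy.

    def decode_bits(bits):
        """Lowest layer: group raw 1/0 symbols into 8-bit codes."""
        return [int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)]

    def decode_characters(codes):
        """Middle layer: interpret each numeric code as a character."""
        return "".join(chr(c) for c in codes)

    def interpret(text):
        """Top layer: treat the decoded characters as a task for the machine."""
        return "Task received: " + text

    raw = "0100100001001001"                                # the 1/0 stream for 'H' and 'I'
    print(interpret(decode_characters(decode_bits(raw))))   # -> Task received: HI

Each layer knows nothing about the layers above it; sophistication is added by stacking further interpreting routines on top.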
Programming languages such as PROLOG, LISP and POP-11 are procedural or declarative models of representation. Procedural languages such as POP-11 are currently used to construct high-level neural network systems that are capable of learning. Such languages seek to represent the world comprehensively, with a formal system that is flexible enough for the dynamics of reality. Once perfected, this system ideally will be able to hold itself under interrogation. Once represented, the world we perceive is fragmented into the world that we experience through our sensory systems, and the possible worlds as defined by our language capacities.
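The procedural/declarative contrast can be sketched roughly as follows, in Python for illustration only (PROLOG, LISP and POP-11 each have their own syntax and semantics, and the family-relation example and its names are assumptions, not drawn from the text): a declarative representation states what is true and leaves a general mechanism to query it, whereas a procedural one spells out how the answer is computed.

    # Declarative style: facts and a rule state WHAT holds; a query checks it.
    parents = {("tom", "ann"), ("ann", "joe")}          # facts: (parent, child)

    def grandparent(x, z):
        """Rule: x is a grandparent of z if x is a parent of some y who is a parent of z."""
        return any((x, y) in parents and (y, z) in parents
                   for y in {child for _, child in parents})

    # Procedural style: a step-by-step recipe for HOW to find the answer.
    def find_grandchildren(x):
        children = [c for p, c in parents if p == x]
        grandchildren = []
        for y in children:
            grandchildren.extend(c for p, c in parents if p == y)
        return grandchildren

    print(grandparent("tom", "joe"))     # -> True
    print(find_grandchildren("tom"))     # -> ['joe']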
There are a number of factors that determine future interactive computer technologies. Successful human-computer interface depends on the success of digital representations of human rationality. The precision of formal logic and discrete systems is considered to be a major inhibitor of AI developments because of their inability to represent and predict human cognitive patterns.
From the late 1980s, Western Scientific thought seemed to have plateaued in the field of Artificial Intelligence research, having hit the edges of its philosophical boundaries. Japan, limited by the technological output of the West, and needing to globalize in order to sustain progress, has had to reassess its position as a Technologist. Having milked the West dry of its innovations, Japan is left on its own to decide its future as a Technological centre of the world. To progress, Japan has to become a Science and Technology innovator. Some consideration will be given to traditional Japanese philosophies, as they seem to have limited the pursuit of Scientific Knowledge, but at the same time they offer an alternative conceptual framework to work on, one that avoids the pitfalls of Western rational traditions. This is not without its problems, because technological research in Japan did not start from scratch, and depended very much on the adaptation of existing Western technology. The Japanese traditionally are masters of transformation. Such adaptability is found generally in all Japanese artefacts, from motif design to the Transformer toy (a robot that is able to transform into a motor vehicle) or the Teenage Mutant Ninja Turtles. The art of technological application has enormous potential for Japanese innovators, especially when they are no longer dependent on the Western market for the sales of their technology. As the emphasis is now towards digital computer "perception", the Japanese, with their Fifth Generation and TRON projects, may find a niche for themselves. To meet their competitors head on, TRON's multilingual software aims to give Japanese companies, and even other non-English-speaking countries, the advantage of software in Chinese, Korean, Japanese, Arabic, French, Spanish and many other languages (Tatsuno, 1990, p. 158).
The market for interactive computer technologies has also shifted. The gradual economic slide of the U.S. as the largest consumer market, together with the opening up of China and the development of the Asian economies, means that the Americans no longer dictate the demands for technology. The fifth generation computers and other associated technologies need only be sanctioned by the Asian markets. The West is not blind to the pull of capital, hence its aggressive move into Asian technological research and development. The 1990s and the early 21st century will see the forces of both technological and economic giants fighting for supremacy in the Asian region. The most potent applications in the making are natural speech recognition machines, translation machines, multimedia systems and hand-script recognition, all inroads into the fields of Artificial Intelligence.
DUALISM
Computers are Western innovations. Charles Babbage's unlimited "Finite machine" (Sharples et al., 1989) and Alan Turing's "Universal machine" (1950) were the basis for present-day computing technologies. Babbage's analytical machine was designed to analyze a seemingly unlimited number of mathematical problems, whilst Turing's Universal machine, utilizing the 1/0 binary system, is capable of encoding everything into the on/off process. Computer systems and languages such as DOS, PROLOG and LISP are all high-level utilities built upon the 1/0 binary, mathematically manipulating the information-processing mechanism upon a given input to produce a desired output.
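The encoding claim can be made concrete with a toy Turing-style machine, a hedged sketch in Python rather than Turing's own formulation: a finite table of rules reads a tape of 1/0 symbols, rewrites them, and moves left or right until it stops.

    def run(tape, rules, state="start"):
        """Drive a finite rule table over a tape of '0'/'1' symbols."""
        tape, head = list(tape), 0
        while state != "halt" and 0 <= head < len(tape):
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += 1 if move == "R" else -1
        return "".join(tape)

    # Rule table for a machine that inverts every bit and stops at the tape's end.
    invert = {
        ("start", "0"): ("1", "R", "start"),
        ("start", "1"): ("0", "R", "start"),
    }

    print(run("010110", invert))   # -> 101001

However simple, the same scheme - a rule table acting on on/off symbols - underlies the universal machine's claim to encode anything expressible.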
Dualism dominates all digital technologies. With the development of an alphabet system, a logic system emerged within Western thought that distinguishes and classifies the world in the either/or binary system of opposites. The English language system has been argued to be the basis for Western Scientific thought (Winograd and Flores, 1987; Lyotard, 1984; Logan, 1986). The underlying argument is very convincing in identifying the boundaries and scope of Western rationalistic traditions as delineated by the phonetic alphabet system of language. Western dualism, such as the Cartesian mind/body dichotomy, is based on the notion of opposites. In formal logic, the dichotomy is usually stated as "A" or "-A", whereby each element exists to exclude the other. The simultaneous existence of "A" and "-A" is considered a paradox in this instance. Eastern religions such as Zen Buddhism and Shinto consider this dichotomy to be harmonious and mutually transformable. Similar to the Chinese concept of the yin/yang dialectic, this complementary dualism is reflected in the mandala of Japanese dualism (Mitsukuni, 1982, p. 24).
This mandala of dualism is of particular interest to this exploration into interactive computer technology, because it reflects the ideal human-computer relationship that electronic artists are currently striving to achieve in their installations: a relationship where full contact is made along the peripheries of humans and technology, extending and complementing human faculties and desires. This unity also serves as an alternative to the intrusiveness of Haraway's Cybernetic Organism and the violence of Tsukamoto's "Tetsuo" (1989).
Another interesting aspect of the Japanese mandala of dualism is that the world of the human mind is much bigger and envelops the real material world. The purpose of this concentric model is that balance has to be achieved in both the inner and outer worlds, but the way in which such balance is represented may differ from the harmony of the real material world. This rings similar to the concept of structural coupling proposed by Maturana and Varela (1980), where it is proposed that a particular system of representation, such as a language system, may be able to represent the material world accurately by drawing some form of correspondence between signs and objects; the correspondence is achieved through a different structural organization. The language system is organized differently from the real material organization of the world, and yet correlation is achieved.
To overcome the limitations of dualism, the West has looked towards the East for inspiration. The apparent holism of most Eastern philosophies, which points to the relativity of things and develops concepts that focus on relations, may solve the limitations of the binary system. Although Eastern philosophies appear to be antithetical to their Western counterparts, they are not immune to the problem of dualism; only the boundaries are drawn differently, and the relationships between the components are different in nature, as we have seen earlier in the mandala of dualism (Figure 2). Japanese philosophies and technology will be given further consideration later.
Hofstadter (1981, p. 197) made an interesting observation about the different practices of "Hard" scientists versus "Soft" scientists. The current technological race between the USA and Japan ultimately relies on the ability to conceptualize according to the 1/0 binary system and yet also to represent, if not simulate efficiently, the space between discrete units (the analog system) in the most economical way. In this technological climate, the distance between East and West has, in many ways, been bridged by information technology and other communication advances. We are now caught in the smear of both influences. McLuhan and Powers (1989) predicted the outcome of this movement to be the "Easternization" of Western technological research and development and the subsequent "Westernization" of the East. This argument is rather extreme, although it holds true that Japanese society perceives the West with much awe, whilst playing down its own cultural worth. A new Western movement seeking to fragment the binary system is Postmodernism. Although Postmodernism rejects the notion of the discrete whole by examining the position between the components (both quantitative and qualitative) that make up this whole, language research has to examine the multiple dynamics of the binary relationship and the interaction of discourses. The boundaries represented by the discrete symbols (words) are in many ways arbitrary, as they are defined by contextual influences. Early Western philosophical traditions have made it seemingly impossible to break away from the binary system, but postmodernism may be successful in rectifying the problem if McLuhan and Powers (1989) have their way. They see the West moving towards the "acoustic space" of the East, where the multiplicities of non-linear and multi-dimensional cross-scale dynamics dominate perception of the world.
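The pull between the discrete and the analog can also be shown in miniature. The sketch below is a generic sampling-and-quantization example in Python, assumed for illustration and not drawn from Hofstadter or from the projects mentioned: a continuous signal is approximated by a ladder of discrete steps, and the finer the ladder, the more of the "space between units" survives the translation.

    import math

    def quantize(value, levels):
        """Snap a value in [-1, 1] onto one of `levels` evenly spaced discrete steps."""
        step = 2.0 / (levels - 1)
        return round((value + 1.0) / step) * step - 1.0

    signal = [math.sin(2 * math.pi * t / 16) for t in range(16)]   # "analog" waveform
    coarse = [quantize(v, 4) for v in signal]                      # 2-bit representation
    fine = [quantize(v, 256) for v in signal]                      # 8-bit representation

    print(max(abs(a - b) for a, b in zip(signal, coarse)))   # large error
    print(max(abs(a - b) for a, b in zip(signal, fine)))     # far smaller error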
REPRESENTATION
Computers are language systems. Knowledge is represented in 1/0 binaries in the systems by sophisticated computer programs. The Turing machine has enabled infinite universal knowledge to be represented by a finite binary system. Difficulties in representation are problems of the software engineers - problems of perception, of translation and so on. The difficulty is the dynamics of the programmer's conceptual capacities. The role of a program is to describe the things it seeks to represent. Initially, it is tempting to read computer programming as a quest for excavating and modelling universal laws of human nature in line with the Chomskyan (Chomsky, 1974) mathematical theory of mind, where the notion of innate principles controlling and guiding our social, intellectual and individual behaviour had to be unravelled. If these universal laws exist, then a Hard AI proponent will be able to simulate the principles in computer programs to produce the desired humanistic intelligent behaviour. As we have learnt from Foucault (1974) and Lyotard (1984), these formal descriptions are historicized and open to interpretation (often individualized), often contested by theorists and practitioners alike as to the most accurate and efficient way to model that knowledge. They view scientific knowledge as "a kind of discourse" (Lyotard, 1984, p. 3; Foucault, 1974, pp. 140, 160). "Leading" sciences and technologies have concentrated on the study of language and communication, where concepts are dependent on the semantic potential of the language - whether a particular language is capable of perceiving or describing a particular concept or phenomenon. The socialization of that concept is reflected in the way ideas are communicated and translated.
The binary digit in computers, also known as the "bit", forms the basis of information that is devoid of context and of syntactic and semantic content. The bit is just an arbitrary unit, bearing neither matter nor energy (Wiener, 1961, p. 132), whereas information can be measured with energy - it has an energy value. This energy level is often measured for value in the capitalist society, and is used as currency for exchange.
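How information can be counted in bits, independently of what a message means, can be sketched with Shannon's standard entropy measure; this is an illustrative use of Shannon's measure rather than a reconstruction of Wiener's formulation cited above.

    import math
    from collections import Counter

    def bits_of_information(message):
        """Total bits in a message: per-symbol entropy times message length."""
        counts = Counter(message)
        total = len(message)
        entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
        return entropy * total

    print(bits_of_information("0101010101"))   # ~10 bits: two equally likely symbols
    print(bits_of_information("0000000000"))   # 0 bits: no uncertainty, no information

The measure is blind to meaning: a poem and a random string with the same statistical makeup carry the same number of bits.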
The direction of cultural and scientific pursuits in the new order is geared towards representing the seemingly unrepresentable. Paradoxes in Western language systems are perceived as "unitary" by Japanese thought. Concepts and human endeavours are legitimized if they are translatable for machine manipulation and technological consumption. Knowledge is no longer privileged in individual minds but is "exteriorized", gradually divorced from the "training of minds" and commoditized for exchange and commerce, losing its "use-value". The isolation of information as an entity separated it from its agents of transmission (Gardner, 1985, p. 21) by the syntax of reductionism. It follows that information is everywhere. Natural language evolves to incorporate the vocabulary and syntax of computer languages, overhauling the discursive conventions (e.g. technotalk) of computer-based environments, such as the corporation and the technical institution. Linguists have adopted the digital computer as a platform on which to test theories. Anything that is picked up by the human senses or perceived by the human cognitive apparatus is considered to be information - information that can be translated into algorithms that functionally map out the data received. Information is also created, transformed, processed, packaged, bought, sold, shared and owned.
The rationality of mathematics as a precise medium of representation has taken over from the relatively more ambiguous system of natural language. This is because algorithms found in digital computer software are descriptive, mapping out the characteristics and relations of the entities they represent in procedural or declarative languages. Natural language, being analog, is less precise in form and content, tending towards ambiguity and vague boundaries of description, as the sense of its usage is culturally dependent. Programming languages for digital computer systems are theoretically acultural, as they are supposed (also theoretically speaking) to be able to represent anything. Such an argument was proposed (on a wider scope) by proponents of Hard AI, such as Marvin Minsky (Appendix B), who believed that the current limitations of AI technology are only temporary, as more sophisticated modelling techniques and theories will be developed by innovative programmers to create some form of "machine intelligence". Knowledge, as the new digital commodity, will, according to Lyotard (1984), be one of the most important, if not the most important, power stakes in the international arena.
Western thinkers have relied on formalisms and logical rules to reason. Our world is represented by systems of signs that order the world into sets of patterns and relations from which scientific laws and observations are derived. This tradition is attributed by theorists (McLuhan, 1964; Foucault, 1973; Innis, 1951; Logan, 1986) to have begun with the establishment of the alphabet system. Logan maintains that Western Science and Technology owe their fundamental existence to the alphabet system, because of its abstraction. From the alphabet system "codified law, monotheism, abstract theoretical science, formal logic and individualism" and the notion of "universal law" emerged (Logan, 1986, p. 23).
LANGUAGE AS DISCRETE SYSTEMS
As a product of the postmodern society, the core application of Artificial Intelligence technology is the understanding of natural language and the capacity to translate between languages. Theories of language use and production, as a result of problems posed by AI research, cannot claim that language is a fixed system developed within individuals. Rather, this system is autopoietic, constantly redefining and changing itself according to external influences. Interpersonal and environmental factors all play prominent roles in defining language use. The discreteness of the computational 1/0 relationship is ineffective in representing the complexity of "reality". Western alphabetic traditions are trapped by binary systems, e.g. the Cartesian mind/body problem, life/death, seen/unseen, form/formless or on/off. The test for formal language systems rests on their flexibility to reorganize according to their socialization. Following from the concept of autopoiesis is "emergent" theory (Havel, 1993), where the combination of new and not-so-new components of a system gives rise to a new entity. This approach is traditionally adopted by Japanese art, which focuses on the extraction of a whole into essential components; the combination of the right essences gives rise to something aesthetically powerful. Similarly, biological structures are composed of cells or micro-units that are networked and bound structurally to form a larger whole. The dichotomy between the natural (biological) and the artificial (technological) is then false if considered from a materialist position. When the spiritual is located within the atomic structure of the material, then dualism does not exist.
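One standard illustration of such emergence, not a model cited in the text but assumed here for concreteness, is a cellular automaton such as Conway's Game of Life: each cell obeys a simple local 1/0 rule, yet the grid as a whole produces coherent, moving patterns such as the "glider".

    from collections import Counter

    def step(live_cells):
        """Apply Conway's rules to a set of live (x, y) cells for one generation."""
        neighbours = Counter(
            (x + dx, y + dy)
            for x, y in live_cells
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        return {cell for cell, count in neighbours.items()
                if count == 3 or (count == 2 and cell in live_cells)}

    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}   # five cells, purely local rules
    for _ in range(4):
        glider = step(glider)
    print(sorted(glider))   # the same glider shape, shifted one cell diagonally

Nothing in the rule table mentions "gliders"; the pattern is an emergent whole that exists only at the level of the combined components.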
The greatest breakthrough for the Japanese came with the development of fuzzy logic in the USA. Lotfi Zadeh of the University of California at Berkeley found a method of processing imprecise data (fuzzy logic) that assigns truth values beyond the Boolean 1/0 logic (Tatsuno, 1990, p. 189). Fuzzy logic gave the Japanese a non-dualistic technique for modelling their world, such as the Kanji-recognition chip, capable of 160 million 16-bit operations, developed by Hiroyuki Watanabe in 1987 (Tatsuno, 1990, p. 189). Utilizing sophisticated techniques such as neural networking and fuzzy logic, the Japanese came closer to bridging the divide between analog and digital processes, humans and computers. Consequently, when the Japanese announced their commitment towards the development of the "fifth generation" and, most recently, "seventh generation" computers, the West dismissed their optimism as a public relations exercise, but this progression follows a natural philosophical pathway.
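A minimal sketch of the fuzzy idea, in illustrative Python with assumed thresholds and unrelated to the Kanji-recognition chip cited above, shows truth as a degree between 0 and 1, so that "A and not-A" need not be a paradox.

    def warm_boolean(temp_c):
        """Crisp Boolean membership: a room either is warm or it is not."""
        return temp_c >= 22

    def warm_fuzzy(temp_c):
        """Fuzzy membership: the degree to which a room is warm, from 0.0 to 1.0."""
        if temp_c <= 15:
            return 0.0
        if temp_c >= 25:
            return 1.0
        return (temp_c - 15) / 10.0          # linear ramp between the two extremes

    def fuzzy_and(a, b): return min(a, b)    # common fuzzy counterparts of AND, OR, NOT
    def fuzzy_or(a, b): return max(a, b)
    def fuzzy_not(a): return 1.0 - a

    print(warm_boolean(21), warm_fuzzy(21))                       # False versus 0.6
    print(fuzzy_and(warm_fuzzy(21), fuzzy_not(warm_fuzzy(21))))   # 0.4: "A and not-A" is partly true

Where Boolean logic must decide between 1 and 0, the fuzzy membership function keeps the intermediate degrees that the earlier dualism had to discard.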
With the information age, Lyotard (1984) sees learning as taking a top-down approach, dissecting the whole into microscopic units. Learning is thus restricted to "quantities of information" about the whole; that is, if knowledge and concepts cannot be computed for machine translation, they have no place within the postmodern society and hence will be abandoned. The quality of information must be translatable into a computable quantity of information bits to have any significance in this information society. As discussed earlier, Sterling (1992) and Dawkins (1976) take Lyotard's view a step further by suggesting that information has as much value as the attention it garners. Dawkins' "memes" survive their evolutionary process of continuity if they can withstand the demands of the environment, which, in this case, is the socio-economic environment.
"A more fundamental question remains: is man prepared to become homo informaticus? In the end what really counts is the quality of information rather than the quantity of information. " (Auh, 1986, p. 133). This is the question that Artificial Intelligence research is trying to uncover.