Deus in machina

"I fear the Greeks even though they offer gifts "   Virgil, Aeneid


1. With the appearance of the digital computer in the mid-20th Century, the 'deus ex machina' of the Ancient Greeks seems to have evolved into a god-in-the-machine, widely regarded as an infallible oracle resolving the problems of life, the universe, and everything. It is true that modern computers can be invaluable tools: they rarely make errors and, even more rarely, fail to report errors which are detected but cannot be corrected. Used injudiciously, however, they can (and sometimes do) produce results which range from the grossly erroneous to the more insidious plausible-but-incorrect.

2. By the start of the 21st Century the computer has become so pervasive, not to say intrusive, that some understanding of its background and limitations is needed even by non-users, but unfortunately this is not readily accessible in the current literature. This stems from two basic causes. Firstly, any computer, given sufficient time and data storage, can in principle be programmed to execute any operation that is logically possible, so that little can be said which is entirely true of all computer systems in all circumstances. The second obstacle is the extensive material generated by the commercial interests of hardware manufacturers and software companies in maintaining a captive customer base and planned obsolescence. This produces a flood of literature of transitory relevance which swamps the basic issues of computer science, issues which have scarcely changed since first propounded by Countess Lovelace in the 1830's.

3. Once the digital computer was established as a viable practical proposition (in the early 1950's) three main areas of development emerged - computer technology (how to make computers), computer studies (how to operate machines and applications), and computer science (how to control and program machines). The first and last have remained largely specialist subjects, involving mainly solid-state/semiconductor physics and mathematics respectively, while the second became primarily an educational activity through training courses and books. In a fashion familiar in the context of computers, advertising, and education - inventing new names for old wares to simulate progress - 'computer studies' became 'information technology'. This change highlights one of the main sources of error in computing, viz. that the word 'information' (1) is not used in the usual dictionary sense. In common with Claude Shannon's Information Theory (the original owner of the IT acronym), information in these contexts carries no connotation of 'meaning': in both cases the content (of messages and computers) is fundamentally a collection of 0's and 1's (or 'true' and 'false' logic values) with no inherent meaning. The patterns have meaning in relation to the human mind only by virtue of (hopefully consistent) rules for translating between them and human concepts. The humans involved are, at the least, the originator who specifies the operations required, the recipient who interprets the results, and the programmers (probably several) who produced the programs to implement the required manipulations. The potential for confusion and misunderstanding between these individuals is appreciable even if the computer system were not in the loop.
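By way of illustration, the short Python sketch below (an addition to this note, using arbitrarily chosen example bytes) shows the same 32-bit pattern acquiring three quite different 'meanings' purely through three different translation rules; the bits themselves never change.

    import struct

    # The same four bytes (32 bits) throughout; only the translation rule changes.
    bits = bytes([0x42, 0x4C, 0x55, 0x45])

    as_text    = bits.decode('ascii')            # rule: ASCII character codes
    as_integer = int.from_bytes(bits, 'big')     # rule: big-endian unsigned integer
    as_float   = struct.unpack('>f', bits)[0]    # rule: IEEE 754 single-precision float

    print(as_text)       # 'BLUE'
    print(as_integer)    # 1112298821
    print(as_float)      # roughly 51.08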

4. A large part of the software and hardware in practical systems consists of specialized units ('peripherals') primarily concerned with such translations, which do not have sufficient general capability to be regarded as computers in their own right. The 'god' in the machine manifests itself physically as the CPU (Central Processing Unit), which executes all instructions and has sufficient functionality to constitute a universal computer. Spiritually it represents the ghosts of all the programmers involved in writing the operating system and application software.

5. The digital binary system universally adopted is the only viable way of achieving the reliability and stability required in a large general-purpose computer but, even where there are no human errors, a related problem arises with numerical data in that such systems are inherently incapable of handling real numbers - calculations involving integers may be exact (2) but irrational numbers can only be approximated. In most problems the errors can be kept at acceptable levels with suitable precautions, although they may accumulate to give appreciable overall errors in long series of arithmetic operations. In some cases, however, the overall error can be magnified by a large enough factor to invalidate the results completely. Situations where this presents a serious problem, e.g. statistical inference and relaxation methods, were known long before computers appeared, but subsequent studies have shown that major problems can arise in many fields of practical interest which previously were too laborious for detailed study.
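The short Python sketch below (an addition to this note, with arbitrary example figures) illustrates both effects on a small scale: the decimal fraction 0.1 has no exact binary representation, so a long series of additions accumulates a visible error while exact rational arithmetic (see note 2) does not; and subtracting two nearly equal quantities magnifies the rounding error until it dominates the result.

    from fractions import Fraction

    # Accumulation: 0.1 cannot be represented exactly in binary, so a long
    # series of additions drifts away from the exact answer of 100.
    total_float = sum(0.1 for _ in range(1000))
    total_exact = sum(Fraction(1, 10) for _ in range(1000))
    print(total_float)     # approximately 99.9999999999986 - not 100
    print(total_exact)     # 100, exactly

    # Magnification: subtracting two nearly equal numbers leaves little but
    # the rounding error behind ('catastrophic cancellation').
    a = 1.0 + 1e-15
    b = 1.0
    print((a - b) * 1e15)  # should be 1.0, but prints roughly 1.11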

6. The result of commercial pressures on the computer industry, hardware and software alike, is that development is directed at producing greater functionality at a given cost rather than at providing the capability actually required by users at minimum cost. As a result many, if not most, users find to their detriment (3) that they use only a tiny fraction of the capability of their system and programs: the more options available, the more information is required to specify which are relevant to the current task, and the longer tasks take to execute.

7. Maintenance of systems can be compromised after only six months to a year because components become obsolete and are no longer available commercially. A side effect, not always foreseen by either individual or corporate users, is the disruption caused by a rapid cycle of change when continuous service is needed. For individual operators the effort of running two programs with similar aims is greater than the sum of the two run separately, because of instinctive confusion between the operating details of the two. For large tasks it may be necessary to run both systems in parallel for a considerable period, at the cost of substantial effort, to maintain continuity, while the problem of ensuring that both continue to use accurate and current data can become very complicated, as evidenced by the history of recent large Government projects.



1. 'Technology' is also slightly fuzzy - originally it related to manufacturing techniques, but by diffusion it became associated with software techniques enabled by enhanced chip capabilities based on manufacturing improvements. The most apposite definition here is perhaps that of Professor Ian Stewart: 'Technology is making things work without understanding them; Science is understanding things without making them work.'

2. Rational fractions can also be handled exactly in principle but rarely are.

3. Professional programmers spend their time writing programs rather than using their products, and the 'default' options introduced to hide the facts of life from users are not always the most suitable or efficient for the task in hand. Perhaps they should be added to the Mikado's 'little list' and a 'punishment to fit the crime' devised.