Janus Praeses


" Qui custodiet ipsos custodes ?"    Juvenal, 'Satires'




1.1 Computer security is a protean concept covering a wide range of disparate subjects, many of which reflect real-world problems that are not specifically computational. In computer terms these subjects invariably involve a trade-off between security and functionality, and careful thought about circumstances and motivations is needed to arrive at a suitable compromise. One area of concern is computer integrity, ie protection of data and programs from inadvertent or malicious corruption; another is unauthorized physical access to, or use of, a machine. In the current climate, where widespread networking has become the norm, the danger of external electronic intrusion to obtain private data or monitor operations is a major concern. The growth of networking and e-commerce has also raised concern over communication security, eg covert interception of messages and data transmitted to or from a machine, and forgery and imposture in commercial or financial contexts.


2.1 In the earliest days mainframes were usually organized as single-user systems, with programs from authorized users submitted to the machine separately in batches. In this case fairly simple security precautions suffice - usually restricting physical access to a team of supervised operators and limiting users' electronic access to the operating system to read/execute access to selected application programs and utilities. If write permission is confined to an allocated area of memory which is cleared at the end of each batch task, leakage of information between users is prevented. In practice this arrangement proved fairly inefficient in its use of the machine, since input/output and other house-keeping tasks often occupied 90% or more of the time, leaving the CPU idle and available for other tasks. To counteract this, multi-tasking systems which allow CPU operations from different tasks to be interleaved were developed and widely adopted, at the expense of considerable complication in controlling their interactions, while in some cases input/output was sub-contracted to auxiliary machines.


2.2 As the use and power of available machines increased the need for multi-user systems became more pressing, and an arrangement often adopted had local groups of users sharing the machine, connected to it by teletype terminals with no inherent computational capability. Data and program integrity in these cases was often provided by allocating a unique 'user name' to each authorized participant, together with a 'password' to be used when access was required. On confirmation of the password, read/write access to an individual working area of storage was allocated to the user, with 'read-only' access to the application program areas and no access to other users' areas or system program areas. In time this was elaborated to allow nominated groups of users access to a common working area, with varying privilege levels allocated to individuals where required. Since computer time was very expensive such systems usually had a system supervisor overseeing user authorizations and a logging system to monitor individual usage. Where sensitive information might be involved this often included facilities to sample traffic and clear the machine for dedicated sessions.
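A minimal sketch of the user-name/password and access-table scheme just described, assuming a simple table of users and per-area permissions (the names, passwords, and area labels are invented for illustration; a real system would at least store password hashes rather than clear text):

    # Illustrative only: per-user password plus an access table mapping storage
    # areas to 'r' (read) and 'w' (write) permissions.
    USERS = {
        "alice": {"password": "secret1", "access": {"alice_area": "rw", "apps": "r"}},
        "bob":   {"password": "secret2", "access": {"bob_area": "rw", "apps": "r"}},
    }

    def login(user, password):
        entry = USERS.get(user)
        return entry is not None and entry["password"] == password

    def check_access(user, area, mode):
        # mode is 'r' or 'w'; an area missing from the table means no access at all
        return mode in USERS[user]["access"].get(area, "")

    # Bob may read the shared application area but not write to it,
    # and has no access of any kind to Alice's working area.
    assert login("bob", "secret2")
    assert check_access("bob", "apps", "r") and not check_access("bob", "apps", "w")
    assert not check_access("bob", "alice_area", "r")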


2.3 Even in this fairly rudimentary situation there is some sacrifice of functionality, eg the assignment of user privileges often created difficulties in cooperation between individual users and user groups. The introduction of a system supervisor can also create problems of intrusion and imposture, since the division between supervisory areas and user areas may not be impermeable. In order to function the supervisor must be able to access a list of user passwords and override them at need, while the monitoring and maintenance functions must allow inspection and intervention in all memory areas. In practice deliberate (but unacknowledged) 'back doors' into the system which bypass the security constraints were common, while inadvertent back doors were not unknown (1).


3.1 The spread of computers to business and industrial organizations, particularly following the introduction of the much cheaper mini-computers, presented a new range of security problems arising from the managerial functions involved. In the batch-file system control of the permitted operations and responsibility for their outcome lie with the user, while in a multi-tasking system responsibility for ensuring there are no interactions between tasks lies with the operating system. With management applications like databases and spreadsheets the responsibilities are more diffuse, in that there are three levels of interaction - consulting data, entering data, and maintenance - that have to be controlled. As the number of acolytes involved grows, the possibility of concurrent sessions by different individuals rises, so the tasks cannot be isolated in the same way and some adaptive means of controlling access is needed: if concurrent sessions of consultation and entry are possible, precautions are needed to ensure the data does not contain inconsistent results due to updating during the session (2), particularly in spreadsheet systems.
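A minimal sketch of the concurrency precaution described above, assuming the simplest possible remedy - a lock that prevents a consultation from interleaving with a half-finished entry. The record and figures are invented for illustration; real database systems use far more elaborate transaction machinery to the same end:

    import threading

    ledger = {"balance": 100}
    ledger_lock = threading.Lock()

    def enter_transaction(amount):
        # The read-modify-write must be atomic, or a concurrent reader could see
        # (or another writer could overwrite) an intermediate state.
        with ledger_lock:
            current = ledger["balance"]
            ledger["balance"] = current + amount

    def consult():
        with ledger_lock:
            return ledger["balance"]

    threads = [threading.Thread(target=enter_transaction, args=(10,)) for _ in range(5)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    assert consult() == 150   # without the lock this outcome would not be guaranteed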


3.2 Many of these administrative programs are concerned mainly with interactive collection and distribution of data rather than extensive calculations and are best served by a local network. The 'dumb terminal' approach used by many academic systems of the time proved fairly inefficient for this purpose, since repeated interchange of data between a 'server' machine and a 'client' terminal is slow compared with the speed of CPU operations. A network in which all the messages involved are circulated to a ring of terminals (3), each sufficiently intelligent to hold data and extract traffic addressed to it from a common stream, is more efficient but raises more complicated security issues. The increase in the number and power of terminals requires some form of physical protection to avoid exposure to surreptitious monitoring of key-strokes and VDU displays as well as the introduction of illicit programs, while an increase in the number of authorized users makes it more difficult to identify venal or negligent operators. The protection of user names and passwords can be improved by encryption but, while it is relatively easy to devise strong encryption systems, the distribution, maintenance, and protection of encryption keys and algorithms is not so simple.


4.1 The evolution of networking hardware and protocols had substantial effects on communication security. The earliest local networks generally used point-to-point links with low bandwidth requirements (since input was usually from keyboards or paper tape readers), which could be implemented using permanent 'telephone wire' connections. In this case the only major communication security problem is protecting the cable from tampering, but the introduction of intelligent terminals increased the data flow, requiring better cables with higher bandwidths and limiting the permissible length of links (for a given cable). This was often met by introducing modems on the longer links (and in some cases some form of line-switching), which did not change the security requirements much beyond increasing the opportunities for tampering. (4)


4.2 In the outcome the general pattern of development for many years was the spread of proprietary local networking systems, resulting in a series of incompatible enclaves, sometimes with a modem/telephone line connection to the remote hosts of other local networks. These allowed a number of well publicised attacks (and probably many more unpublicised ones) on computing centres and commercial organizations as well as providing a mechanism for spreading computer viruses and malicious software. In time pressure developed for the introduction of a standard Open Systems Interconnection, fueled largely by the academic/UNIX community. Meanwhile the first steps towards the Internet had been taken in 1962 by the US government agency ARPA (5), charged with generating research into computer networking. In 1965 ARPA sponsored a study of a 'cooperative network of time-sharing computers' and in 1967 this produced a symposium consensus that the best way forward was a sub-network of ancillary computers (known as IMP's - Interface Message Processors) to control connections with telephone lines, switching, and routing, together with a layered software suite (distributed between IMP and host) to facilitate the interchange of message content (in standardized formats) between IMP and host.


4.3 Four IMP's were completed by the end of 1969, using off-the-shelf Honeywell DDP-516 minicomputers, and were used to set up a 4-node network between four sites, three in California (UCLA, UCSB, SRI) and one in Utah (6). The initial tests proved successful and the network expanded, doubling in size every year at first and settling down to growth of around 30% pa until around 1980 (coincident with the introduction of personal and home computers). After 1980 there was a sharp rise in expansion - approaching 100% pa for a few years, slowing to around 70% pa after the launch of the World Wide Web in the 1990's. In the original configuration the basic concept seems to have been based on the local network scheme, allowing remote terminals to log in to a 'sandbox', with broadly similar security implications, except perhaps that the IMP sub-network provided an additional point of attack for subversion since its architecture and required responses to incoming signals were standard and publicly known. Ultimately, developments both in size and in the range of protocols permitted have increased the vulnerabilities of the present Internet (with ~150 million hosts world-wide by Jan 2002) by orders of magnitude relative to the original scheme.


4.4 Some of these vulnerabilities are inherent in the concept of a large peer-to-peer network connected via public switched telephone lines:

(a) The recipient has no direct assurance of the origin of the signal and may require authentication of the sender

(b) There is no overall system supervisor to oversee security. With a large switched network the recipient has no simple way to determine signal routing and whether security control at all the servers/routers involved is adequate to guarantee signal integrity acceptably.

(c) The packet switching techniques (7) used in modern communication networks facilitate the diversion, capture, or substitution of data packets without this becoming apparent to sender or recipient (a sketch of the packet 'envelopes' involved follows this list). The increasing use of wireless and satellite links in recent times makes it effectively impossible to prevent interception of data.

(d) Although there is no overall supervision, host system administrators need facilities to probe the network and intervene when necessary to correct errors. Incorporating protocols to allow this creates potentially dangerous loopholes in the sandbox concept which could be exploited by an interloper to compromise host systems (8).
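A minimal sketch of the packet-and-envelope arrangement referred to in (c) and in note (7). The field names and addresses below are purely illustrative, not those of any real protocol; the point is only that the routing 'envelope' and the data payload are separable, so an intermediate node can copy the payload or rewrite the envelope without the payload betraying the change:

    # Illustrative only: split a message into small packets, each wrapped in an
    # 'envelope' (header) carrying routing information and a sequence number.
    def packetize(message, src, dst, size=8):
        packets = []
        for seq, start in enumerate(range(0, len(message), size)):
            packets.append({
                "header": {"src": src, "dst": dst, "seq": seq, "ttl": 16},
                "payload": message[start:start + size],
            })
        return packets

    def reassemble(packets):
        ordered = sorted(packets, key=lambda p: p["header"]["seq"])
        return "".join(p["payload"] for p in ordered)

    pkts = packetize("meet at the usual place", src="10.0.0.1", dst="10.0.0.2")
    # Nothing in the payload ties it to its envelope: a node on the route could
    # substitute a new header (here a forged source address) and the message
    # would still reassemble perfectly at the far end.
    pkts[0]["header"]["src"] = "10.9.9.9"
    assert reassemble(pkts) == "meet at the usual place"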



4.5 Another development, contemporary with the networking idea but independent of it, which ultimately had a significant impact on computer security was the rise of application-oriented interpreted programming languages. Although not best suited to large programs and much less efficient than compiled or assembly languages, these substantially reduce the time and effort required for program development and are valuable, for example, in providing facilities for integrated suites of programs. Their threat to computer security arises from the fact that executable code becomes more difficult to identify: they provide a means of concealing and executing illicit programs in formerly innocuous kinds of file, eg word-processor documents and Web pages. In the latter case extensive features attached to Web browsers have appeared which allow intervention ranging from the introduction of advertising pop-ups and viruses, to covert collection of personal data, to taking complete remote control of the machine. (9)


5.1 The first computer virus is said to have appeared in the mid 80's, at a time when the home computer community had largely unraveled the DOS system (and were producing improved clones) and Microsoft were in the throes of introducing high-level multi-tasking and interpreters into their Windows integrated application packages. The threats however were fairly easily countered on single-user systems and were not regarded as serious by the majority of desk-top users until the launch of the World Wide Web in 1990. Thereafter the number of incidents reported annually by CERT (Computer Emergency Response Team, founded 1987 at Carnegie Mellon University) rose sharply - by a factor of 400 in 12 years. Over the last four years the number of incidents has risen annually by around 120% (ie a factor of 2.2) while the number of Internet hosts has increased around 50% annually, so the threat to individual computers connected to the Internet continues to increase rapidly. Clearly it verges on the foolhardy to use a computer which is (or has been) connected to the Internet without some protective measures. For machines connected permanently to a local network a user has few options - any network will have a mandatory security policy installed, by design or default, and at least a nominal Network Manager. Users who have exclusive use of such machines might install additional protection to meet their perceived needs but are obliged to negotiate the detail with the Network Administration to avoid compromising maintenance functions or other nodes on the network. Users of stand-alone machines with Internet access have more options but are faced with making difficult choices between functionality and security.
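A back-of-envelope check of the growth figures quoted above - the figures themselves are those given in the text; only the arithmetic relating 'a factor of 400 in 12 years' and 'around 120% annually' to annual multipliers is shown:

    # Average annual multiplier implied by a 400-fold rise over 12 years
    annual_factor = 400 ** (1 / 12)      # ~1.65, ie roughly 65% a year on average
    # A 120% annual increase corresponds to a multiplier of 2.2
    recent_factor = 1 + 1.20             # the 'factor of 2.2' in the text
    print(round(annual_factor, 2), recent_factor)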


5.2 The word 'virus' is often used in a fairly loose sense; in general terms a virus is a malicious executable program introduced into a machine with the aim of one or more of the following:
(a) disrupting its operations and spreading to other computers
(b) surreptitiously monitoring operations and collecting data from the target machine, transmitting it to the attacker when opportunity offers
(c) taking remote control of the target machine.
Such programs have also been called 'Trojan horses' or 'worms' depending on their objective, and in some cases Internet servers are targeted to allow simultaneous automated attacks on many sites.

Several active techniques have been used to counter such attacks - virus scanners, fire-walls, and encryption - but for stand-alone machines with dial-up access a number of passive options are also available. The preliminary probing required to mount a successful attack is time-consuming, and one simple and relatively painless method is to close dial-up connections promptly whenever they are not active, eg download the Web pages of immediate interest, terminate the connection, and examine the pages at leisure using the browser 'History' feature. This is effective because for dial-up connections ISP's allocate IP addresses randomly selected from their pool of addresses, so a session using a series of short separate connections leaves the attacker chasing a moving target and makes it difficult to concatenate the responses to probes.

Other options where a loss of functionality has to be balanced against the need for protection are:-
(a) Deactivate scripting languages (VBscript, Jscript, and JavaScript)
(b) Deactivate ActiveX plug-ins in Internet Explorer and Java runtime engines
(c) Never respond to invitations to enter data into a Web page
(d) Do not accept 'cookies' (10)
(e) Never open e-mail attachments (ie don't click on them in Windows)
(f) Empty Internet Temporary Files folder when browser is closed

Most Web browsers provide some or all of these options, as do many freeware and shareware utilities.


5.3 The most widely used active approach for personal (as opposed to networked) computers is probably the virus scanner, which searches memory, disks, and programs for the characteristic signatures of known viruses. In some cases scanners also check for 'suspicious activity' in critical system areas. In multi-tasking machines this process can be a background activity but it is quite time-consuming (eg around 80 min per gigabyte for current disk drives at last check), so a machine has to be active for many hours to perform a complete disk scan in the background and any viruses that have penetrated could lie dormant in obscure corners for a long time. The favoured approach is to scan incoming data immediately (including e-mail and attachments) and to scan every program before it is executed (to deal with any long-dormant viruses before they can do much harm). Scanners have proved fairly effective in containing major virus incidents but it must be remembered that they are inherently retrospective - enough machines must be infected to draw attention to a new virus, sufficient time must elapse for a reliable identification signature to be found, and the new data must be distributed before protection becomes effective. It is mildly ironic that to ensure prompt protection many users sign up for commercial services which offer automatic and transparent updating whenever the user is on-line - the very mechanism sought by hackers: if a hacker were ever to impersonate an anti-virus provider successfully the outcome could be disastrous.
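A minimal sketch of the signature-scanning idea, assuming an invented database of byte patterns (real scanners use far larger databases and much more sophisticated matching, heuristics, and archive/e-mail unpacking):

    # Illustrative only: report which, if any, known signatures occur in a file.
    KNOWN_SIGNATURES = {
        "EXAMPLE-VIRUS-A": bytes.fromhex("deadbeef4b1d"),
        "EXAMPLE-VIRUS-B": b"XYZZY-PAYLOAD-MARKER",
    }

    def scan_file(path):
        with open(path, "rb") as f:
            data = f.read()
        return [name for name, sig in KNOWN_SIGNATURES.items() if sig in data]

    # Usage: hits = scan_file("suspect.exe")
    # An empty list means only that no *known* signature was found - the
    # retrospective limitation discussed above.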


5.4 Every data packet transmitted to, from, or across the Internet necessarily contains origin and destination addresses (32 bits each) and a 'port number' (16 bits) specifying the type of service required, and systems generically known as firewalls are widely used to monitor and filter these packets individually at the lowest level, ie before they are allowed to enter a host computer. The versions used for private network protection are very complex, usually including virus scanning, intrusion detection, and system activity/audit trail monitoring, and often present a bottle-neck, but simpler versions (widely available as freeware and shareware packages) are becoming popular for stand-alone machines. Most are based on accepting only packets whose port numbers call for essential services, eg port 80 for Web pages and port 25 for e-mail. (11) Some also accept only packets from 'trusted' IP addresses or, more usually, reject packets from unknown or suspect addresses, while others attempt content censoring by examining the passive data content: in view of the flexibility of the English language and the wide availability of powerful search engines the last is widely judged to be ineffectual. Clearly firewalls implicitly involve significant, sometimes draconian, restrictions on functionality.
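A minimal sketch of the port/address filtering just described, with invented rule sets (a real firewall works on raw packet headers and a much richer rule language, and would normally also track connection state):

    # Illustrative only: admit a packet only if its port is on the allowed list
    # and its source address is not on the blocked list.
    ALLOWED_PORTS = {80, 25, 110}            # eg Web pages (HTTP), mail (SMTP, POP3)
    BLOCKED_SOURCES = {"203.0.113.66"}       # an example 'suspect' address

    def admit(packet):
        # packet is a dict with 'src', 'dst', and 'port' fields
        if packet["src"] in BLOCKED_SOURCES:
            return False
        return packet["port"] in ALLOWED_PORTS

    assert admit({"src": "198.51.100.7", "dst": "192.0.2.1", "port": 80})
    assert not admit({"src": "198.51.100.7", "dst": "192.0.2.1", "port": 23})  # eg Telnet refused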


5.5 In the earliest days of ARPANET security of the data transmitted was based on close physical control of the nodes and terminals, but it was apparent that as the network expanded it would become progressively less possible to guarantee protection of the data - and, more importantly, of user names and passwords - against eavesdropping. The traditional approach, encryption, had made great advances thanks to the greater number-crunching power available, but the logistics of distributing encryption/decryption keys over a wide-spread network were problematic, while reliable means of authentication (ie identifying the message source) and verification (proving the data had not been altered) were considered essential. At the time up-to-date knowledge of cryptography was classified and largely confined to national Intelligence services (NSA in the US and GCHQ in the UK), so various academic groups began to work independently on the problem with a view to producing a system for public use. In the early 70's IBM began work on what was essentially an electronic elaboration of the WWII Enigma coding machine, intended for sale to banks and financial institutions. A working version, Lucifer, was completed in 1974 and subsequently endorsed by the National Bureau of Standards as a Federal standard (DES - Data Encryption Standard), but it met with a hostile reception from the budding Internet community when they discovered that the whole process had been orchestrated by NSA. The main complaints were that the algorithms and patents involved had been classified, so that users could not assess for themselves the strength of protection provided; that the resistance to cryptanalysis had been weakened to an unacceptably low level at NSA's insistence (subsequently confirmed in Congressional hearings); and that NSA had installed a covert back door for their own benefit (never proved). Apart from this, a major deficiency from an Internet point of view was that it provided no method of resolving the vexed problem of setting up secure encryption keys between participants having no previous contact. This episode signaled the start of a long and intense battle by NSA to suppress or control the private use of encryption, which only ended in 1999 when the use of the International Traffic in Arms Regulations to prohibit export of software and dissemination of information on cryptography (a favoured NSA tactic) was ruled unconstitutional and Congress finally passed the Security and Freedom through Encryption bill, which specifically lifted restrictions on export of encryption software and forbade mandatory key escrow (ie requiring every user to deposit copies of their encryption keys with an authority before using them). (12)


5.6 The Internet key distribution problem was first solved by Whit Diffie and Martin Hellman, working at Stanford University, in a 1976 paper which proposed the use of two mutually inverse keys derived from a function that is much easier to compute in one direction than to reverse (eg forming the product of two primes versus recovering the prime factors, although this was not the function they used). In this scheme the result of the 'easy' transformation can be made public (and used as the encryption key) without compromising the result in the opposite direction (used as the private decryption key). This idea triggered a burst of activity, including a team at MIT (Rivest, Shamir, and Adleman) who produced the RSA algorithm using the product/prime-factor approach, which was ultimately incorporated in several commercial products. These were still partially crippled by the NSA feud however, and there was, not surprisingly, some reluctance in the general community to accept systems which depended on 'Trusted Third Parties' (who were not seen as worthy of unqualified trust or guaranteed leak-proof) to generate the key pairs, or to rely on implementations whose details were kept secret (in this case for commercial reasons). The barrier was finally breached in 1991 by a user who, incensed by a draft Congress bill requiring communication systems to include a means of recovering clear text for all encrypted traffic, posted a copy of a shareware development program (PGP - Pretty Good Privacy) that had come into his possession to a number of US Internet sites (to ensure it was not suppressed and to avoid prosecution under the ITAR). Within 24 hours working versions were circulating world-wide, and subsequently the program was developed internationally as an Open Source project outside US jurisdiction and, due to a quirk in international patent law on prior disclosure, unconstrained by US patents, secret or otherwise. In common with most other encryption systems PGP uses the public key method to exchange a key for a conventional symmetric method of encryption, in the interests of speed, and it has become a popular choice. The general reservations about methods that do not reveal implementation details, particularly proprietary programs, have on the whole been justified by a continuous (and continuing) stream of flaws discovered in many of them: it is not that Open Source programmers are inherently less liable to error, but that the details are exposed to a more intense and broad-based examination from the outset and can call on a much larger pool of expertise to correct them.
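A toy illustration of the key-agreement idea from the 1976 paper (the Diffie-Hellman exchange, which used modular exponentiation rather than prime factorization as its hard-to-reverse function). The numbers are absurdly small so the arithmetic is visible; real systems use moduli hundreds of digits long:

    p, g = 23, 5                 # public modulus and generator, agreed in the open
    a, b = 6, 15                 # private values, never transmitted

    A = pow(g, a, p)             # Alice publishes g^a mod p  -> 8
    B = pow(g, b, p)             # Bob publishes   g^b mod p  -> 19

    shared_alice = pow(B, a, p)  # (g^b)^a mod p
    shared_bob   = pow(A, b, p)  # (g^a)^b mod p
    assert shared_alice == shared_bob == 2

    # An eavesdropper sees only p, g, A and B; recovering a or b from them is the
    # 'hard direction' (the discrete logarithm).  The agreed value would then be
    # used as the key for a conventional symmetric cipher, as PGP does.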


5.7 The use of encryption protects data in transmission by making it unintelligible if diverted or copied, and defeats techniques which trawl the data stream for keywords to pick out sections of interest, but it imposes some restrictions on the choice of method since the recipient must know which decryption algorithm to use and have the means to apply it to hand. It can also give extra protection to person-to-person communications over encrypted local networks, since the data payload in packets is passive and can be encrypted and decrypted twice if necessary. Encryption is sometimes suggested as a means of protecting archived data (including of course clear-text copies of the original messages, user names, and passwords) but its efficacy in this case is uncertain. If the data is to be retrievable the relevant decryption keys must also be stored, and a system which has been penetrated sufficiently to allow an intruder to read the clear text may also allow access to the keys and the encryption algorithms, unless they are stored on a separate isolated system, which would make retrieval cumbersome and time-consuming.




1. A common source is error-handling routines. When an error condition prevents a program from continuing it is unacceptable in a multi-user environment to halt the whole machine and re-initialize; the program must revert to a state in which remedial action is possible, and this may inadvertently increase the user's privilege level - in some well known cases to supervisor status, where anything is possible.


2. Two people cannot work safely on the same file at the same time, even if it is written on the back of old envelopes. The traditional manila folder bound with red tape has some merit in preventing this, and the material is automatically in chronological order.


3. The terminology seems a little confused - these are often described as a peer-to-peer network, which implies literally that all machines involved have comparable capability, but the term is also used for server/client systems in which the server has most of the computational power (and even for dumb terminal systems where these are still used).


4. Government rules for links carrying classified data said they could not be connected to a telephone exchange of any kind and that the cable must be armoured, pressurised, and buried in concrete.


5. ARPA (Advanced Research Projects Agency) was set up by the US Department of Defense in 1958 'to establish a US lead in science and technology applicable to the military' in response to the Soviet launch of Sputnik. A major DoD concern in 1962 was the recognition that an electromagnetic pulse attack could paralyze the country's civil and military infrastructure by incapacitating computer and communication systems over large areas. The brief to ARPA was to shift Department contracts from independent contractors to "the best academic computer centers" because the former seemed to be exclusively dedicated to batch-processing mode.


6. The network had a hybrid configuration, with the three Californian nodes connected in a loop and the Utah node connected only to the SRI node, so that messages between Utah and Los Angeles/Santa Barbara had to be relayed through SRI. Originally a point-to-point protocol, Telnet, was used over dedicated 25 kHz telephone lines, but the system evolved rapidly to accommodate many protocols and to use the public switched telephone network for connections.


7. The data to be transmitted is split into small packets which are sent independently, each encapsulated in an 'envelope' providing routing instructions, to reduce queuing problems. It is easy to replace this envelope, and since packets are given a limited life (to avoid cluttering the network with redundant packets) their disappearance or diversion is not detectable. This might be described in pre-computer terms as requiring that mail be restricted to postcards pinned up in a newsagent's window rather than transferred in a split stick carried by native bearers.


8. For example SNMP (Simple Network Management Protocol), TFTP (Trivial File Transfer Protocol), BOOTPS and BOOTPC (remote boot-up facilities) have no effective security provisions. This is not entirely disastrous since the response to such requests is determined by programs in the recipient host but in the present climate the average user is hard-pressed to determine which protocols are incorporated and currently active in proprietary operating systems, particularly since some have developed a habit of installing and activating utilities without notifying the user or seeking permission.


9. The first to attract attention were the macro languages incorporated in 'Office' type integrated suites, used to introduce viruses in the mid 90's, followed by Web-page script languages such as VBSCRIPT (a BASIC variant) and JAVASCRIPT (no relation) popular with amateur hackers. More powerful script languages aimed specifically at the Internet, such as PERL and PHP, followed, along with general-purpose technologies such as ActiveX (Microsoft) and JAVA (Sun Microsystems). The former is particularly dangerous since it allows access to the full range of system resources and has no security provisions, while JAVA bids to become a universal comprehensive cross-platform language. JAVA incorporates the usual UNIX approach to security and is becoming increasingly popular, but it presents a user with the dilemma of choosing between cutting off access to an increasing number of resources (by removing or inactivating the Java Runtime Engine) and risking the intrusion of powerful illicit programs.


10. Cookies are short text files deposited on the client machine by the server which can be read whenever a new connection is established. There is some debate about the threat they pose, since it is argued that they are non-executable, but in view of the many known loopholes in browsers and the existence of scripting languages this reassurance is questionable. In any event they allow an attacker to track usage patterns which could assist probing.


11. Currently around 350 'well known' port addresses defined internationally are in wide use. Some of these are very dangerous in themselves, while unused ports left accessible give a hacker an opportunity to install a malicious service routine behind them.


12. The situation in Europe generally is unresolved between proponents of privacy, political control, and law enforcement. A Council of Europe resolution establishes the right of private citizens to use encryption and forbids mandatory key escrow but implementation is left to individual governments. In the UK the Regulation of Investigatory Powers Act (2000) legalizes demands for decryption of individual communications in some circumstances, provision of a decryption key in others, and establishes legal penalties for failure to comply. In the US the NSA torch has been picked up by a commercial lobby aiming at legalization of hacker-type activities by corporate bodies, including mandatory incorporation of facilities in electronic machines to allow external intervention in pursuit of profits, under the deceptive banner of 'Trusted Computer Platforms'.