The Global Network: A Historical/Critical Essay II

 

Table Of Contents

Introduction

Electricity, and Wire

Wireless Technology

Communications Provide Entertainment

The Cable Network

The Interactive Cable Network

Computer Based Networks

The Internet

Collective Functionality

Collective Education

For The Kids

Quality Concerns With Education On The Internet

CDPV2K

Collective Computing

RC-5

SETI@Home

Future Possibilities

Weather Forecasting

Browser Inclusions

3-D Rendering

A Scalable Future

Footnotes


Introduction

The computer, and even more importantly computer networking, have affected people's lives tremendously in the past 20 years. What was once an expensive luxury for the business world has become commonplace in the family home. The computer has brought education, entertainment, and productivity to the masses. Computer networking, although a more recent phenomenon, has given the masses the ultimate tool for information and communication: the Internet.

What started as a military project eventually fell into public hands. Certain areas of information and entertainment provided the economic backing needed to establish such a massive network of computers.

In the last 10 years, the Internet has become an ever-growing entertainment and information medium. Over 20,000,000 computers are now attached to one another via modems or wired networks. Bandwidth presently limits the amount of information that can pass from one computer to another. Therefore, the Internet remains a primarily connective medium (i.e., information passes from one user to another). The Internet is the ultimate tool for world-wide communications. However, its use in computing has been overlooked. With the advent of more advanced transmission formats, the Internet's value as a collective computer could far outweigh its use for connectivity.

Electricity, and Wire

In the 1800s, the advent of electricity provided the first step towards modern communications. Line-based telegraphs would come in many forms over the next 50 years. The first generation used a single wire for every letter; advances in components eventually reduced the number of wires to one. By 1837, Samuel Morse had developed his version of the telegraph, and his famous code. Telegraphy could provide long-distance, real-time communication for the first time. By 1866, trans-Atlantic communication by telegraphy was realized. For the first time in history, a world-wide electric network had been established. This spurred inventors all over the world to find new ways of communicating.

In 1876, Alexander Graham Bell shocked the world with his first demonstration of a telephone. The American Telephone and Telegraph Company (AT&T) was soon established. Cables were laid all over America, eventually reaching from the Atlantic to the Pacific. 1

The 1800s represent a landmark time for communications. The foundations for the "modern" network were firmly established. Although future refinements would improve on the systems developed in that century, the original concepts have changed little.

These early networks provided an immediate state of connectivity. Rather like a mail system, communication between two parties was possible; thanks to electricity, however, it happened in real time.

Wireless Technology

In Italy, a young inventor by the name of Guglielmo Marconi discovered a technique for transmitting information over the air. In 1897, Marconi demonstrated his wireless telegraph to the British Post Office. Their astonishment helped Marconi procure funds to start his commercial venture, which would soon become the largest communications corporation in the world.

The British hired Marconi to establish a network of radio towers throughout England. Within a year, the English Channel had been traversed by his equipment, and communication with the mainland over wireless was common. 2

Early experiments involving ship-to-shore contact had sold the American government on Marconi's ideas. The US Navy was the largest in the world, and communication over water was becoming increasingly complex. With radio communication, boats separated by vast distances could still coordinate with one another. With such a lucrative contract, Marconi had a firm hold on communications in America. Marconi set up his American corporation and settled on the northern seaboard. By 1901, communication over the Atlantic Ocean by wireless was a reality. 3

The next twenty years brought refinement to wireless communication. Voice transmission was developed in 1906, and transmitters grew in power.

Radio would eventually become the first broadcast connective medium to be realized. Instead of carrying one- or two-way communication, the signal could be received by anyone equipped with a radio. Distress signals from a stranded ship could be heard by every receiver in range. Weather emergencies could be broadcast before the storms hit land.

Government and military use of wireless equipment was widespread by the 1920s. Amateur radio operation had grown in popularity, and some amateurs used their gear to broadcast literature to the public. The radio was soon to become the first entertainment network, and the first commercial broadcast use of communications. 4

Communications Provide Entertainment

In 1919, 9XM, at the University of Wisconsin, became the first "radio station". The station broadcast plays, musical concerts, and local news. As radio receivers became cheaper and easier to operate, the popularity of radio stations exploded across the United States. Commercial radio stations sold advertising time to make a profit. Broadcasters across the country began co-operating with one another to strengthen their hold over their territories. The first such "network" was the National Broadcasting Company (NBC), soon followed by the Columbia Broadcasting System (CBS) and the American Broadcasting Company (ABC). ABC, CBS, and NBC would become the largest broadcasting entities world-wide. After the development of television broadcasting, the big three (as they came to be known) invested heavily in the technology. The trouble was, not everyone got a clear picture. 5

The Cable Network

Television was instantly popular, and the big three made a fortune on advertising revenue. Viewers in rural areas, unfortunately, could not always receive the signal. Inventive entrepreneurs saw this as a blessing in disguise. Large towers were erected around areas with poor reception, and coaxial cable carried the strong signal from the towers directly to homes. Eventually, cable operators would incorporate satellite communications to provide more channels than the networks could.

Cable provides a more robust transmission medium: put simply, you can fit more channels on a cable than you can squeeze into the air. By the late 1980s, cable television was the fastest-growing entertainment medium. By the nineties, cable was available in virtually every metropolitan area in America and Canada, and in most of Europe. 6

The Interactive Cable Network

American cable companies had been experimenting with interactive cable systems since the mid-1980s. Today, cable companies in most major cities in the U.S. offer television along with data transmission and reception. Most are installing fiber-optic lines as well. Fiber-optics transmit light instead of electricity, and more information can be pushed down a fiber-optic line, making it a rich resource. The most advanced cable networks can provide video, phone, and data services for computers. Twelve major cities in America offer cable modem services with speeds exceeding those of ISDN, the previous standard for in-home high-speed data transmission. 7

Computer Based Networks

Data networking was a novel concept in the 1950s. Herb Grosch, a mainframe salesman, developed a theory that held true well into the 1970s. Grosch's Law, as it came to be known, stated that a more expensive computer will always be more powerful than multiple smaller computers adding up to the same price. The gap was primarily due to data storage: mainframe computers at that time used large magnetic cores to store information. The cores were cheap, but the support circuitry was expensive, making the price per bit lower on larger systems.
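
Grosch's Law is often stated quantitatively: computing power grows roughly as the square of the price. A quick Python sketch (the dollar figures are invented purely for the arithmetic) shows why, under that law, one large machine beat many small ones of the same total cost:

    # Grosch's Law: computing power grows roughly as the square of the price,
    # so one big machine beats many small machines of the same total cost.
    def grosch_power(price, k=1.0):
        return k * price ** 2

    one_big = grosch_power(1_000_000)       # a single $1,000,000 mainframe
    ten_small = 10 * grosch_power(100_000)  # ten $100,000 machines
    print(one_big / ten_small)              # 10.0: the big machine wins tenfold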

The military could see the importance of data transfer, and it heavily funded the Advanced Research Projects Agency (ARPA). By 1969, ARPA had a functioning network of 4 nodes. The theory behind the original ARPANET (as it would soon be called) was one of shared resources. Computer power was still very expensive. Shared-resource networking gives multiple users access to the same mainframe: each user is allotted a percentage of the total computer power, and when one user needs all of it, the network is reprioritized towards that user. 8

Network theory advanced quickly. Due to British influence, the concept of packet switching was being explored. Rather than assigning a direct, continuous stream of data from user to user, packet switching breaks a chunk of data into smaller pieces, which are sent individually and reassembled at the target computer. Because packets from different conversations can be interleaved on the same line, multiple users can each experience a continuous stream.
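
A minimal sketch of the idea, with an arbitrary packet size and message: each packet carries a sequence number, so the receiving computer can rebuild the message even when the packets arrive out of order.

    import random

    def make_packets(message, size=4):
        # Break the message into sequence-numbered packets of `size` characters.
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        # Sort on the sequence number to rebuild the original message.
        return "".join(data for _, data in sorted(packets))

    packets = make_packets("Many users can share one line.")
    random.shuffle(packets)  # simulate packets arriving out of order
    assert reassemble(packets) == "Many users can share one line."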

Less than a year later, ARPANET had over 15 nodes across America running at 56 kilobits per second. The network was still purely connective in nature. Users could access files, send e-mail, and share computing power. The first step towards a collective network came in 1974.

Ethernet was designed at Xerox's Palo Alto Research Center (PARC). It was loosely based on the ALOHAnet in Hawaii. ALOHAnet used radio transmitters and receivers to link computers to common resources such as printers and storage. Ethernet replaced the wireless units with coaxial cable. A Local Area Network (LAN) could be established running at 3 Megabits per second. Workstations or personal computers were all that were needed to take advantage of a LAN, and compute collectively. 9

LANs soon spread from military and academic installations to the business world, where they truly made an impact. After the release of Intel's 80386 processor (1985), IBM-compatible computers could easily be strung along an Ethernet LAN. This offered businesses of all sizes a cheap alternative to leasing computer time, and provided managerial resources not previously possible. One development whose impact had not yet been fully felt was the demise of the ARPANET, and the introduction of internetting.

In 1983, the ARPANET adopted the Transmission Control Protocol/Internet Protocol (TCP/IP). TCP/IP allowed individual networks to interconnect amongst themselves. ARPA used TCP/IP to connect most, if not all, military and academic resources. By 1988, the technology had proven itself in commercial applications, and ARPA wound down its funding for the ARPANET.

The Internet

Personal computers were already found in many homes by the end of the eighties. Modems (modulator/demodulators) were common as well, most communicating at 1,200 bits per second over a standard telephone line. Bulletin-board systems (BBSs) were set up as a means of public access to information without a LAN connection. A user would dial into the BBS directly; once logged on, he could post messages, retrieve files from the server, and access other BBS services. Most BBSs were tailored to the specific interests of their users, pornography chief among them.

During the early nineties, large corporations invested heavily in constructing LANs. In large business centers, especially in the U.S., corporations would usually lease individual floors of buildings rather than whole buildings, so wiring a LAN often meant linking multiple floors of multiple buildings. Out of this grew a new industry: network administration.

Network administration soon became corporatized as well. These corporations quickly found that companies would pay to access other networks for information and communication. Banks and airlines had used large internetworks for years, but they were slow and ran proprietary systems. Academic networks, left over from the ARPANET, were fast, used a common language, and ran a sturdy protocol. E-mail, newsgroups, and FTP could be easily accessed through UNIX servers. 10

The development of Hypertext Markup Language (HTML) provided an easier interface to internetted resources. The first successful program to take advantage of HTML was Mosaic, developed at the US National Center for Supercomputing Applications (NCSA). The World Wide Web was now an established entity, and its popularity soared. 11

Today, an estimated 6,000,000 web pages exist. Telephone access for users at home is now more common than LAN access. Transfer speeds of up to 3 million bytes per second are possible on coaxial cable, and fiber-optic technology should soon eclipse that commercially.

Collective Functionality

Communication has rarely exceeded a connective functionality. Technologies were primarily designed either for one-to-one communication (telephone, mail, e-mail) or for broadcast, sending the same information to everyone (television, radio, the World Wide Web). Early computer networks were designed as a conduit, dividing one computer's power among a growing number of users.

"Many hands make lighter work" 12

Collective functionality is not a new concept. Collectivism has been employed since the dawn of time, and has been adopted by politics, economics, and agriculture alike. In almost any regard, more workers means less work.

Modern supercomputers are technically large networks in a large box. The most powerful computer to date (Janus, housed in Albuquerque, NM) houses over 9,000 Pentium processors, the same chips found in PCs, all working together. Such collective functionality is limited by certain factors: how fast the processors can talk to one another, and how authority is delegated among them. 13

A huge resource is afforded to us in the form of the Internet, to which an estimated 20 million computers are connected. Once the potential of the Internet as a group of collective processors has been explored, we can tap it as a supercomputer.

"As the information available on the Internet becomes richer, and the interaction among the connected computers becomes more complex, I expect that the Internet will even begin to exhibit emergent behavior going beyond any that has been explicitly programmed into the system." 14

Collective Education

Learning as a group rather than as an individual has always been considered beneficial: individuals who can interact with others acquire skills collectively. Cable networks are beginning to offer new services to aid in education. These first steps towards collective connectivity are helping to shape the next evolution of the Internet, and of television itself.

For The Kids

Cable companies have been providing educational material for years, but advances in network technology have led to For The Kids, an American project co-ordinated by Time Warner Cable. It offers schools an interactive stream of video, audio, and data. Classrooms are fitted with a cable modem and receiver box; a television provides the content, and special remote controls allow interactivity. The rich bandwidth provided by the service allows for real-time video communications with other schools across the world. Special lectures and guests can be broadcast from anywhere in the world, and children can type in their questions and receive answers in real time. For The Kids proves that networking has the potential to turn the world into one large classroom, as long as the content is regulated. 15

Quality Concerns With Education On The Internet

The Internet is perhaps the largest library of information ever assembled. Unfortunately, not all of the information found there is accurate. According to universities across the world, since students began to rely heavily on the Internet for their research, the quality of that research has suffered. Privately funded information networks use quality-control measures to ensure that the information found there is true. 16

CDPV2K

Text and photographs are no longer the only educational material to be found on the Internet. More and more multimedia elements are being incorporated into web sites. Educational web sites like CDPV2K use the newest technologies to convey musical education. Clever compression schemes, along with increases in bandwidth, continue to refine the look and feel of the Internet. 17

Collective Computing

Janus, the mega-computer, communicates with all 9,000 of its processors at bus speed, much faster than any network. Networks are limited by how fast their nodes can communicate with one another. The average Internet data speed hovers around 10 kbps, which is not fast enough to run a Janus spread all over the world. The Internet is, however, perfect for distributed computing.

Distributed computing (DC) is a form of collective computing that does not rely heavily on communication between machines.
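
A back-of-envelope calculation shows why. At the 10 kbps figure quoted above, merely shipping data to a volunteer machine takes minutes, so a piece of work is only worth sending if it carries hours of computation (the work-unit size here is an assumption for illustration):

    # Why distributed computing must minimize communication:
    work_unit_bytes = 350_000   # assumed size of one chunk of work
    link_bps = 10_000           # ~10 kbps average Internet speed, per the essay
    transfer_minutes = work_unit_bytes * 8 / link_bps / 60
    print(f"Shipping one work unit takes ~{transfer_minutes:.0f} minutes,")
    print("so each unit should carry hours of computation to be worth sending.")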

A few early experiments with distributed computing, including a pair of programs called Creeper and Reaper, ran on the ARPANET. Later, when the Xerox Palo Alto Research Center (PARC) installed the first Ethernet, a program cruised the network at night, commandeering idle computers for CPU-intensive tasks.

This early cycle recycler was the creation of John F. Shoch and Jon A. Hupp, who called it a "worm". A later scavenger system called Condor, developed by Miron Livny and his colleagues at the University of Wisconsin at Madison, is now running at more than a dozen universities and other sites. Condor roams within clusters of Unix workstations, usually confined to a single laboratory or department. Once an idle CPU is found, it is delegated a task from a list defined beforehand. 18

The first Internet DC endeavor was organized in 1988 by Arjen K. Lenstra (now of Citibank) and Mark S. Manasse of the DEC Systems Research Center in Palo Alto. They and their colleagues had written software to distribute factoring tasks among workstations within the DEC laboratory, and they extended this system so that computers elsewhere could contribute to the effort. Mathematical computation is ideally suited to the Internet: computers are told what to do, they do it, and then they send back the results. 19
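
The pattern is easy to sketch. In the toy version below (the slice count and the number being factored are arbitrary, and a real effort would hand each slice to a different volunteer machine), a coordinator splits the candidate divisors into independent slices and merges whatever the workers report back:

    def worker(n, start, stop):
        # Each slice of candidate divisors can be checked independently.
        return [d for d in range(start, stop) if n % d == 0]

    def factor_distributed(n, slices=4):
        # Coordinator: split the range 2..sqrt(n), farm it out, merge results.
        limit = int(n ** 0.5) + 1
        step = max(1, (limit - 2) // slices + 1)
        found = []
        for start in range(2, limit, step):
            found += worker(n, start, min(start + step, limit))
        return found

    # 2491 = 47 * 53; this prints [47] (the cofactor lies above the square root).
    print(factor_distributed(2491))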

RC-5

RSA Data Security specializes in encryption software. They claim to produce ciphers that are next to impossible to break, and to prove it to the world, they offer cash prizes to anyone who can.

In one challenge the message was encoded with DES, the Data Encryption Standard, a cipher developed in the 1970s under U.S. government sponsorship. The key that unlocks a DES message is a binary number of 56 bits. The task of finding the key was undertaken by an Internet collaboration called DESCHALL. The leaders of the group devised a screen-saver scheme: whenever a computer was inactive, its screen saver would activate a program that downloaded a list of keys and tried them all. When the computer finished the list, it would report its results and download a new one. In June of 1997 they won the $10,000 prize.
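
The client's inner loop can be sketched as follows. This fragment substitutes a toy 16-bit XOR cipher for DES, purely as an assumption for illustration (the real search covered a 56-bit keyspace, and the secret key here is arbitrary), but the fetch-a-block, try-every-key, report structure is the same:

    def toy_encrypt(data, key):
        # Stand-in for DES: XOR each byte with half of a 16-bit key.
        # XOR is its own inverse, so the same function also decrypts.
        return bytes(b ^ ((key >> (8 * (i % 2))) & 0xFF) for i, b in enumerate(data))

    def search_block(ciphertext, known_plaintext, first_key, block_size):
        # Try every key in one downloaded block; report a hit or nothing.
        for key in range(first_key, first_key + block_size):
            if toy_encrypt(ciphertext, key) == known_plaintext:
                return key
        return None

    secret = 0xBEEF
    plaintext = b"The unknown message is..."
    ciphertext = toy_encrypt(plaintext, secret)

    # The client loop: fetch a block of keys, test them, report, repeat.
    for first_key in range(0, 1 << 16, 4096):
        hit = search_block(ciphertext, plaintext, first_key, 4096)
        if hit is not None:
            print(f"key found: {hit:#06x}")  # prints: key found: 0xbeef
            break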

Another RSA Challenge was to crack their RC-5 encryption. After the notoriety of the DES crack, people were eager to win the new prize. By the end of the second contest, some 4,000 active teams were processing 7 billion keys per second, a rate equivalent to the work of 26,000 Pentium computers. 20

SETI@Home

SETI (the Search for Extra-Terrestrial Intelligence) analyzes radio signals from other star systems, looking for signs of intelligence. It does so with a huge radio telescope in Puerto Rico (yes, the same one as in the movie Contact). Recordings of the sky are run through a series of supercomputers and analyzed for any variations. The signals are so weak that incredibly complex processing must be used, which takes computer power that SETI can't really afford.

In 1996, David Gedye proposed harnessing the power of the Internet to advance SETI's search. Over the next two years, he and a team of students and scientists developed the SETI@Home program. It is similar in concept to the RC-5 endeavor: when a computer is idle, a screen saver starts processing a small chunk of signal received from SETI; when the analysis is over, the results are sent back and a new chunk is received. Over 1 million people have joined, and at any time over 140,000 computers are working at once, world-wide. 21
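
What "processing a chunk" involves can also be sketched. The toy version below makes loose assumptions (synthetic data and a naive Fourier transform, where the real client uses fast FFTs and far subtler tests), but it shows the essence: scan a work unit for any frequency whose power stands out above the noise.

    import math
    import random

    def power_at(samples, freq_bin):
        # Naive discrete Fourier transform power at one frequency bin.
        n = len(samples)
        re = sum(s * math.cos(2 * math.pi * freq_bin * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * freq_bin * i / n) for i, s in enumerate(samples))
        return (re * re + im * im) / n

    def analyze_chunk(samples, threshold=10.0):
        # Report frequency bins whose power stands well above the noise floor.
        return [k for k in range(1, len(samples) // 2) if power_at(samples, k) > threshold]

    # A synthetic work unit: random noise plus a faint tone hidden in bin 7.
    random.seed(1)
    chunk = [random.gauss(0, 1) + 0.8 * math.sin(2 * math.pi * 7 * i / 256) for i in range(256)]
    print("candidate signals in bins:", analyze_chunk(chunk))  # the tone in bin 7 stands out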

It is a delicious irony that little green men might be found by a screen saver. What could be the greatest discovery in history is also the forebear of one of the most important technological breakthroughs of modern times.

Future Possibilities

Weather Forecasting

How often does the weather person correctly predict the day's weather? How often have you been caught without your umbrella? It is possible to correctly predict weather patterns with a computer, but it would take a very large computer: Janus, the supercomputer in New Mexico, needs a week of computation to predict less than an hour of weather. The Internet could conceivably handle a month. The only limitation holding us back is the speed of inter-connectivity.

Browser Inclusions

Internet Explorer 4.0 revolutionized the web browser in 1997 with its convenient inclusion of a side-bar search window. It didn't provide users with any new technology; it just made things a bit more convenient. What if SETI@Home or an RC-5 project were integrated into version 6.0?

The idea of the Internet-optimized computer is coming to fruition with products such as the iMac. Mac OS 9.0's Sherlock technology integrates a browser and search engine directly into the system software. Using a distributed computing scheme, all the Sherlock software connected to the Internet could be put to work at once to aid the search process.
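
One way to picture such a scheme is a scatter-gather search: a query is sent to every connected machine, each searches whatever it holds locally, and the answers are merged. The peers and their contents below are entirely hypothetical:

    def local_search(index, query):
        # Each machine searches its own local index (a toy stand-in here).
        return [title for title in index if query.lower() in title.lower()]

    # Hypothetical peers, each holding a slice of the world's content.
    peers = [
        ["Marconi and the Wireless", "A History of Cable"],
        ["Packet Switching Explained", "The ARPANET Story"],
        ["Rendering Toy Story", "Searching for ET"],
    ]

    # Scatter the query to every connected machine, then gather the results.
    results = [hit for index in peers for hit in local_search(index, "story")]
    print(results)  # ['The ARPANET Story', 'Rendering Toy Story']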

3-D Rendering

Toy Story, the first digitally created feature-length movie, is roughly seventy minutes long and comprises approximately 110,000 frames. Forty-six days of continuous processing on 117 Sun workstations were needed for the final rendering of the project. The computers were networked together using Ethernet, and a central computer delegated work to the rest. 22
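
That delegation pattern is simple to sketch: a central queue of frame numbers is drained by a pool of workers. The renderer below is a stand-in function, and the 117-worker figure simply mirrors the farm described above:

    from concurrent.futures import ThreadPoolExecutor

    def render_frame(frame_number):
        # Stand-in for the real renderer, which would take hours per frame.
        return f"frame_{frame_number:06d}.img"

    # Central delegation: idle workers pull the next frame number off the queue.
    with ThreadPoolExecutor(max_workers=117) as farm:
        finished = list(farm.map(render_frame, range(200)))  # 200-frame demo

    print(f"{len(finished)} frames rendered, ending with {finished[-1]}")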

Complex 3-D models occupy far less storage than the frames rendered from them. Although a screen-saver-type application would not be ideal for complex rendering, a dedicated application would be. If Pixar paid a person a quarter for every frame rendered, the whole film would cost just $27,500.

Unfortunately, most personal computers could not do the job in time. But who knows what the future holds.

A Scalable Future

Ventures like SETI@Home and RC-5 have proven that the Internet holds vast potential for distributed computing. Once communication speeds improve, the Internet could become, in effect, a large parallel-processing supercomputer.

With personal computer technology advancing the way it is, along with the progression of networking, the Internet could provide more power than any computer yet conceived. For now, however, that future is overshadowed by the spread of information, education, and connectivity.

The global network (cable, satellite, wireless, telephone, LAN, etc.) makes it possible to deliver information of any sort across the world in an instant. Future possibilities in collective computing are exciting, but not feasible for some time. Distributed computing and collective education seem to be the best use of existing networks...for now.


Footnotes


1. Communications History: The United States, http://www.stockton.edu/~gilmorew/0amnhist/comuhis1-1.htm#Outline

2, 6. Bittner, John R. "Broadcasting and Telecommunication: An Introduction" (London, Prentice Hall, 1991)

3. Fang's Media History Timeline, http://www.mediahistory.com/time/alltime.html

4. A History of Modern Communications, Computing, and Media, http://www.acclarke.co.uk/

5, 7. Murphy, Brian. "The World Wired Up" (London, Comedia Publishing Group, 1983)

8. Negroponte, Nicholas. "Being Digital" (New York, Vintage Books, 1995)

9, 10. Coulouris, George. "Distributed Systems: Concepts and Designs" (Addison-Wesley Publishing Company, 1994)

11. The Internet: Past, Present and Future - Internet & WWW History, http://www.vissing.dk/inthist.html

12. My father used to say that.

13. Computing Science - Collective Wisdom, http://www.amsci.org/amsci/issues/Comsci98/compsci1998-03.html

14. Quote from a web article with no name.

15. Time Warner Press Release: October 12, 1996

16. According to a CNN Report.

17. http://www.mainem.co.uk.html

18. Anderson, John A. "Multiple Processing: A Systems Overview" (Cambridge, Prentice Hall, 1989)

19. Ceruzzi, Paul E. "A History of Modern Computing" (MIT Press, 1998)

20. Mersenne Prime Search, http://www.mersenne.org/prime.htm

21. SETI@home - Search for Extraterrestrial Life, http://setiathome.ssl.berkeley.edu/; Derived statistics for SETI@home, http://www.roving-mouse.com/setiathome/.html

22. http://www.pixar.com/toystory

 

Other Sources

Crouch, Colin. "Reinventing Collective Action" (Blackwell Publishers, 1995)

Forbes ASAP (0222) - Massive, Parallel Processors, http://www.forbes.com/asap/99/0222/060b.htm

Stoll, Clifford. "Silicon Snake Oil" (New York, Anchor Books, 1995)

Wand, Ian. "Computing Tomorrow" (Cambridge, Cambridge University Press, 1996)