A Brief History of Electrical Technology
Part 8: Networking

by piero scaruffi | Contact/Email | Table of Contents

Timeline of Computing | Timeline of A.I.
The Technologies of the Future | Intelligence is not Artificial | History of Silicon Valley
(Copyright © 2016 Piero Scaruffi and all pictures should be public domain)


The Visible Hand of Government

(Copyright © 2016 Piero Scaruffi)

The visible hand of government was at work again, and one of its decisions was influential in the development of computer networks, although it is not as famous as the creation of the Arpanet.

Until it was broken up in 1984, AT&T enjoyed a legal monopoly over telephony in the USA. AT&T interpreted this as a mandate for all telephone owners to purchase telephone equipment only from AT&T, and it was ready to disconnect from the telephone network any user who dared install telephone accessories from other vendors. In 1928 Hush-A-Phone of New York had introduced a device that fit over the mouthpiece of the telephone to protect against eavesdroppers and increase privacy, i.e. (quote) "a voice silencer designed for confidential conversation". AT&T discovered the Hush-A-Phone after it had been on the market for many years and moved against the small firm. The FCC initially sided with AT&T, but in 1956 a federal appeals court ruled in favor of Hush-A-Phone. This decision would open the floodgates of third-party telephone accessories, from answering machines to modems.

Thomas Carter's Carter Electronics in Texas began selling the Carterfone in 1959, a device that connected the two-way radio used on oil rigs (a progenitor of the walkie-talkie) to the telephone network so that oil workers at sea could make and receive phone calls. A similar dispute ended up at the FCC, which in 1968 ruled in favor of the Carterfone, another victory for third-party vendors.

Then came Robert Weitbrecht, an engineer at SRI (ironically a man who was born deaf and nonetheless a ham-radio amateur).

In 1963 he invented the acoustic coupler, a device that converted sound from the telephone's earpiece into electrical signals for a teletype and, vice versa, converted electrical signals from the teletype into sound for the telephone's mouthpiece. This device enabled any firm to get around the AT&T telephone monopoly and connect any electronic device to the telephone network, except that, of course, the connection was acoustic, not electrical.

Finally, in 1966 the FCC started a lengthy "computer inquiry" into AT&T's business. The question was what kind of relationships were admissible between the computer business and the telecommunications business. As a result of this "computer inquiry", in 1971 the FCC confirmed AT&T's telephone monopoly but banned AT&T from entering the computer business. Without this decision, AT&T, which owned the entire long-distance network of the USA, would have dominated online computer services.

The history of computers and networks would have been very different if a monopoly had been able to control the development of data telecommunications.


Networking Computers

(Copyright © 2016 Piero Scaruffi)

At the same time that microprocessors were revolutionizing the concept of a computer, gigantic progress was underway in the field of networking, although its impact would be felt only decades later. In 1972 Ray Tomlinson at Boston's consulting firm Bolt, Beranek and Newman (the firm that had been redirected towards computer networks by Licklider) invented e-mail for sending messages between computer users, and also established the convention of writing the user name and the computer name separated by a "@".
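Tomlinson's convention is still with us: the text before the "@" names the user, the text after it names the machine. A one-function sketch in Python (the function name is ours, not Tomlinson's):

```python
def parse_address(address: str) -> tuple[str, str]:
    """Split an e-mail address into (user, host) at the last '@',
    following the convention Tomlinson introduced in 1972."""
    user, _, host = address.rpartition("@")
    return user, host

user, host = parse_address("tomlinson@bbn-tenexa")
# user is "tomlinson", host is "bbn-tenexa"
```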

Groupware predates the use of computers: during the Cold War there arose the need to reach consensus among a group of geographically-distributed military experts. Norman Dalkey and German-born mathematician Olaf Helmer developed the Delphi method at RAND Corporation as a tool to quiz experts only via questionnaires, without the experts confronting each other, in order to bring out their reasoning, not their dialectical skills. The method was first described in the top-secret report "The Use of Experts For the Estimation of Bombing Requirements" (1951) to simulate the thinking of a Soviet strategist determined to strike the USA with nuclear bombs: experts were asked to determine what would be the optimal target and the optimal number of atomic bombs. It was later formalized in Helmer's paper "On the Epistemology of the Inexact Sciences" (1958), a treatise that followed the trend established by Rudolf Carnap's "Logical Foundations of Probability" (1950) and Leonard Savage's "Foundations of Statistics" (1954) to study the way humans make decisions in the face of uncertainty. As Helmer explained, in the exact sciences explanation and prediction have the same logical structure, but that is not true in the "inexact science" that presides over strategic decisions about complex problems. Simulation and gaming become more important than logical deduction. Up to this point Delphi was a purely manual procedure. (In 1968 Helmer left RAND to co-found the Institute for the Future).
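The iterative convergence at the heart of the Delphi method can be sketched in a few lines of Python. The halfway-adjustment rule below is invented for illustration; it is not RAND's actual procedure, which used anonymous questionnaires and written justifications:

```python
import statistics

def delphi_round(estimates):
    """One round of a toy Delphi process: each expert learns the
    group median and revises halfway toward it, without any
    face-to-face debate."""
    median = statistics.median(estimates)
    return [e + 0.5 * (median - e) for e in estimates]

# Four experts give wildly different initial estimates:
estimates = [10.0, 40.0, 25.0, 80.0]
for _ in range(3):
    estimates = delphi_round(estimates)
# after a few rounds the estimates cluster around the group median
```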

Murray Turoff, an astrophysicist who in 1964 at Brandeis University had implemented a computer simulation of a planetary nebula in Fortran on an IBM 704 and in 1965 at the Institute for Defense Analysis (IDA) in Virginia had worked on one of the earliest computerized anti-missile systems (the Nike X), joined the newly established Office of Emergency Preparedness in 1968 with the task of designing an online computer system for national emergencies. In 1970 he implemented a system on a Univac 1108 for a 13-week nation-wide online (i.e. virtual) Delphi discussion among 20 experts. In 1971 this system evolved into the Emergency Management Information System And Reference Index (EMISARI), running on the same Univac, a pioneering "groupware" that offered services for messaging, conferencing and collaboration. Its "Party Line" was the precursor of "chat" systems.

The first public computerized "bulletin board" system was set up in 1973 in a record store in Berkeley by activists (including Lee Felsenstein) who had access to the university's time-sharing machine.

Groups of computer users were already working together, sharing "notes" in files that were accessible to the whole network. The most popular of these early note-sharing systems (later called "groupware") was perhaps PLATO Notes, written in August 1973 by University of Illinois student David Woolley, originally to keep track of software "bug" reports on PLATO, the mainframe-based time-sharing system developed at the university's Computer-based Education Research Laboratory (CERL) to host educational software and running on CDC hardware. As the PLATO time-sharing system (hosted on one CDC mainframe at the CERL) spread to more and more organizations, PLATO Notes, renamed Group Notes in January 1976, rapidly evolved into an online community discussing a broad range of topics. Commercialized in 1975 by Control Data Corporation in Minneapolis simply as Plato, it spread around the world.

EIES (Electronic Information Exchange System), implemented in 1975 by Murray Turoff, now at the New Jersey Institute of Technology, provided its customers with electronic discussion boards.

These systems and their bulletin boards were a strong motivation for AT&T to develop faster modems.

Taylor at Xerox PARC promoted the same idea of a distributed network of computers that ARPA had promoted with the Arpanet. PARC, however, was thinking not of big university computers but of small computers in the same building, and the main purpose was not to survive a nuclear attack but simply to share the same laser printer. Bob Metcalfe was a mathematician who had worked on Project MAC at the MIT and on the Alohanet of the University of Hawaii. In 1973 he coined the term "Ethernet" for a local-area network that would connect computers using special cables and adapters instead of the telephone lines used by the Arpanet. Unlike the Arpanet, which was very slow, the Ethernet had to be very fast to match the speed of the laser printer. The first experimental Ethernet was operational at PARC by the end of 1973, and Metcalfe and David Boggs published the design in 1976. The Ethernet was going to cause an economic revolution in the world of computing: ever since 1950, when Herb Grosch had formulated his "Grosch's law" (computing power increases roughly as the square of the cost), the world had believed that it was more cost-effective to purchase one big computer than to connect several small ones. The Ethernet turned that law upside down and created the conditions for a seismic change in the way information-technology budgets were allocated. Metcalfe, for his part, held that the value of a network of devices grows with the square of the number of connected devices, because that is how the number of possible pairwise connections grows. This came to be known as "Metcalfe's law".
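The two laws can be contrasted in a toy calculation (the dollar figures below are arbitrary illustrations, not historical prices):

```python
def grosch_value(cost: float) -> float:
    """Grosch's law (1950): computing power grows roughly as the
    square of the cost, favoring one big machine."""
    return cost ** 2

def metcalfe_value(n_devices: int) -> int:
    """Metcalfe's law: a network's value grows with the number of
    distinct pairs of devices that can talk to each other,
    n*(n-1)/2, i.e. roughly the square of n."""
    return n_devices * (n_devices - 1) // 2

# Under Grosch's law, one $100 machine out-powers ten $10 machines:
big, small = grosch_value(100), 10 * grosch_value(10)   # 10000 vs 1000
# But the ten networked machines form 45 pairwise connections, and
# that number keeps growing quadratically as devices are added.
pairs = metcalfe_value(10)                              # 45
```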

IBM went in the opposite direction: realizing that its customers increasingly wanted to connect multiple terminals to their mainframes, in 1974 it introduced the proprietary Systems Network Architecture (SNA). It was not a distributed system: it was just a way to connect many terminals to a mainframe.

DEC responded in the same year with its own proprietary architecture, DECnet, which instead offered a distributed model to connect multiple PDP-11 minicomputers (initially only two, point to point, and only if they ran the RSX-11 operating system, but later any number of time-sharing systems running RSTS or TOPS). Influenced by the Arpanet, it was one of the first commercial peer-to-peer network architectures. In 1975 DECnet added a feature that let a computer access files on other computers (DEC's Data Access Protocol, or DAP).

The Arpanet had grown to 2,000 users in 1973. That year the first international connection was established, to the University College of London. In May 1974 Vinton Cerf of Stanford University and Bob Kahn of DARPA (the man behind the stellar demo of 1972) published the Transmission Control Protocol (TCP), which would become the backbone of Arpanet transmission: it enabled a computer to communicate with any other computer, regardless of its operating system and of its network. It was later split into two parts: the Transmission Control Protocol (TCP) proper and the Internet Protocol (IP). Cerf settled on a 32-bit address space: 8 bits to identify up to 256 networks (he estimated two networks for each of 128 countries) and 24 bits to identify up to 16 million time-sharing machines on each network. (The pool of 32-bit addresses would begin to run out in 2011, prompting the transition to the larger addresses of IPv6.) The name "Internet" would replace "Arpanet" only in 1984, after the military sites got their own network, the Milnet.
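Cerf's partition of the 32-bit address can be illustrated with a little bit arithmetic (the constant and function names below are ours):

```python
NETWORK_BITS = 8    # up to 256 networks: two per each of 128 countries
HOST_BITS = 24      # up to 16,777,216 hosts per network

def split_address(addr: int) -> tuple[int, int]:
    """Split a 32-bit address into (network, host) fields, with the
    field widths of Cerf's original estimate."""
    return addr >> HOST_BITS, addr & ((1 << HOST_BITS) - 1)

# The high byte is the network, the remaining three bytes the host:
network, host = split_address(0x0A_00_00_01)
# network is 10, host is 1 (the dotted notation 10.0.0.1 came later)
```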

The Internet's three basic operations were becoming clearly defined: logging into a remote computer (the "telnet" command, proposed in 1969 by Steve Carr of the University of Utah), transferring files between computers (the "ftp" command, using the File Transfer Protocol designed by Abhay Bhushan at the MIT in 1971) and messaging between users (the "email" command).

Up to this point the most ambitious projects for computer networking had been funded by DARPA: Project MAC at MIT, Doug Engelbart's NLS at SRI, and the Arpanet at BBN.

TCP was not the only option. In fact, the service that popularized packet-switching technology in the age of analog telephone lines was X.25, a standard published in March 1976 by the CCITT, the standards committee of the International Telecommunication Union (ITU). Packet-switching consisted of breaking up each transmission into packets, which allowed X.25 to carry multiple virtual "calls" over a single line. For the telephone companies that sponsored the standard it meant increased capacity. An indirect benefit was the ability to carry digital data over their analog telephone lines. The first large X.25 public network was Telenet (1974), established by Bolt Beranek and Newman (BBN), basically to create a for-profit version of the non-profit Arpanet (which BBN was contracted by the government to run).

In 1978 the British Post Office, Western Union and Tymnet collaborated to create the first international packet-switched network, the International Packet Switched Service (IPSS), basically an X.25 evolution of Tymnet (offered by Cupertino's Tymshare since 1971). In 1982 the Post, Telephone and Telecommunication (PTT) Bureau of France debuted what was to remain arguably the most popular of X.25 services (and of all videotext services before the World-wide Web): Minitel, an online service that subscribers could access via their home telephone line and a small dedicated terminal that displayed text on its screen. Unlike the Arpanet, X.25 could be used to carry out commercial activities.

Electronic commerce between businesses already existed, but had never been standardized. In 1968 the railways had pioneered a standard for their paper documents. In 1975 that consortium, the Transportation Data Coordinating Committee (TDCC), came up with a set of rules for exchanging electronic documents. Its name was later changed to Electronic Data Interchange Association (EDIA), and Electronic Data Interchange (EDI) became the name for electronic commerce.

Electronic commerce was, instead, forbidden within the Arpanet, which in theory was reserved for ARPA-funded research projects.

These wide-area networks worked over traditional telephone lines. In April 1973 Martin Cooper at Motorola demonstrated the first portable, wireless or "cellular" telephone.

Radio Frequency Identification (RFID), an evolution of the radar technology used in World War II and perfected at the Los Alamos National Laboratory during the 1970s, was turned into an industrial product by Charles Walton, a former IBM scientist who had founded Proximity Devices in Sunnyvale in 1970 and who in 1973 designed a "portable radio frequency emitting identifier".


Knowledge-based Systems

(Copyright © 2016 Piero Scaruffi)

During the 1970s the knowledge-based approach of "expert systems" dominated Artificial Intelligence. Bruce Buchanan's Mycin (1972) at Stanford for medical diagnosis, Hayes-Roth's Hearsay-II (1975) at Carnegie Mellon University for speech-recognition, and John McDermott's Xcon (1978) at Carnegie Mellon University for product configuration were the most influential projects. Several innovations in knowledge representation enabled them: Ross Quillian's semantic networks (1966) at Carnegie Mellon University, Minsky's frames (1974) at the MIT, Roger Schank's scripts (1975) at Yale University, Barbara Hayes-Roth's blackboards at Stanford University (1984), etc.
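The inference loop shared by these expert systems can be sketched as a forward-chaining rule interpreter. This is a toy illustration: Mycin's actual rules carried certainty factors and its reasoning was backward-chaining, and the sample rules below are invented, not taken from any of these systems:

```python
def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises are all known facts,
    adding their conclusions, until nothing new can be derived --
    the basic loop of a rule-based knowledge system."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Invented medical-style rules for illustration:
rules = [
    (["fever", "cough"], "infection"),
    (["infection"], "treat-infection"),
]
facts = forward_chain(["fever", "cough"], rules)
# derives "infection", which in turn derives "treat-infection"
```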

Learning from examples and analogies was becoming a popular subject: Patrick Winston at the MIT (his thesis "Learning Structural Descriptions from Examples", 1970), Doug Lenat's Automated Mathematician (1976) at Stanford, Tom Mitchell at Stanford ("Version Spaces - A Candidate Elimination Approach to Rule Learning", 1977), Frederick Hayes-Roth at the RAND Corporation ("The Role of Partial and Best Matches in Knowledge Systems", 1978), etc. Carl Hewitt's Planner (1969) at the MIT and the STRIPS planner (1971), developed by Richard Fikes and Nils Nilsson at SRI for Shakey the Robot, were some of the reasoning architectures for a knowledge base. Drew McDermott's non-monotonic logic (1979) at Yale University and David Marr's theory of vision (1979) at the MIT contributed ideas. Intellicorp, the first major start-up for Artificial Intelligence, was founded in Silicon Valley in 1980. Alain Colmerauer created the PROLOG programming language (1972) to program inference rules in a non-sequential manner, an alternative to the sequential programming of traditional machine languages.

Japan was already fascinated with robots. In 1973 Ichiro Kato's team at Tokyo's Waseda University introduced the anthropomorphic robot Wabot-1.

John Holland at the University of Michigan (a student of Arthur Burks, and in 1959 the first student in the world to obtain a PhD in computer science) introduced a different way to construct programs, "genetic algorithms" (1975), and in 1976 introduced classifier systems, which are reinforcement-learning systems. The school of neural networks, meanwhile, was in disarray.
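Holland's combination of selection, crossover and mutation can be sketched on a toy problem (a bare-bones illustration on the standard "OneMax" exercise of maximizing the number of 1-bits, not Holland's original formulation):

```python
import random

def evolve(pop_size=20, genome_len=16, generations=60, seed=1):
    """Minimal genetic algorithm: fitness is the number of 1-bits,
    the fitter half survives, and children are made by one-point
    crossover plus a single-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=sum, reverse=True)        # fitness = count of 1s
        parents = pop[:pop_size // 2]          # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genome_len)       # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=sum)

best = evolve()   # a genome with (nearly) all bits set to 1
```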

Natural Language Processing was becoming an independent branch within Artificial Intelligence for which specific models were invented: Charles Fillmore's case grammar (1967) at Ohio State University, Roger Schank's conceptual dependency theory (1969) at Yale, William Woods' augmented transition networks (1970) at Harvard, etc. The systems that were built on top of these, such as Terry Winograd's SHRDLU (1970) at the MIT and Woods' LUNAR (1973), were limited to narrow domains and short sentences.

In 1970 Thomas Martin founded Threshold Technology in New Jersey, which developed the first commercial speech-recognition product, the VIP-100.

The most influential schools for speech recognition were Raj Reddy's team at Carnegie Mellon University, which produced Hearsay II and Jim Baker's Dragon, and Fred Jelinek's team at IBM. Baker and Jelinek adopted a statistical method.

In some cases A.I. was useful to push the limits of hardware. For example, DEC's 36-bit computer PDP-6, introduced in 1964, was influenced by the demands of MIT's AI Lab. The time-sharing TOPS-20, released by DEC in 1976, was a commercial version of TENEX, developed in 1969 at BBN (by Daniel Murphy, Ray Tomlinson and Daniel Bobrow) to add virtual memory to the PDP-10 for internal A.I. projects; and the improved PDP-10 of 1973 on which it ran owed something to the Stanford AI Lab's custom PDP-10.

