A Brief History of Electrical Technology
Part 7: The Microprocessor

by piero scaruffi | Contact/Email | Table of Contents

Timeline of Computing | Timeline of A.I.
The Technologies of the Future | Intelligence is not Artificial | History of Silicon Valley
(Copyright © 2016 Piero Scaruffi and all pictures should be public domain)


The Microprocessor

(Copyright © 2016 Piero Scaruffi)

A microprocessor is a programmable integrated circuit that implements the central processing unit (CPU) of a computer on a single chip: colloquially, a "computer on a chip". It had been theoretically possible for years to integrate the CPU of a computer on a chip. It was just a matter of perfecting the technology.

Lee Boysel joined Fairchild in 1966. In 1967, at a time when computer memories made of transistors were still a rarity, he wrote a memo in which he argued that it was possible to use MOS chips (a ROM, a DRAM and an ALU, or arithmetic logic unit) to build the equivalent of a 32-bit mainframe computer. In 1968 he quit and started his own company, Four-Phase Systems, to complete that project.

In 1970 he demonstrated a 24-bit minicomputer (code-named System/IV) made from nine MOS chips (three ALUs, three ROMs and three DRAMs). This was the first all-semiconductor computer because even the memory was a semiconductor memory. He then built the first microprocessor out of one of these chips: the 8-bit AL1.

Viatron, founded in Boston in 1967, shipped a MOS-based computer in 1969: the 16-bit desktop System 21, which came in two models (2140 and 2150), although it still used magnetic-core memory. Trivia: Viatron had introduced the term "microprocessor" in 1968, although it was referring to its 2101 terminal.

Texas Instruments' first step towards creating a computer on a chip came in March 1970 with the introduction of the first complete 4-bit ALU on a single chip, the SN74181 integrated circuit (followed in October 1970 by Fairchild's pin-compatible equivalent, the 9341).

The next step came from a custom project. Computer Terminal Corporation or CTC (later renamed Datapoint) was founded in 1968 in Texas (in San Antonio) by Phil Ray and Gus Roche, who had worked on NASA projects at General Dynamics. In 1969 they debuted the Datapoint 3300 video terminal, the first serious competitor for the Teletype 33.

For their next terminal (conceived as a programmable desktop terminal) they asked both Intel and Texas Instruments to deliver an 8-bit MOS chip, but they ended up rejecting both. The reason was simple: TTL integrated circuits were still faster than MOS chips. CTC went on to build the 8-bit processor of the Datapoint 2200 out of TTL integrated circuits. This was, de facto, a desktop computer (first delivered in April 1971) because it could be programmed in high-level languages such as BASIC.

Nonetheless, Texas Instruments carried out the project under the supervision of Gary Boone, who had joined the company in 1969 and who in 1971 delivered both the TMX-1795, the world's first 8-bit microprocessor, and, by July 1971, the TMS-0100, the world's first single-chip microcontroller. The TMX-1795, however, was never sold commercially: at the time Texas Instruments specialized in building custom chipsets for manufacturers of desktop calculators such as Olivetti. The TMS-0100, on the other hand, was truly a "computer on a chip" because it included all the functions of a computer on a single block of silicon. A microprocessor is just the CPU of a computer, whereas a microcontroller includes CPU, memory and input/output systems. A microprocessor is typically used to build a computer; a microcontroller is typically embedded in some other object, whether a microwave oven or an automobile part.

Meanwhile, Intel was also working on another custom project. Busicom was a Japanese manufacturer of calculators that in January 1971 would introduce the world's first pocket calculator, the LE-120A Handy. In April 1970 Busicom sent its engineer Masatoshi Shima to Intel for six months to design a MOS chip for its new line of calculators, working with Ted Hoff's team, which had expertise in silicon-gate MOS technology.

In 1970 Hoff hired Federico Faggin, who had developed silicon-gate MOS technology at Fairchild. Faggin implemented Hoff's design of a 4-bit CPU in silicon, and in November 1971 Intel unveiled the 4004, a thumbnail-sized electronic device containing 2,300 transistors, spaced by 10,000-nanometer gaps, and capable of processing 92,000 instructions per second. The 4004 replaced what would normally have been six specialized chips. Intel's dominant business was memory chips, and the design of the 4004 reflected that fact: Intel's designers partitioned the world of computing into RAM, ROM and CPU (central processing unit). Intel's microprocessor was actually not a computer on a chip, just a CPU on a chip. Nonetheless, Intel's tiny 4004 was as powerful as the ENIAC, but millions of times smaller and ten thousand times cheaper. It also implemented subroutines at the hardware level, via a "last-in first-out" stack.
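
To make the stack mechanism concrete, here is a minimal sketch in Python (not 4004 code; the class, depth and addresses are hypothetical, chosen only for illustration) of how a last-in first-out stack supports nested subroutine calls: each call pushes a return address, and each return pops the most recently pushed one. Hardware stacks of that era were only a few levels deep.

    class CallStack:
        """Toy model of a hardware return-address stack."""
        def __init__(self, depth):
            self.depth = depth          # maximum nesting the hardware allows
            self.frames = []

        def call(self, return_address):
            if len(self.frames) == self.depth:
                raise OverflowError("subroutine nesting too deep")
            self.frames.append(return_address)   # push (last in...)

        def ret(self):
            return self.frames.pop()             # ...first out

    stack = CallStack(depth=3)
    stack.call(0x120)         # main program calls subroutine A
    stack.call(0x200)         # subroutine A calls subroutine B
    print(hex(stack.ret()))   # 0x200: B returns, execution resumes inside A
    print(hex(stack.ret()))   # 0x120: A returns, execution resumes in the main program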

Intel delivered the first working 4-bit chip to Busicom in March 1971. Texas Instruments delivered the 8-bit TMX-1795 to Datapoint in July. But the TMX-1795 was announced in March, whereas the Intel 4004 wasn't announced until November. Yet the 4004 was the microprocessor that changed the history of computing.

At the same time Intel was also working on the 8-bit chip originally commissioned by Computer Terminal Corporation, code-named 1201. This was a completely different design from the 4004, and it became the 8008 microprocessor. The 4004 was designed for four-bit "binary-coded decimal" arithmetic, whereas the 8008 was designed for "eight-bit character" representation, and their instruction sets were accordingly different. By August 1972 Intel had ready the 8008, marketed as an 8-bit version of the 4004: a microprocessor whose 8-bit word could represent 256 distinct values, enough to encode the full ASCII character set (all ten digits, uppercase and lowercase letters, and punctuation marks).
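
The arithmetic behind that claim is simple and can be checked with a few lines of Python (purely illustrative, not period code): an 8-bit word has 2^8 = 256 possible values, more than enough for the 128-character ASCII set, so every ASCII character fits in a single byte.

    # 2**8 = 256 distinct values in one 8-bit word; 7-bit ASCII uses 128 of them.
    print(2 ** 8)
    for ch in ("A", "a", "7", "?"):
        print(ch, ord(ch), format(ord(ch), "08b"))   # character, code, 8-bit pattern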

Intel was not convinced that a microprocessor could be used to build a computer. It was up to Bill Pentz at California State University in Sacramento to prove the concept. In 1972 his team built the Sac State 8008, the first microcomputer, and helped Intel fine-tune the microprocessor for the task of building computers.

Intel's initial motivation for making microprocessors was that they could help sell more memory chips.

(In 1968 a reclusive electrical engineer, Gilbert Hyatt, had founded Micro Computer in the Los Angeles region and filed for a patent on what would be known as the microprocessor, but never built one).

A few months earlier in 1971 Intel had introduced another important invention, the EPROM, developed by the Israeli-born engineer Dov Frohman. An EPROM (Erasable Programmable Read-Only Memory) is a non-volatile memory made of transistors that can be erased and reprogrammed. By making it possible to change the program stored alongside a microprocessor at will, it made microprocessor-based systems far more versatile. The Intel 1702 was the first EPROM chip. The EPROM would become the natural memory for firmware such as the BIOS of personal computers.

The 4004 and the 8008 had been produced in small quantities (the latter mainly as the basis for DEC's own processors), but in April 1974 Intel unveiled the 8080, designed at the transistor level by Shima, which lowered both the price and the complexity of building a computer while further increasing the power (290,000 instructions per second). The 8080 was a tiny device that was almost as powerful as a DEC minicomputer.

The microprocessor clearly unnerved the manufacturers of minicomputers: in 1974 DEC introduced the LSI-11, a single-board implementation of the PDP-11 built around a multi-chip 16-bit microprocessor.

Since the foundation of Shockley's lab, dozens of semiconductor companies had been founded in the Santa Clara Valley, many by former Fairchild engineers and managers. In 1972 the venture-capital firm Kleiner-Perkins, founded by Eugene Kleiner of Fairchild Semiconductor fame and former Hewlett-Packard executive Tom Perkins, opened offices in Menlo Park, followed by Don Valentine of Fairchild Semiconductor, who founded Capital Management Services, later renamed Sequoia Capital. That year the journalist Don Hoefler popularized the term "Silicon Valley".


Xerox PARC

(Copyright © 2016 Piero Scaruffi)

Xerox, as the New York-based photocopier company was known after 1961, realized that its sales force was inside the very offices where IBM was selling its lucrative mainframes. Since it had no computer technology of its own, Xerox acquired Scientific Data Systems (SDS) of Los Angeles. SDS had briefly surpassed DEC thanks to an advanced technology that was preferred for time-sharing and already boasted virtual memory. Xerox, however, was more interested in the mainframe line, the Sigma, which turned out to be a flop (the division would be sold to Honeywell in 1975). At the same time, in 1970, Xerox also set up the Palo Alto Research Center (PARC) to fund more basic research and hired Bob Taylor, the former director of ARPA's IPTO, to run the research on computers.

A year earlier, Xerox scientist Gary Starkweather had invented the laser printer; he soon moved to PARC to continue his work on what he called the SLOT (Scanned Laser Output Terminal).

In 1971 Taylor hired Alan Kay, a pupil of computer-graphics pioneer Ivan Sutherland in Utah (a group that Taylor had promoted when he was at IPTO), who was then a visiting scholar at Stanford's AI Lab (SAIL). Kay's vision was to bring computing to schools, and therefore to make computers smaller and easier to use. His 1968 thesis at the University of Utah (implemented in ALGOL on a Univac 1108) had been a system called FLEX that (quote) "merged hardware and software" and aimed for "an interactive man-machine dialog". The programming language was based on ALGOL 60 and was "semantic" rather than purely syntactic, i.e. it could be used for describing and executing a whole class of programming languages (FLEX was in fact written in itself).

Kay was influenced by McCarthy's LISP (1960), which treated both data and procedures as objects; by Sutherland's Sketchpad (1963), in which objects were described in abstract terms ("master drawings") and then instantiated as actual drawings; by Cliff Shaw's JOSS (1963), one of the friendliest human-machine interfaces; and by Seymour Papert's Logo (1967), the MIT environment for teaching children programming. Ideas from Sketchpad had percolated into Simula, a programming language for simulations defined in Norway by Kristen Nygaard and Ole-Johan Dahl (Sketchpad's masters and instances were recast as activities and processes in Simula), and Simula became the foundation for Kay's new language. In 1971 Kay prototyped his "object-oriented" method of programming in a user-friendly software development tool called Smalltalk.

The synthesis of all of this was the Dynabook that he described in 1972.
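
The "master drawing" idea that runs from Sketchpad through Simula to Smalltalk is easy to illustrate with a short sketch (written here in Python rather than Smalltalk, with a hypothetical Rectangle class chosen only as an example): the class plays the role of the master, defining structure and behavior once, while each instance is a concrete object with its own state.

    class Rectangle:
        """The 'master': structure and behavior defined once."""
        def __init__(self, x, y, width, height):
            self.x, self.y = x, y
            self.width, self.height = width, height

        def area(self):
            return self.width * self.height

        def move(self, dx, dy):       # behavior shared by every instance
            self.x += dx
            self.y += dy

    # Each instance is a concrete "drawing" with its own state.
    a = Rectangle(0, 0, 10, 5)
    b = Rectangle(3, 4, 2, 2)
    a.move(1, 1)
    print(a.x, a.y, a.area())   # 1 1 50
    print(b.x, b.y, b.area())   # 3 4 4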

Dan Ingalls developed most of the programming language in 1972, and in 1973 Smalltalk was ready for experiments in schools. Patrick Suppes, a philosopher of science at Stanford, had been running Computer Curriculum Corporation, a pioneering firm in interactive e-learning, since 1967. One of his students, Adele Goldberg, joined the PARC team. Teaching children how to program a computer was important because it changed the goal of computer science: not faster processing but better interaction.

PARC benefited from a 1970 "amendment" credited to senator Mike Mansfield that was meant to reduce DARPA's funding for pure research. This never happened (funding actually increased), but it was enough to convince many computer scientists in academia to join industrial research centers. Such had been the case with Taylor and Kay. In 1971 Taylor convinced most of the Berkeley Computer Corporation (the group that had tried to commercialize Berkeley's Project Genie) to join PARC, notably Butler Lampson, Chuck Thacker and Peter Deutsch (the former student who had ported LISP to the PDP-1). Other academics lured to PARC were Bob Metcalfe (MIT) and Charles Simonyi (UC Berkeley), all coming from some of the biggest recipients of DARPA funding. Another stroke of luck was a worsening schism at nearby SRI: some in Doug Engelbart's laboratory wanted to reimplement NLS as a distributed network à la Arpanet, whereas Engelbart preferred the time-sharing system. The result was that Bill English, Jeff Rulifson and others quit and joined PARC. In 1973 PARC also lured William Newman away from the University of Utah. PARC got in trouble in 1972 for two unrelated events.

In 1972 Charles Thacker led the development of MAXC, a PDP-10 clone that used integrated circuits for RAM instead of core memory; by cloning a DEC machine instead of adopting the Sigma 7 built by Xerox's own SDS group, PARC proved to management that it was not well integrated in Xerox's computer business. And Rolling Stone magazine published an article by Stewart Brand, titled "Spacewar", that hailed the hippie-like community of hackers at PARC, hardly the style favored at headquarters.

For a while the scientists at Xerox PARC enjoyed absolute freedom, and Taylor shaped an egalitarian environment that fostered creativity, with no dress code and no fixed work hours.


Gadgets

(Copyright © 2016 Piero Scaruffi)

The rapid increase in chip density enabled new electronic appliances outside the world of computing.

In 1970 Imlac, founded in 1968 in Boston, introduced the first low-cost graphical display system, the PDS-1, integrating a 16-bit minicomputer, a CRT monitor, a keyboard, a light pen and a separate control panel: basically what would be called a workstation in the 1980s.

In 1973 John Bergey of watch manufacturer Hamilton Watch in Pennsylvania built the first digital watch, the Pulsar, a watch with no moving parts and no hands. The idea spread rapidly all over the world, with prices constantly collapsing, especially after the Pulsar was featured in the James Bond movie "Live and Let Die" (1973).

Automatic Electronic Systems (AES), an electronics firm founded by Stephen Dorsey in Montreal (Canada), developed its own "microprocessor" (actually a set of 50 chips) and used it in 1973 to make its minicomputer, the AES-90: a "word processor" that combined a screen (a CRT monitor) and a floppy disk.

In 1973 Don Lancaster (an engineer at Goodyear Aerospace in Arizona) developed the TV Typewriter, which displayed characters on an ordinary television set at a time when minicomputers were still using teletypes.

In 1973 Japan's Sharp developed the LCD (Liquid Crystal Display) technology for the booming market of calculators.

In 1974 the first major innovation in cash registers in decades took place, as IBM and NCR introduced cash registers capable of scanning the "bar code" (the Universal Product Code) of merchandise in supermarkets.

A Vietnamese-born engineer working in France at a national research center, Andre Truong Trong Thi, used the 8008 to build the Micral, the first personal computer, in February 1973.

The microprocessor reached a much wider audience than its inventors had intended thanks to magazines specializing in electronics such as "Radio-Electronics", "QST" and "Popular Electronics". The Intel 8008 generated the most enthusiasm and created a whole new market: kits sold by mail order that allowed hobbyists to build computers at home. That market was pioneered by Scelbi (SCientific ELectronic BIological), a firm founded in 1973 in Connecticut by Nat Wadsworth and Bob Findley, which advertised its Scelbi-8H in March 1974. In July 1974 another of those magazines, Radio-Electronics, announced the Mark-8, developed by Virginia Tech student Jon Titus.


The Videogame

(Copyright © 2016 Piero Scaruffi)

Inspired by Steve Russell's ten-year-old but still popular "Spacewar", in 1971 Ampex employees Nolan Bushnell and Ted Dabney quit their jobs and created the first arcade videogame, "Computer Space": a free-standing, coin-operated machine devoted to an electronic game that anyone could play. In May 1972 radio and TV-set maker Magnavox introduced the first videogame console, Ralph Baer's transistor-based "Odyssey" (Magnavox would be acquired by Philips in 1974).

Bushnell was inspired again, this time by an electronic ping-pong game. He founded Atari in Santa Clara (mainly with Ampex engineers) and asked his engineer Allan Alcorn to create a similar game, which became "Pong" in November 1972, a runaway success.

A year earlier, in 1971, Bill Pitts at Stanford had implemented a new version of Spacewar on a PDP-11. Pitts and Hugh Tuck formed Computer Recreations, packaged a PDP-11/20 computer with a Hewlett-Packard monitor, and started operating it as the coin-operated "Galaxy Game", which debuted ahead of Atari's "Pong".

Videogames also spread on networks of computers. In 1973 John Daleske at Iowa State University created Empire for PLATO, and in 1974 Jim Bowery at the University of Illinois created Spasim, also for PLATO. Steve Colley wrote Maze War in 1973 on the Imlac PDS-1 at NASA Ames (only two players, each on a PDS-1), but in 1974 Greg Thompson at MIT rewrote it for the PDP-10 on the Arpanet so that it could be played by multiple players.

