A Brief History of Electrical Technology
Part 9: Power to the People

by piero scaruffi | Contact/Email | Table of Contents

Timeline of Computing | Timeline of A.I.
The Technologies of the Future | Intelligence is not Artificial | History of Silicon Valley
(Copyright © 2016 Piero Scaruffi and all pictures should be public domain)


The Hobbyist Market

(Copyright © 2016 Piero Scaruffi)

The baby boomers of the 1960s were famous for many excesses, but the most under-reported was probably the passion for electronics that coexisted with the passion for Marvel comics and for acid-rock. Some of these kids were "nerds" who religiously read the electronics magazines and built marvels at home. Others were imbued with the spirit of the juvenile delinquent and of the hippies. John Draper (better known as Captain Crunch) was a famous "phone phreak" who in 1971 built the "blue boxes" capable of fooling AT&T's phone system. The October 1971 issue of Esquire ran an article exposing the "phreaking" phenomenon, and Draper was arrested. One of his fans was Steve Wozniak, back then an engineer at Cupertino's public radio station KKUP.

In December 1974 Ed Roberts, the owner of MITS in New Mexico, a company that used to sell model rockets, launched an advertising campaign in hobbyist magazines for his Altair 8800 kit, which showed how to build a small computer at home. The Altair was just a kit, not a computer: the buyer had to use garage tools to assemble the kit and turn it into a computer; but it was telling that the "do-it-yourself" method had come down to the computer, which only a few years earlier had been known as the "giant brain". This small brain, instead, was based on the 8080 microprocessor (introduced by Intel eight months earlier) and was sold only by mail order (for $395).

MITS sold 2,000 Altair 8800 systems in one year, and the machine quickly created a small community of passionate hobbyists. Roberts had made it easy to customize and expand by adopting DEC's model of an open "bus". This allowed users to connect memory cards, TV sets, keyboards, printers and cassette tapes to the Altair. The machine was easy to set up and easy to customize, but difficult to program: the user needed to learn the machine code of the Intel processor and flip switches in order to enter a program. Among its early users were two Harvard University students, Bill Gates and Monte Davidoff, who decided to write a BASIC interpreter for the Altair on the university's PDP-10, using Bramhall's BASIC as a template. Gates and his old friend Paul Allen, a college dropout working as a programmer at nearby Honeywell, started a company named Micro-soft, initially also based in New Mexico. Software was still mostly free in those days, but Micro-soft's business plan was to make money by selling software. Micro-soft's BASIC interpreter was welcomed by the community, and the company closed the year with revenues of $16,005.

The Altair was soon followed by many imitators, such as the IMSAI 8080 (December 1975). Information Management Science Associates (IMSAI), a consulting firm founded by William Millard in 1972 in the East Bay, had actually built a special-purpose machine for the Navy by coupling together several 8080s. This Hypercube II cost a fraction of an IBM 370 mainframe and delivered almost the same power.

Microsoft BASIC was not the first high-level programming language for Intel microprocessors. In 1973 Gary Kildall, an instructor at the Naval Postgraduate School in Monterey, took IBM's PL/I and turned it into a simpler language, PL/M (Programming Language/Microprocessor). Intel agreed to market it as an add-on and "burned" it into the Read-Only Memory (ROM). In 1975 Kildall released CP/M (Control Program/Microcomputer), an operating system to read and write files to and from a floppy drive on 8080-based machines. Kildall's CP/M was largely based on concepts of the PDP-10 operating system (TOPS-10). MITS offered its own operating system, but IMSAI bought Kildall's CP/M. Kildall rewrote CP/M to make it hardware-independent, so that he could sell it to any manufacturer in need of a disk operating system for a microprocessor. To achieve this feat, Kildall isolated the interaction with the hardware in a module called the BIOS (Basic Input/Output System). In 1974 Kildall had started selling his CP/M as Digital Research (DRI), again using the hobbyist magazines. Any machine equipped with CP/M turned from a hobbyist pastime into a general-purpose computer. CP/M made the microprocessor (a chip invented for process control) a competitor of minicomputers and even mainframes: the "giant brain" was becoming a home appliance.
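To illustrate the idea (in modern terms, with invented names rather than CP/M's actual entry points), here is a minimal Python sketch of a hardware-abstraction layer in the spirit of Kildall's BIOS: the operating-system core never touches the hardware directly, so porting it to a new machine only requires supplying a new BIOS module.

    # Hypothetical sketch of the BIOS idea: the OS core is hardware-independent
    # and talks to the machine only through a small, replaceable interface.

    class BIOS:
        """Machine-specific layer: each manufacturer supplies its own version."""
        def read_sector(self, track: int, sector: int) -> bytes:
            raise NotImplementedError
        def write_sector(self, track: int, sector: int, data: bytes) -> None:
            raise NotImplementedError
        def console_out(self, char: str) -> None:
            raise NotImplementedError

    class InMemoryBIOS(BIOS):
        """Stand-in 'hardware': a dictionary pretending to be a floppy disk."""
        def __init__(self):
            self.disk = {}
        def read_sector(self, track, sector):
            return self.disk.get((track, sector), bytes(128))
        def write_sector(self, track, sector, data):
            self.disk[(track, sector)] = data[:128]
        def console_out(self, char):
            print(char, end="")

    class DiskOS:
        """Hardware-independent core (roughly analogous to CP/M's BDOS):
        it only ever calls the BIOS interface, never the hardware itself."""
        def __init__(self, bios: BIOS):
            self.bios = bios
        def save_text(self, track, sector, text):
            self.bios.write_sector(track, sector, text.encode())
        def type_text(self, track, sector):
            for ch in self.bios.read_sector(track, sector).decode(errors="ignore"):
                self.bios.console_out(ch)

    # "Porting" the OS to a new machine means swapping in a new BIOS implementation.
    os_ = DiskOS(InMemoryBIOS())
    os_.save_text(0, 1, "HELLO FROM CP/M-LAND\n")
    os_.type_text(0, 1)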

The CP/M-equipped machines just needed applications in order to become home appliances, but in those days the software firms were developing applications for mainframes, not for hobbyist kits. The applications for microprocessors came from equally eccentric independents, such as Alan Cooper, who had worked with Kildall at DRI and who in 1975 started selling a general-ledger program as the Structured Systems Group (SSG).

In March 1975 a group of hobbyists that included Hewlett-Packard engineer Steve Wozniak, Lee Felsenstein and Bob Marsh met in the garage of their friend Gordon French to discuss the Altair: that group became the Homebrew Computer Club, one of the many clubs formed by Altair fans.

Bob Marsh and Lee Felsenstein used the Intel 8080 to design the Sol-20, released in June 1976, the first microcomputer to include a built-in video driver.

Wozniak demonstrated the first prototype of his Apple at the Homebrew Computer Club meeting of December 1976.

In 1976 club member Li-Chen Wang signed his Tiny BASIC with the motto "Copyleft - All Wrongs Reserved" (instead of "Copyright - All Rights Reserved").

Los Angeles had the Southern California Computer Society, formed in September 1975.

A number of stores opened to serve the growing community of computer hobbyists, which before the Altair had relied on mail-order catalogs: Dick Heiser's Arrowhead Computers in Los Angeles (1975), the first computer retail store in the world; Paul Terrell's Byte Shop in the Bay Area (December 1975), the store that sold the first units of Wozniak's Apple; and William Millard's Computer Shack, also in the Bay Area (1976), which would grow to become the nation-wide chain Computerland.

Besides the shops, there were "hamfests", chaotic events where ham-radio operators and assorted electronics geeks met to sell or swap their equipment.

When the Altair hit the magazine stands, the computer market was split along the mainframe-mini divide: IBM and the "BUNCH" sold mainframes; DEC and its competitors sold minis. The large corporations that dominated these two fields had the know-how, the brains and the factories to produce desktop computers for the home market. They did not do it. The market for home computers was largely invented by an underground community of hobbyists that operated outside the big bureaucracies of corporations, academia and government, and that maintained its network via magazines, stores and clubs. Magazines and shops were the true visionaries of this revolution.

Until then, progress in computer technology had been funded by government agencies and multinational corporations; and most innovation had been achieved by university professors and students, or in corporate research laboratories. A generation of young hobbyists was changing the very dynamics of the computer industry. Many of them did not have a college degree. Many of them had no business training. None of them had DARPA funding. The biggest revolution in the history of computers was about to happen, and it was started by a grassroots movement. After all, the epicenter of this revolution was the Bay Area, which a decade earlier had witnessed a revolution in values, politics, sexuality, dress code, music and art.


Calculator and Microprocessor Wars

(Copyright © 2016 Piero Scaruffi)

Until 1973 Motorola was more interested in walkie-talkies than in integrated circuits, leveraging its background in radio engineering. After all, the first man on the Moon spoke to Earth using a Motorola system; and Motorola's DynaTAC was the first handheld mobile phone. Tom Bennett had worked at Sylvania with Longo on the first commercial TTL integrated circuit (the SUHL of 1963) and had designed an electronic calculator for Chicago-based Victor Comptometer in 1971. Having joined Motorola's laboratory in Arizona to jumpstart its calculator business, in 1974 he delivered the 8-bit 6800, a more advanced microprocessor than anything Intel had introduced yet.

However, his team, led by Chuck Peddle, resigned and formed MOS Technology, which in 1975 introduced the 8-bit MOS 6502, a 6800 clone that was much cheaper ($25) than the 6800 ($180) or the Intel 8080 ($360), and in 1976 relocated to Pennsylvania.

The first personal computers (mostly mail-order kits) were really toys for adults. The real business machines built with microprocessors were the calculators, which sold by the millions. Texas Instruments was the main supplier of microprocessors used in calculators in the USA. In 1975 it squeezed its competitors by raising the price of its microprocessors, leaving them scrambling for alternatives.

At that point Peddle was hired by Commodore to build an entire computer, the Commodore PET (Personal Electronic Transactor), which was demonstrated in January 1977.

It was getting difficult to compete with Intel because Intel boasted a full line of semiconductor products: RAMs, EPROMs and microprocessors. Microprocessors drove sales of memories, and sales of memories funded research in microprocessors.

Competition to Intel eventually came from Silicon Valley itself. In 1975 Jerry Sanders' Advanced Micro Devices (AMD) introduced the AMD8080, a reverse-engineered clone of the Intel 8080 microprocessor. Above all, in 1975 AMD launched the 4-bit 2901 chip for the "bit-slice" method of building microprocessors. This method conceives of a microprocessor as a set of modules: a control unit and several arithmetic logic units (ALUs). While the bit-slice method leads to bigger microprocessors (seemingly defeating the whole point of the microprocessor), in those days it offered a huge cost-saving advantage in creating high-performance microprocessors. By attaching a series of ALUs horizontally, a manufacturer could create virtually any kind of microprocessor. For example, by joining four 2901s one obtained a 16-bit microprocessor. By contrast, the Intel 8080 was an 8-bit microprocessor and could only ever be an 8-bit microprocessor. The bit-slice method also marked the resurrection of the bipolar chip. Bipolar chips were faster than Intel's MOS chips but had generally been avoided because they were hard to cool; splitting the processor into several ALUs made them feasible again. The first bit-slice microprocessor was made by National Semiconductor in 1973, the IMP-16, a 16-bit architecture that consisted of four identical 4-bit ALUs (the IMP-00A). In 1974 Intel introduced the 2-bit 3002, and Monolithic Memories (MMI), another Fairchild spinoff, founded in 1969 by Zeev Drori, introduced the 4-bit 6701. However, it was the AMD 2901 that caused a sensation and became a de-facto standard. After all, at a time when semiconductor companies were founded by physicists, AMD had been founded by marketing experts. One reason for AMD's success was its ability to find "second sources": Motorola (1975), Raytheon (1975), Thomson (1976), National (1977), NEC (1978) and Signetics (1978) all signed up to manufacture the 2901. Computer manufacturers were more likely to invest in a chip made by multiple reliable sources than in a chip made only by its inventor. T.I. made the 4-bit SBP0400A in 1976 and Motorola the MC10800 in 1979, but it was too late to catch up with AMD in the bit-slice market.
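A minimal sketch of the bit-slice idea (in Python, greatly simplified: a real slice such as the 2901 also contained registers and many other ALU functions): four 4-bit adder slices chained through their carry lines behave like a single 16-bit adder.

    # Simplified model of bit-slice composition: each slice handles 4 bits,
    # and cascading four slices through the carry chain yields a 16-bit adder.

    def alu_slice_4bit(a: int, b: int, carry_in: int):
        """One 4-bit slice: returns (4-bit sum, carry-out)."""
        total = (a & 0xF) + (b & 0xF) + carry_in
        return total & 0xF, total >> 4

    def add_16bit(a: int, b: int) -> int:
        """Cascade four 4-bit slices into a 16-bit adder."""
        result, carry = 0, 0
        for i in range(4):  # slice 0 handles the least significant nibble
            nibble, carry = alu_slice_4bit((a >> (4 * i)) & 0xF,
                                           (b >> (4 * i)) & 0xF,
                                           carry)
            result |= nibble << (4 * i)
        return result & 0xFFFF

    assert add_16bit(40000, 25000) == (40000 + 25000) & 0xFFFF
    assert add_16bit(0x0FFF, 0x0001) == 0x1000  # the carry ripples across slices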

Right after finishing the 8080, Federico Faggin left Intel with coworker Ralph Ungermann and with Masatoshi Shima. In July 1976 Faggin's new company, Zilog, unveiled an 8-bit 8080-compatible microprocessor, the Z80, which was faster and cheaper than the 8080 (Shima did the transistor-level design of both). Basically, Zilog did to Intel what MOS Technology had just done to Motorola. More importantly, the Z80's innovative architecture made it easier for computer manufacturers to design a computer around it. It also ran the CP/M operating system.

National Semiconductor had already introduced PACE (Processing and Control Element) in December 1974, the first 16-bit microprocessor on a single chip (the IMP-16 consisted of four chips). This was followed in 1976 by the Texas Instruments TMS 9900 (a single-chip version of T.I.'s 990 minicomputer) and in 1977 by the Fairchild 9440 (a single-chip version of Data General's Nova minicomputer); but the market for 16-bit microprocessors had to wait for its king until 1978, when Intel introduced the 8086.


Minicomputer Wars

(Copyright © 2016 Piero Scaruffi)

The minicomputer manufacturers continued to thrive for a few years, but clearly the microprocessor had made their proprietary architectures redundant. The big news was semiconductor memory, which made it possible to build cheap computers with huge memories, and therefore required long memory addresses. In 1973 Boston-based Prime, founded the previous year by former Honeywell employees who had been with Computer Control Company and by alumni of MIT's Multics and Project MAC (notably Bill Poduska), shipped the first 32-bit minicomputer, the Prime 200, which was basically a 32-bit version of the Honeywell DDP-516.

In 1977 DEC introduced the 32-bit VAX series; Data General responded with the Eclipse series, and IBM introduced the 4300 series. Designed by Bill Strecker, the VAX was much more expensive and much more powerful than the PDP-11: it was a completely new architecture with a new operating system (VMS) and with the most sophisticated implementation of Atlas-style virtual memory. Because of the PDP-11 legacy, though, many VAXes ended up running Unix. DEC would sell more than 100,000 of them. But the VAX's most lasting legacy was perhaps its ASCII-based terminal, the VT-100, which would remain the standard for decades, eventually obliterating IBM's EBCDIC standard.

In 1975 IBM introduced a desktop minicomputer, the Model 5100, that combined a keyboard, a cassette tape drive and a video display. It had 16K of RAM and was programmable in BASIC and APL. But IBM had a good excuse for missing the personal-computer craze: it was distracted by the antitrust lawsuit brought by the government (the trial started in 1975, just a few months after the Altair came out).


Relational Databases

(Copyright © 2016 Piero Scaruffi)

IBM's San Jose laboratories (later renamed Almaden Research Center) had always been at the vanguard of data storage. In 1970 IBM's British scientist Edgar Codd wrote an influential paper, "A Relational Model of Data for Large Shared Data Banks", in which he explained how one could describe a database in the mathematical language of first-order predicate logic, a paper that marked the birth of "relational" databases.

Development of the first relational database management system (code-named System R) began in 1973. In 1974 Donald Chamberlin defined an algebraic language to retrieve and update data in relational database systems, SEQUEL, later renamed SQL (Structured Query Language). System R and SQL finally debuted in 1977, running on an IBM System/38.
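As a rough modern illustration of what the relational model and SQL made possible (using Python's built-in SQLite, with table and column names invented for the example): data is declared as tables and queried declaratively, with no knowledge of how records are physically stored or linked.

    # Minimal illustration of the relational idea, in SQLite's SQL dialect.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE departments (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept_id INTEGER);
        INSERT INTO departments VALUES (1, 'Research'), (2, 'Sales');
        INSERT INTO employees VALUES (10, 'Codd', 1), (11, 'Chamberlin', 1), (12, 'Smith', 2);
    """)

    # A declarative query: it states *what* to retrieve (a join plus a filter),
    # not *how* to navigate pointers or files, as hierarchical systems like IMS required.
    rows = db.execute("""
        SELECT e.name, d.name
        FROM employees e JOIN departments d ON e.dept_id = d.id
        WHERE d.name = 'Research'
    """).fetchall()
    print(rows)  # [('Codd', 'Research'), ('Chamberlin', 'Research')]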

However, IBM's flagship database system remained the IMS, originally developed in 1968 for NASA's Apollo program on IBM's System/360 mainframe. That was, by far, the most used database system in the world. IBM's research on relational databases spilled over into UC Berkeley, where in 1973 a group led by Michael Stonebraker started the Ingres (INteractive Graphics REtrieval System) project.


Factory Automation

(Copyright © 2016 Piero Scaruffi)

Some significant progress was being made in software applications for manufacturing. Evans & Sutherland was formed in 1968 by David Evans, who had worked on the Bendix G15 and on UC Berkeley's time-sharing system, and Ivan Sutherland, the MIT pioneer of graphical user interfaces (GUIs), both then employed at the University of Utah. In 1969 they created a pioneering LDS (Line Drawing System), which in 1973 evolved into their Picture System, a graphics system for Computer-Aided Design (CAD). They employed bright young engineers such as Jim Clark, Ed Catmull and John Warnock, who went on to establish the field of computer graphics in Silicon Valley.

In the 1970s MRP (Manufacturing Resource Planning) was one of the success stories of the mainframe software industry. In 1972 some IBM engineers in Germany (Claus Wellenreuther, Klaus Tschira, Hasso Plattner, Dietmar Hopp and Hans-Werner Hector) founded Systemanalyse und Programmentwicklung (SAP, later reinterpreted as "Systeme, Anwendungen und Produkte"), taking with them some software that IBM had inherited from Xerox and that IBM didn't want anymore. SAP set out to create integrated business and manufacturing applications for large firms, and eventually introduced a mainframe-based product that integrated manufacturing, logistics, distribution, inventory, shipping, invoicing and accounting. That became the reference for Enterprise Resource Planning (ERP) applications (a term actually coined in the 1990s).

Another pioneer, ASK, was based in Mountain View and had even been founded by a woman: Sandra Kurtzig, a former General Electric saleswoman who would go on to become one of the first multimillionaire women of the computer industry. In 1978 it introduced ManMan, a "universal manufacturing program" that ran initially on Tymshare's time-sharing service and later on the HP 3000 minicomputer, and that enabled mid-sized manufacturing companies to control the operation of an entire factory. Unlike SAP, which targeted the largest firms, ASK aimed at mid-sized manufacturers.


Machine Vision

(Copyright © 2016 Piero Scaruffi)

The technology that enabled digital imaging was the charge-coupled device (CCD), invented in 1969 at AT&T Bell Labs by Willard Boyle and George Smith. One of Bell Labs' physicists, Michael Tompsett, demonstrated how the CCD could herald the age of electronic photography and video when in 1971 he designed a video camera that used a CCD sensor. In 1973 a former member of that Bell Labs group, Gil Amelio (the future CEO of National Semiconductor and Apple), designed the first commercial CCD at Fairchild. Kodak became active in the field: in 1974 Bryce Bayer invented an electronic filter for color images, and in 1975 Steven Sasson built the first digital camera, which incorporated the Fairchild CCD. At that point startups for real-time image processing multiplied quickly: View Engineering, a Hughes spinoff founded in Los Angeles by Dick Hubach (1976), Maryland's Quantex (1977), New Jersey's Object Recognition System or ORS (1977), Boston's Octek (1978), etc.

The theoretical work that had been started at MIT by Larry Roberts peaked in 1978 with David Marr's work ("Representation and recognition of the spatial organization of three-dimensional shapes", 1978). Marr proposed to analyze images in three steps, starting with a two-dimensional "primal sketch" of the scene, which is then transformed into a 2.5-dimensional sketch to provide depth, and finally into a three-dimensional model. His posthumously published book "Vision" (1982) influenced the field for decades.

Two major centers for machine vision were the SRI in California and the Environmental Research Institute of Michigan (which was to the University of Michigan what the SRI was to Stanford). In 1980 Charles Rosen, the founder of the Artificial Intelligence group at the SRI, and his engineer Earl Sacerdoti founded Machine Intelligence Corporation in Mountain View (which in 1982 evolved into International Machine Intelligence, a joint venture with the Japanese robot manufacturer Yaskawa Electric). Ted Panofsky designed their first machine-vision product, the VS-100, used for industrial robotics. Startups whose technology was "incubated" at the Environmental Research Institute of Michigan included Machine Vision International (originally Cyto, 1981) and Applied Intelligence Systems (1982). Perceptron was founded in 1981 by students of the nearby General Motors Institute, notably James West.

At that point progress was rapid. Kazuo Iwama's team at Sony developed its own CCD technology and in 1985 introduced the first camcorder capable of recording video on standard 8mm tape: the CCD-V8. Cognex, an MIT spinoff founded in 1981 by Robert Shillman, introduced in 1988 the first VLSI chip dedicated to image analysis.


Office Automation

(Copyright © 2016 Piero Scaruffi)

The integrated circuit and then the microprocessor had steadily moved the electronic computer from the secluded and air-conditioned "computer room", guarded by officers dressed in white like hospital nurses, to the regular bustling office and even to the school.

Xerox PARC was in the lead. Engelbart's group at the SRI had lost its funding from ARPA (and was eventually disbanded in 1977), so several of his engineers started moving to Xerox's PARC. In 1973 the PARC unveiled the Alto, the first workstation with a mouse and a Graphical User Interface (GUI). Inspired by Douglas Engelbart's old On-Line System, and developed by Charles Thacker's team, the Alto was a summary of all the software research done at the PARC since its inception. This compact interactive machine, with a user interface that allowed users to mix text and graphics on the screen, was designed for a broad range of applications, from office automation to education. It wasn't based on a microprocessor yet, but on the Texas Instruments 74181 chip (which was just an ALU, not a full microprocessor).

In 1974, Hungarian-born Charles Simonyi, another recruit from UC Berkeley, developed Bravo, the word processor that introduced the "what you see is what you get" (WYSIWYG) paradigm in document preparation.

The Alto was complemented by the first laser printer, the EARS (Ethernet, Alto, Research character generator, Scanned laser output terminal), which subsequently evolved into the commercial Xerox 9700.

Xerox played the role of a venture capitalist investing in new technologies: graphical user interfaces, desktop computers and local area networks. By hiring Robert Taylor, Xerox de facto transplanted the futuristic vision of ARPA's IPTO into the semiconductor industry. The IPTO's vision, in turn, represented two decades of computer research carried out mostly in the Boston area. Xerox, indirectly, helped to transfer Boston's lead in computing to the Bay Area, which also happened to be the world's capital of semiconductor engineering. This was similar to what Shockley had done when he transferred the East Coast's lead in semiconductors to the world's capital of radio engineering.

In 1974 a strong believer in artificial intelligence, former MIT student Ray Kurzweil, introduced Optical Character Recognition (OCR) software that could read text written in any font. Coupled with a scanner and a text-to-speech synthesizer, it yielded the first reading machine for the blind. (Xerox would eventually acquire the software in 1980.)

The industry felt the need for better human-machine interfaces as much as the SRI and PARC scientists did. In 1972 IBM introduced the 3270 terminal to connect to mainframes. Previous terminals (generally known as "ASCII terminals") interacted with the mainframe at every keystroke, basically sending characters back and forth. The 3270 instead presented the user with a form to fill, and sent the form to the mainframe only when completed. Because it greatly reduced the input/output interactions with the mainframe, it allowed many more terminals to connect to the mainframe at the same time. It also introduced a highly popular display with 24 lines of 80 characters each, a size that would remain a standard for at least two decades.
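A toy sketch (in Python, with purely illustrative numbers) of why block mode scaled better: a character-mode terminal generates one host interaction per keystroke, whereas a 3270-style terminal lets the user fill in the form locally and then sends it to the mainframe as a single block.

    # Toy comparison of terminal styles: per-keystroke echo vs. one block per form.

    form_fields = {"NAME": "SMITH", "ACCOUNT": "004217", "AMOUNT": "125.00"}

    # "ASCII terminal" style: every keystroke travels to the host and back.
    keystrokes = sum(len(value) for value in form_fields.values())
    char_mode_interactions = keystrokes          # one round trip per character

    # 3270 style: the terminal handles editing locally; the host paints the
    # form once and receives it back once, when the user presses Enter.
    block_mode_interactions = 2

    print(f"character mode: {char_mode_interactions} host interactions")
    print(f"block mode:     {block_mode_interactions} host interactions")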

In 1974 IBM also introduced the Interactive System Productivity Facility (ISPF), one of the earliest integrated development environments, which allowed programmers to design menu-driven applications. It basically marked the end of punched cards. Previously, a program was entered by punching the cards and then feeding them into the card reader of the mainframe. With the 3270 and the ISPF, the programmer could enter ("edit") the program directly at the terminal.

In 1974 the German government funded the development of an integrated development environment by Softlab, which resulted in the Programm-Entwicklungs-Terminal system (PET, later renamed Maestro), implemented by USA-born Harald Wieler on a Four-Phase IV/70 (distributed in Europe by Philips).

The term "word-processor" had been introduced in 1964 by IBM to describe the Magnetic Tape Selectric Typewriter (MTST), available in 1966 and sold in Europe as the MT72; but that was little more than an electric typewriter (the very popular Selectric typewriter of 1961) with the addition of a magnetic tape for storing characters. If Mark Twain's "Life on the Mississippi" (1883) was the first book written on a typewriter, the first novel written on a word-processor was "Bomber" (1970) by British best-selling author Len Deighton, typed on a MT72 by his secretary Ellenor Handley (just like a typist had typed Twain's book).

IBM's main competitor was Redactron, a New York company founded and run by a woman, Evelyn Berezin, a pioneer of computer design since the 1950s, which in 1971 introduced the "Data Secretary", an electronic text-editing typewriter. By 1975 Redactron had the second-largest installed base after IBM.

In May 1972 Boston-based Wang Labs introduced the 1200, a new kind of word-processing machine: Harold Koplow, a designer of Wang calculators, had simply wired together a calculator, a typewriter and a cassette so that the user could type a document, store it, retrieve it, edit it and print it. He had rewritten the microcode of the calculator so that it would perform word-processing functions instead of mathematical functions.

In 1976 Wang added a CRT monitor so that the typist could check the text before printing it, and menus to make the interaction more user-friendly. The Word Processing System (WPS) was one of the inventions that changed every office in the world. For centuries people had to retype a page in order to correct trivial mistakes or to make simple changes. For decades people had to use a photocopier to make copies of a document. That era ended in the mid-1970s. When in 1977 Wang's Office Information System also added an Intel 8080 microprocessor, AES' pioneering vision finally went mainstream.

Unbeknownst to most people, an influential technology of the future was being developed in Europe. In 1972 Bent Stumpe, a Danish-born engineer at CERN in Geneva (the joint European laboratory for particle physics), invented the concept of the touch screen, a screen that reacts to being touched with a finger. In 1977 CERN inaugurated its use for industrial control, and in 1980 the Danish industrial-control manufacturer NESELCO commercialized the technology in the world's first touch-screen computer.

Simultaneously, in 1972 Donald Bitzer's PLATO project at the University of Illinois' Computer-based Education Research Laboratory (CERL) introduced a new terminal based on a plasma display for the PLATO IV release.


The State of Computing

(Copyright © 2016 Piero Scaruffi)

The electronic computer went rapidly through a number of stages. At first it was a military tool, used for two main applications: breaking enemy codes and calculating firing tables. At the end of World War II the same technology was sold (in very limited numbers) to the scientific world to carry out the calculations needed by scientists. It was becoming a commercial product, but was still used for mathematical applications. IBM, Remington Rand and other manufacturers of office machines applied it to the tasks carried out by electromechanical office machines. It therefore began to morph from a mathematical tool into a business tool, an evolution that would continue for decades, to the point that for most people the underlying mathematical structure would become invisible (and "computing" would become the least important application). In the 1960s Boston's military applications led to a boom in smaller interactive computers, starting with DEC. In the 1970s the counterculture of the San Francisco Bay Area led to a boom in personal computers. In the 1980s the military network invented for the Cold War led to the Internet. In the 1990s the multinational megaprojects of the European Union led to the World-wide Web, and in the 2000s they led to the smartphone.

Note that up to this point the computer industry was (with extremely rare exceptions) a white Anglo-Saxon industry: immigrants played a negligible role in the development of computer technology.


The Videogame

(Copyright © 2016 Piero Scaruffi)

At this time videogame lovers were enjoying the first consoles based on microprocessors instead of custom logic: Fairchild's Video Entertainment System (November 1976), later renamed Channel F, which used the 8-bit Fairchild F8 (the first machine to use that microprocessor) and was designed by the Jamaican-born Jerry Lawson, the lone African-American in the Homebrew Computer Club; and Atari's Video Computer System (September 1977), later renamed 2600, which used the MOS Technology 6507, a cut-down version of the 6502. These consoles read videogames stored on "ROM cartridges".

Jay Smith built the handheld console Microvision (1979), based on a Texas Instruments TMS1100 and commercialized by Boston-based board-game seller Milton Bradley, which had a long history of bestselling games, from The Checkered Game of Life (1860) to Twister (1966).

The videogame arcades that proliferated all over the world soon had a lot of choice: Tomohiro Nishikado's Space Invaders (1978), based on the Intel 8080, a game that drew inspiration from the film "Star Wars" and that, released in 1980 on the Atari 2600, single-handedly legitimized the videogame console at a time when it was still considered a novelty; Toru Iwatani's Pac-Man (1980), the first hit that was neither a space shooter like "Space Invaders" nor a sports game like "Pong"; Ed Rotberg's Battlezone (1980) for Atari; Eugene Jarvis' Defender (1981), another space shooter, built by Chicago-based pinball-machine manufacturer Williams Electronics; and Shigeru Miyamoto's Donkey Kong (1981), the game in which the character Mario debuted.

