A Brief History of Electrical Technology
Part 6: Unix

by piero scaruffi | Contact/Email | Table of Contents

Timeline of Computing | Timeline of A.I.
The Technologies of the Future | Intelligence is not Artificial | History of Silicon Valley
(Copyright © 2016 Piero Scaruffi and all pictures should be public domain)

The Unbundling and the Software Industry

Since the beginning of the computer industry, software had been "bundled" with the hardware; or, more precisely, a customer was charged for the cost of the computer, which included the software. It didn't seem to make sense, to the early computer manufacturers, to sell the software separately.

In June 1969 IBM preempted an antitrust lawsuit by "unbundling" its software, i.e. it started charging customers for it. This meant that anybody could approach the same customers and try to sell them different software. It made sense to compete with IBM in software, if not in hardware. Overnight, this boosted the software business. This was a golden opportunity for independent software firms, as the market for mainframe applications was colossal. Until then only Max Palevsky's Scientific Data Systems (SDS) had charged customers for software.

IBM's decision might also have been driven by the increasing costs of developing software. In theory IBM's software was "free", but, of course, its cost was factored into the price of the computer. As the cost of developing software increased, IBM was forced to charge higher prices for computers that could have been much cheaper, and the customer was forced to pay for all the software that came with the computer regardless of what was really used. Initially IBM's customers were not amused at being told that they had to pay for software that used to be free, but soon the advantages of having a free market for software applications became obvious, as multiple firms introduced different "packages" addressing different industries.

Software companies started transitioning from being "consultants" hired to develop custom applications to being third-party vendors selling their own off-the-shelf packages. The first packaged product from an independent software company was probably Autoflow, an automatic flowcharting system unveiled in 1965 by ADR (Applied Data Research), originally founded in 1959 in Princeton (New Jersey) by a group of Univac programmers, notably Martin Goetz, formerly of Sperry Rand. Goetz went down in history as the first person to be awarded a patent for a software "invention" in 1968 (a sort program that he had developed way back in 1964).

Informatics, founded in 1962 by Walter Bauer, the former manager of information systems at Ramo-Wooldridge in Los Angeles, and Frank Wagner, a former president of Share, introduced the term "informatics" into popular jargon. In 1964 Informatics acquired Advanced Information Systems from aviation colossus Hughes and turned its file management system (developed by John Postley) into a product, Mark IV, that went on to become the first best-seller of the software industry. Bauer modeled the business on IBM's business model, assuming that selling software wasn't different from selling hardware.

In 1968 John Cullinane, formerly of CEIR, started in Boston the first software company funded by Wall Street investors: Cullinane Corporation (later renamed Cullinet Software). Joseph Piscopo founded Pansophic Systems in Chicago in 1969, a pioneer of CASE (Computer Aided Software Engineering).

The first software companies were consulting operations that helped data centers program their computers to provide applications to their organization. The term "user" tended to apply to the data processing center, not to the department that actually used the results of the application program. The typical customer in the 1950s was a government agency or a research laboratory. As prices decreased and "standard" software became more useful for practical office chores, computers spread to financial and industrial corporations. Initially they too thought that the "user" of a computer was the data processing center. The standard software that came bundled with a computer solved simple problems of accounting and recording. Organizations that had spent millions to buy computers wanted to do more with those computers. Since there were precious few "programmers" around, and organizations were reluctant to hire "programmers" (still an uncharted profession), it made sense that some of those programmers decided to start companies specializing in writing programs for data processing centers. On one hand the organizations did not want to hire programmers who were needed only for the duration of a project (so the management thought) but were useless for the main business of the organization. On the other hand the programmers themselves were not excited to work for an organization because there was no career path for them. Their status and salary were certainly higher than they had been for the early (female) programmers, but they had no prospects for promotions within the organization. Thus the software business was born as a form of outsourcing.

The main problem that the early software companies encountered was the problem of estimating the costs of a software project. There was no science behind it. It was based on intuition and negotiation: you charged what the customer was willing to pay. Since the costs were so difficult to estimate, it was natural that software companies started to pay attention to which code was being rewritten for different applications. That code could be recycled from one project to the next one, thus reducing the unpredictability of development costs.

The other problem that the early software companies had to face had to do with the perception that software was not a product: customers were reluctant to pay for software, since they saw it as an integral part of the computer. They naturally expected the computer to solve their problem, not to become an additional problem. The data processing centers quickly realized the reality: the computer was indeed an additional problem that needed a whole new class of solutions. It took longer for the other departments of an organization to accept that the computer was not a solution but a problem, albeit one that potentially could lead to higher productivity and competitiveness.

The next step for software companies was to resell an entire application to multiple customers, and the category of software products was finally born. Suddenly, organizations were willing to pay good money for software applications, while software companies were able to control the costs. Software was becoming a lucrative business.

One gets the impression that at the beginning the software companies themselves were not convinced that software products were useful. These companies had very few sales people. The official reason was that only a few people understood their software products, but in reality those sales people did not understand the product as much as they understood the customer, i.e. how to convince the customer to buy the product. The product was something that most customers were not asking for and sometimes did not even understand. The sales person had to create a market that did not exist. The sales person was something between an organizational consultant, an explorer, and an itinerant medicine peddler.

All of this happened far away from the Bay Area. Most software companies were based on the East Coast or in the Midwest, because most computer users were based on the East Coast or in the Midwest. The only major exception was Los Angeles, where the aviation industry was based. The Bay Area had only two major users of computers: Bank of America in San Francisco (which had already installed ERMA in 1955) and the Lawrence Livermore Laboratory in the East Bay (an early adopter of the PDP-1 in 1961 and of the CDC 6600 in 1964). Nothing at all in Santa Clara Valley.

In 1971 the NASDAQ (National Association of Securities Dealers Automated Quotations) market was launched. It was a fully computerized virtual market, not a traditional trading floor, implemented on two Univac 1108 computers and accessed via Bunker Ramo's 2217 terminals. It quickly became the preferred financial platform for technology companies.

In 1975, just months before the introduction of the first Apple computer, the whole market for software products in the USA was still worth less than $1 billion.

Software Engineering

Software (whether developed by computer manufacturers, software companies or in-house facilities of large users) was growing rather chaotically. For too long software had been conceived as a mere appendage to hardware. Millions of lines of code had been written with no standards and often no methodology. Now the computer manufacturers were plagued with "bugs" in their operating systems that were impossible to fix, and customers were plagued with mission-critical applications that contained "spaghetti code": difficult to update and difficult to port to other machines.

In 1968 the Dutch mathematician Edsger Dijkstra wrote an article titled "Go To Statement Considered Harmful". It was a simple but provocative statement about how "not" to write software. Computer programmers had conceived programs as sequences of instructions. It was time to conceive them as the complex architectures that they were.

In the same year Donald Knuth published the first volume of "The Art of Computer Programming", which hailed the ability of software engineers to create highly individualized programs, but the trend was against that and towards creating norms to minimize creativity and maximize productivity. Later in the same year NATO sponsored a conference in Germany on "Software Engineering" with the idea that software had to become as reliable as civil engineering: you drive on a bridge because you trust that it will not collapse, but software instead had (and still has) "bugs" that cause the machine to crash. (This fact was unacceptable in military applications).

In 1970 Niklaus Wirth at the ETH in Zurich (Switzerland) developed the language Pascal, a descendant of ALGOL. This language, more than anything else, marked the birth of the "structured programming" advocated by Dijkstra.
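The contrast that Dijkstra and Wirth were drawing can be sketched in a few lines of C (a hypothetical illustration, not period code): the same computation written first with goto jumps and then as a structured loop whose control flow is evident from the block structure alone.

```c
#include <stddef.h>

/* Unstructured style, of the kind Dijkstra criticized: the reader
   must trace the jump labels to discover that this is a loop. */
int sum_goto(const int *a, size_t n) {
    int s = 0;
    size_t i = 0;
loop:
    if (i >= n) goto done;
    s += a[i];
    i++;
    goto loop;
done:
    return s;
}

/* Structured style: the same computation as a single loop with one
   entry and one exit, readable at a glance. */
int sum_structured(const int *a, size_t n) {
    int s = 0;
    for (size_t i = 0; i < n; i++)
        s += a[i];
    return s;
}
```

Both functions compute the same result; the point of structured programming was that only the second one can be reasoned about (and proven correct) by reading its shape.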

The importance of software was also recognized when (in 1968) Los Angeles-based Computer Sciences Corporation (CSC), the largest software company in the country, which had been founded in 1959 by two members of SHARE (Roy Nutt, who had worked at IBM on the original FORTRAN, and Fletcher Jones of North American Aviation) to write software for the defense industry, became the first software business to be listed on the New York Stock Exchange. It had already "gone public" on the AMEX in 1963.

The world of computing was beginning to be called "Information Technology" (IT).

IBM and the BUNCH

By then IBM had shot far ahead of the competition. The 370, introduced in 1970, used integrated circuits (designed and built in-house, unlike the TTL chips sold by most of the semiconductor industry) and featured a vast improvement in data storage: the IBM 3330 boasted a capacity of 100 Mbytes per disk pack, hundreds of times more than the glorious 350. In 1971 the 370 also gained a "time-sharing option" (TSO). In 1972 it also introduced virtual memory to the IBM world.

IBM's competition relied on the anti-trust laws to manufacture "clones" of IBM mainframes. They were called the "BUNCH" from the initials of their names: Burroughs, Univac (that in 1971 bought RCA's computer business), NCR, Control Data Corporation (that had acquired both Bendix and Librascope), and Honeywell (that in 1970 bought General Electric's computer business).

Besides DEC, IBM had to face a new competitor: the chief architect of its mainframes, Gene Amdahl, started his own business in 1970 in the future Silicon Valley to build IBM-compatible mainframes, less expensive and faster than IBM's models. Amdahl's first machine would come out in 1975.

In 1972 Seymour Cray left CDC to start his own company, Cray Research. The Cray 1 supercomputer was first installed at Los Alamos National Laboratory in 1976, and Los Alamos helped port to it the Livermore Time Sharing System (LTSS), originally developed at Lawrence Livermore Laboratory on a CDC 7600.

IBM's San Jose laboratories were assigned the task of developing a cheap storage medium to load the 370 mainframe's microcode and replace the cumbersome tape units. Previous IBM mainframes had used non-volatile read-only memory to store the microcode, but the 370 instead used a read-write semiconductor memory that had become affordable and reliable, and that solved many engineering problems. However, semiconductor memory was volatile (it was erased whenever the power was switched off), and therefore IBM had to provide a medium to reload the microcode. In 1971 David Noble came up with a cheap read-only 80-kilobyte diskette: it was nicknamed the "floppy disk". It made it easy to load the control program and to change it whenever needed. A floppy disk had the capacity of 3,000 punched cards, so it didn't take long for IBM to realize that it could be used for data processing in general.

The floppy disk was originally designed to be written once and read many times, but just one year later a team at Memorex led by Alan Shugart built the first read-write floppy-disk drive, the Memorex 650, that obviously could serve more than the purpose of loading control programs into a mainframe.

Even more importantly, in November 1973 IBM introduced the 3340 hard-disk drive, the so-called "Winchester drive", with a total capacity of 60 megabytes. It was developed at the same San Jose laboratories by Kenneth Haughton's team for the low-end System/370 models. This was the drive that truly enabled transactional systems.

Computers were especially vulnerable in "real-time" transactions, when they had to process multiple critical transactions in milliseconds, a problem typical of financial applications. In 1968 IBM had started beta-testing the transactional system CICS (Customer Information Control System), developed at IBM's Palo Alto laboratories by Ben Riggins' team. Riggins had been inspired by his experience working with big utility companies (in Virginia and Michigan) to provide a way for them to access information online and in real time instead of the batch processing that was the norm on mainframes.

The Winchester hard disk and CICS bootstrapped the market for online transaction systems. CICS, officially released in July 1969, one month after the "unbundling", was one of IBM's first software products (software that was actually for sale), and it would remain one of software's all-time bestsellers: within 20 years, 489 of IBM's top 500 customers would be running CICS, for a grand total of 30,000 licenses worldwide. Early adopters of CICS in 1968 (when it was still free and only supported assembly language) included Transamerica in Los Angeles and United Airlines. In 1970 IBM added support for COBOL and PL/I. CICS' early customers were utilities, telephone companies, banks, insurance companies, retail chains, and government agencies. Ordinary people would end up using CICS daily in their bank operations, utility payments, credit-card transactions, etc.

The ability to process large amounts of data in a short time became the real value of IBM mainframes as DEC minicomputers were expanding out of their initial niche into more and more areas. IBM continued to enjoy an unrivaled advantage in "data processing", an activity that now required real-time access to large amounts of data.

For the record, the other two software products released by IBM in 1969 were Information Management System (IMS) and Generalized Information System (GIS).

Power to the People

The 36-bit PDP-10 (1967) and even more so the PDP-11 (1970), a machine again designed by Gordon Bell (after a prototype design by Harold McFarland) that finally converted DEC to the 16-bit word, to the integrated circuit and to the bus, represented a real revolution in the way thousands of users could interact with the computer. Instead of having to deliver their deck of punched cards to the computer room's staff, this generation of users could access the computer directly, without an intermediary. The machine was still shared with many other users (one at a time or all at the same time, depending on the operating system), but at least each user was able to touch and feel the computer. The PDP-11 came with the RT11 operating system, a single-user real-time system. In 1972 DEC added the multi-user, multi-tasking operating system RSX-11M (Resource Sharing eXtension), the first operating system designed by Dave Cutler (later the designer of VMS at DEC and of Windows NT at Microsoft). Both the 10 and the 11 still used magnetic cores instead of integrated circuits for the RAM.

Data General's Nova had a bus that standardized and simplified how the various components were wired together. DEC expanded and perfected this concept with the PDP-11/20's Unibus, which connected in the same way all memory and input/output devices, thus allowing the customer more flexibility in configuring the machine. Strategically, the Unibus was not kept as a corporate secret but widely publicized so that anybody could add compatible equipment. The PDP-11 was also a lot easier to program than the PDP-8 (it was mostly programmed in Fortran). Overall this machine greatly expanded the range of applications for minicomputers. DEC would end up selling almost 200,000 PDP-11s.

The PDP-10 ran the ITS (Incompatible Timesharing System) at MIT. In 1972 DEC introduced the operating system TOPS-10, co-developed with this MIT group, and this operating system made it easy for users to create their own files and operate on them. Each file had a name and an extension, and the extension told the system what the file was for. The user accessed her or his files from a regular terminal, directly and in person. The user had a text editor called TECO to make changes to the data contained in a file. DEC had made it very easy for people with limited knowledge of computers to write their own programs and manage their own data. Several startups began offering time-sharing systems to the public. (In 1970 the Computer Center Corporation, C-Cubed, offered computer time on a PDP-10 to the population of Seattle, including a teenager called Bill Gates). The PDP-10 was such a runaway success among students and young scientists that in 1972 Xerox PARC built a clone of the PDP-10 called MAXC, despite the fact that Xerox was selling the Scientific Data Systems 940 for time-sharing applications.
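The name-plus-extension convention can be illustrated with a small hypothetical C function (not actual TOPS-10 code): the system looked at the part of the filename after the dot to decide what the file was for.

```c
#include <string.h>

/* Illustrative sketch of the name.extension convention: return the
   extension of a filename (the part after the last dot), or an
   empty string if the filename has no extension. */
const char *file_extension(const char *filename) {
    const char *dot = strrchr(filename, '.');  /* find the last dot */
    if (dot == NULL || dot == filename)        /* no dot, or dot-first name */
        return "";
    return dot + 1;
}
```

Given "REPORT.TXT", the function returns "TXT" — the kind of tag that told TOPS-10 (and, decades later, every desktop operating system) how to treat a file.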

HP still specialized in laboratory equipment, such as the Fourier Analyzer 5451A for digital signal analysis (1972) that was built on a 2100A mini-computer.

When Hewlett-Packard decided to enter the minicomputer battlefield in earnest, it followed a different strategy that indirectly acknowledged the rise of software: hardware and software engineers cooperated in writing the specifications. The HP 3000, released in November 1972, was one of the first computers to be completely programmed in a high-level language instead of the prevailing assembly languages. In 1974 it also introduced its own database management system, IMAGE. The success of this machine transformed HP from a maker of laboratory equipment into a maker of office equipment.

At the same time, a small revolution in printing was taking place on the East Coast. In 1970 DEC introduced one of the first dot-matrix printers, the LA30, which set the standard for minicomputer printers. In the same year Centronics (a Wang division based in New Hampshire) introduced its Centronics 101, another dot-matrix printer but equipped with the parallel port interface that would become a standard for connecting printers to computers.

A gap began to grow between the machine and the users. For decades the computer user had been trained, guided and assisted by corporations such as IBM that had a sales force specifically for that purpose. DEC did not have such a sales force, and certainly the makers of pocket calculators didn't. Nonetheless, the users of time-sharing systems running on DEC and the users of programmable calculators like the HP-35 needed the same if not a greater degree of assistance in using their machines. This led to a proliferation of clubs (the evolution of the "user group") and of magazines (the evolution of the "newsletter") through which computer users helped each other, and soon the shops of electronic components and devices became the epicenter of this alternative computer world (like the ones on "Radio Row" in New York, before construction of the World Trade Center dispersed them).

The "do-it-yourself" spirit of the minicomputer world helped create an "alternative" view of what computers could do for society, other than serve military and business purposes. The spirit of the counterculture of the time began to percolate into the world of computers, as demonstrated by books such as Ted Nelson's "Computer Lib" (1974) and Joseph Weizenbaum's "Computer Power and Human Reason" (1976).

The Unix Operating System

In November 1971 Bell Labs unveiled the Unix operating system. It was the successor to MULTICS (Multiplexed Information and Computing Service), a time-sharing operating system for General Electric's mainframe GE-645 that had been jointly developed by MIT's Fernando Corbato (the father of time sharing) and by Bell Labs, and that was first deployed in 1965. Bell Labs pulled out of the project in 1969, and two of its scientists, Kenneth Thompson and Dennis Ritchie, worked on a smaller version of Multics, which became Unix, initially written in the assembly language of DEC's PDP-7.

In 1973 they also rewrote it in a programming language called C, developed by Ritchie the year before, so that Unix could be easily ported to other computers (although de facto until 1976 Unix only ran on DEC PDPs). That marked one of the first times that an operating system was written in a high-level language. In 1956 AT&T (the owner of Bell Labs) had signed an antitrust consent decree that forbade it from ever entering the computer business and that forced it to license its non-telephone inventions to the whole world. This old constraint turned Unix into a worldwide phenomenon, as it spread from one corner of the computer world to the other.
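Why rewriting the system in a high-level language made it portable can be suggested with a minimal sketch in C (an illustrative example, not code from Unix itself): the function below relies only on C's abstract character-stream interface, not on any PDP register or instruction, so the same source recompiles unchanged on any machine that has a C compiler.

```c
#include <stdio.h>

/* Count the lines in an already-open stream. Nothing here refers to
   a specific machine: `FILE`, `getc` and `EOF` are abstractions that
   each C compiler maps onto its own hardware, which is exactly the
   property that let Unix escape the PDP family. */
long count_lines(FILE *f) {
    long lines = 0;
    int c;
    while ((c = getc(f)) != EOF)
        if (c == '\n')
            lines++;
    return lines;
}
```

An assembly-language version of the same routine would have had to be rewritten from scratch for every new processor, which is what porting an operating system meant before C.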

Britain and Japan

In the 1940s Britain had pioneered computers. In the 1960s Japan was experiencing the most spectacular economic boom in the world. These two countries were the only countries that could potentially compete with the USA. Their stories were very different. The British computer industry had all the know-how necessary to match developments in the USA. The Japanese computer industry had to start from scratch in the 1950s. In both cases the government engaged in sponsoring long-term plans and in brokering strategic alliances among manufacturers. In Britain leadership came from the National Research Development Corporation or NRDC, in Japan from the Ministry of International Trade and Industry or MITI. In both cases initial research came from a government-funded laboratory: in Britain the National Physical Lab or NPL and in Japan the Electrotechnical Lab or ETL. However, the outcomes were completely opposite: Japan created a vibrant computer industry within the existing conglomerates (notably Fujitsu, Hitachi and NEC), whereas Britain's computer industry self-destructed within two decades.

In 1954 Fuji Telecommunications Manufacturing (later renamed Fujitsu) developed Japan's first practical computer, the electromechanical FACOM 100, followed by the Mark II that was designed by Electrotechnical Laboratory (ETL) and manufactured by Fujitsu. The first electronic computer of Japan, FUJIC, was created by Okazaki Bunji of Fuji (the photography company) in 1956. In 1956 the Electrotechnical Laboratory (ETL) of Japan built a prototype transistorized computer, the Mark III. Hitachi entered the computer business in 1957 with the HIPAC MK-1, followed by the transistorized HITAC 301, and NEC in 1958 with the NEAC-1101, followed almost immediately by the transistorized NEAC-2201. Fujitsu's first transistorized computer came out in 1961, the FACOM 222. In 1963 NEC introduced a series of computers that cloned Honeywell models. In 1964 RCA debuted a new business model with its Spectra 70 computer, that could run the same software as the IBM System/360. In 1970 Amdahl chose that as his company's strategy and created the industry of plug-compatible manufacturers (PCMs). This soon became a Japanese specialty, with Fujitsu and Hitachi conquering the largest share of the PCM market.

By 1979 Fujitsu would pass IBM to become Japan's main computer manufacturer. By then International Computers Limited (ICL), created by merging all the main British manufacturers, was nearly bankrupt. Eventually, Fujitsu would acquire it.
