Lab Inventors: A Case Study on Xerox PARC and its Innovation Machine (1969-83)
Database Lords: A Case Study on Larry Ellison's Oracle Corp. (1977-2010)
Magicians: Steve Jobs' Reality Distortion Field and Apple Computer (1976-2010)
by Arun Rao
The Creation of a Corporate Research Lab
Xerox Palo Alto Research Center
(PARC) was the US’s most successful corporate research lab
in the 1970s. Researchers invented the personal computer, the graphical user
interface (GUI), the laser printer, and Ethernet networking
technology. Many agree that the secret sauce that made PARC so successful was
its highly talented employees. Six factors brought these people together in a
creative environment. First was Xerox’s seemingly endless pool of cash devoted to
research. Second was a buyer’s market for talent: PARC was started in a weak
economy, when the federal government was cutting back on research funding after
the Vietnam War. Third was the state of computer technology, which was at an
inflection point due to Moore’s Law. Fourth
was its quality management, which knew how to hire the best researchers, give
them a broad mandate, and then let them play without directives, instructions,
or deadlines. Freedom to experiment was invaluable. Fifth were the premium
salaries that Xerox paid its researchers, about $30,000-35,000 in 1970, a nice
amount for a new PhD. Sixth was a paucity of startup opportunities; when PARC
was started, computer science researchers couldn’t easily find funding for a
startup, though that would change.
While Xerox failed to commercialize many of the wonderful technologies invented
at PARC, the company did earn billions from the ones it did ship, and so made
its money back many times over. A handful of people
deserve credit for starting PARC. Jack Goldman, Xerox’s Chief Scientist,
submitted in May 1969 his proposal for an “Advanced Scientific & Systems
Laboratory” to pursue research in computing and solid-state physics. As Goldman
told Xerox execs: “If you hire me, you
will get nothing of business value in five years. But if you don’t have
something of value in ten years, you’ll know you’ve hired the wrong guy.” Xerox’s CEO, Peter McColough, had the vision
and long-term good sense to approve and champion it. In 1969 McColough had
Xerox purchase Max Palevsky’s Scientific Data Systems (SDS) for $920 million in
stock. It was a computer company with a second-rate minicomputer product that
Xerox would divest years later. Yet McColough wanted the company to explore in
that direction, and he had Jack Goldman take the lead for PARC to create “the
office of the future.”
Goldman first recruited some star managers. The most
important was Bob Taylor, a former ARPA director. Next, coming in early 1970,
George Pake accepted the job of director of PARC and persuaded
Goldman to locate it in Palo Alto, California, near Stanford University.
Taylor had a gift for finding and cultivating talented researchers in
the computer science field. After the GI Bill paid for Taylor’s study of
psychology at the University of Texas, he eventually joined JCR Licklider at
the government-run ARPA. By December
1969, Arpanet, the precursor to the Internet, became
operational, with four nodes up and running. ARPA had a $14 million budget for
computer science research, more than the top 5 other grant-givers combined.
Taylor eventually became Ivan Sutherland’s deputy at ARPA, and then soon began
running the Information Processing Techniques Office (IPTO). Taylor would
approach the best computer science programs in the country and work with PhD
students and junior faculty to find cutting edge projects to promote, many in
the field of human computer interaction. Taylor was important because at ARPA
he funded the country’s first computer science grad programs at Stanford, CMU,
and MIT. He knew all the young researchers in the field and had their trust. He
knew enough to ask good questions and give direction, but he was, by his own
admission, not a specialist and would not micromanage research. So Taylor built one of the best
professional networks in the field, and met people like Alan Kay, who said in 1972 that “90% of all good things that I
can think of that have been done in computer science have been done funded by”
ARPA. The ARPA model was to find good people, give them a lot of money, and
then step back. If the researchers didn’t deliver in three years, they were dropped.
Alan Kay was one of the
spiritual leaders of PARC. In July 1969, Kay’s doctoral dissertation, “The Reactive Engine,” was
accepted at the University of Utah (he only got into the PhD program because David
Evans, the director, never looked at grades). Within Kay’s paper were early descriptions of his “Dynabook”
personal computer, basically an early laptop.
Kay was a non-stop
idea machine: half his ideas were brilliant but unworkable, while the other
half could be tested and prove revolutionary. He had been a child prodigy and
was pure motion - he could never sit still. Kay hated the
time-sharing computer terminals that everyone had to use at that point. Whether
it was a mainframe or a minicomputer, you had to share the machine, stare at
blinking green text, and belong to a nerdy few to use it at all. Kay wanted an
interface children could use, more like finger paints and color TV.
As PARC took off, the
1970s was a tough decade for Xerox. In 1970 IBM brought out its
first office copier, ending Xerox’s historic monopoly and introducing a period
of painful retrenchment at Xerox. Later, when some execs tried to kill PARC,
Xerox board member and Nobel laureate John Bardeen (co-inventor of
the transistor) fought to save it in board meetings, believing the $1.7
million budget was worth it.
Douglas Engelbart and SRI’s
Augmentation Research Center
Before delving more into PARC, it’s important to understand its neighboring
institution, the Augmentation Research Center (ARC), and Douglas Engelbart. Near the end of World War II, Engelbart was midway
through his college studies at Oregon State University when the Navy drafted
him. He served two years as a radar technician in the Philippines, where, on a
small island in a tiny hut up on stilts, he read Vannevar Bush’s 1945 article “As We May Think.” Bush wrote about computing and a future when
a “memex” device would augment human intelligence. A human could use it to
store “all his books, records, and communications, and which is mechanized so
that it may be consulted with exceeding speed and flexibility.” Engelbart’s experience as a radar technician convinced him that
information could be analyzed and displayed on a screen. He dreamt of knowledge
workers sitting at display “working stations,” probing through information
space and harnessing their collective intellectual capacity to solve problems.
Engelbart returned to
complete his Bachelor’s degree in Electrical Engineering in 1948 and earned a
PhD from UC Berkeley in 1955. After
a year of teaching at Berkeley as an Acting Assistant Professor, he took a
position at the Stanford Research
Institute (SRI) in Menlo Park. In October 1962, Engelbart published a key
document about computing and his beliefs on the modern workplace: “Augmenting Human Intellect: A Conceptual Framework.”
At SRI, Engelbart had a dozen
patents to his name and he proposed research to augment the human intellect
using computers. ARPA, a US government research agency, funded him and he
launched the Augmentation Research Center (ARC) within SRI. ARPA gave the team
funds to explore Man-Computer Symbiosis, plus technology for “time sharing” of
a computer’s processing power between a number of concurrently active on-line
users. Engelbart and his team
developed computer-interface elements such as bit-mapped screens, the mouse,
hypertext, collaborative tools, and precursors to the graphical user interface
in the mid-1960s, long before the personal computer industry did. At that time,
most individuals were ignorant of computers; experts could only use mainframes
with proprietary systems and difficult-to-master text interfaces. After two
years of slow progress, Bob Taylor at ARPA funded
a project to experiment and evaluate various available screen selection
devices, or pointers, for use in on-line human-computer interaction.
Taylor’s ARPA grant led to the modern computer mouse.
Engelbart conceived of
the device and Bill English actually built
the first wooden prototype. In 1967, Engelbart applied for a
patent (with Bill English) for a wooden shell with two metal wheels: a computer
mouse (US Patent 3,541,541). They
described the device as an “X-Y position indicator for a display system.” No one at the lab remembered who gave it the
name “mouse,” but the name stuck because the cord looked like a tail coming out the end. Sadly, Engelbart and English
never received any royalties for the mouse. SRI held the patent but had no idea
of its value; it later licensed the mouse to Apple Computer for
about $40,000.
A year later, Engelbart gave the
“Mother of All Demos.” On December
9, 1968, Engelbart and his group
of 17 researchers gave a 90-minute, live public demonstration of their work. It
was at a session of the Fall Joint Computer Conference held at the Convention
Center in San Francisco, attended by about 1,000 computer professionals. A
number of experimental technologies that have since become commonplace were
presented. It was the public debut of the computer mouse, hypertext
(interactive text), object addressing, dynamic file linking, video
conferencing, teleconferencing, email, and a collaborative real-time editor
(where two persons at different sites communicated over a network with audio
and video interface).
A year later, Engelbart’s lab became the second node on the Arpanet (the
predecessor network that evolved into the Internet). On October 29, 1969, a
link was established between nodes at Leonard Kleinrock’s lab at UCLA and Engelbart’s lab at SRI. Both sites would serve as the backbone
of the first Internet. In addition to SRI and UCLA, UCSB and the University of
Utah were part of the original four network nodes. By December 5, 1969, the
entire 4-node network was connected. Engelbart’s ARC lab soon became the first Network Information
Center; it managed the directory for connections among all Arpanet nodes. One
could say that Engelbart’s lab at SRI was the
physical home of the most important Arpanet/Internet node for its first few
years.
During his time at SRI, Engelbart developed a
complex philosophy about man improving through technology, a sort of
co-evolution through human-computer interactions. Engelbart was strongly
influenced by Benjamin Lee Whorf’s hypothesis of linguistic relativity. Whorf
argued that the sophistication of a language controls the sophistication of the
thoughts expressed by a speaker of that language. In parallel, Engelbart believed that
the state of current technology controls people’s ability to manipulate
information. Better manipulation led to more innovation and new, improved
technologies. People could even work in groups, where the collective IQ would
be larger than the sum of the parts (witness the modern laptop, created by
teams of specialists using other computers to design and prototype a laptop’s
different components). Engelbart pithily stated
to Reader’s Digest: “The rate at which a
person can mature is directly proportional to the embarrassment he can
tolerate. I have tolerated a lot.” He
was paid more by Reader’s Digest for this quote than for his many inventions.
By 1976, Engelbart had slipped into
relative obscurity. Some of his ARC researchers became alienated from him and
left to join Xerox PARC. Engelbart saw the future
in collaborative, networked, timeshare (client-server) computers, while younger
programmers preferred working on personal computers (individual machines that
would not be shared and controlled by a centralized authority). ARPA funding
stopped by 1977, and SRI transferred the lab to Tymshare,
which tried to commercialize some of Engelbart’s software. There, however,
Engelbart was marginalized. Management, first at Tymshare, and
later at McDonnell Douglas (which took over Tymshare in 1984), liked his ideas
but never committed the funds or the people to further develop them. Engelbart retired from
McDonnell Douglas in 1986 and in 1988 founded the Bootstrap Institute with
modest funding to promulgate his ideas.
Hiring the Best Computer Scientists Around
On July 1, 1970, Xerox’s Palo Alto Research Center
(PARC) officially opened its doors at 3180 Porter Drive,
near Stanford University. New Haven, near Yale, had been the first choice for
the location, but Goldman was put off by the university’s snobbery and its
hostility to enterprise on its doorstep. Berkeley had no
dedicated real estate near the campus, and Santa Barbara had no large airport.
The physical and cultural climate in Palo Alto helped. Pake had hired Bob
Taylor to help him
staff the Computer Science Lab. Taylor forced the researchers to build things
they could use daily and avoid prototypes and playthings that just sat on a
shelf. He described his position at Xerox like this: “It’s not very sharply
defined. You could call me a research planner.”
Taylor made two key hires for PARC. First, in November he hired the engineers of the
failing Berkeley Computer
Company, including Butler Lampson, Chuck Thacker, and Peter Deutsch. Second,
Taylor raided Doug Engelbart’s lab at SRI’s Augmentation Research Center, where
there was no desire to make a product or prototype, but just to search for
knowledge. Bill English, a brilliant hardware engineer, left for PARC and
other Engelbart protégés followed.
PARC’s first big project came from a corporate squabble.
The researchers decided to build a clone of the DEC PDP-10,
the standard minicomputer of the time that all the researchers
wanted. Xerox had tried to
force them to take an inferior SDS machine because Xerox owned SDS. Instead,
the PARC researchers lost the battle to buy a PDP-10 but won the war by just
ordering parts and putting together a PDP-10 clone. It was a great bonding
exercise and a waste of one year. They called it the MAXC as a comeuppance to
Xerox management and the poor products that Max Palevsky’s SDS made.
The PARC researchers were
tinkerers and hackers. They liked to make things. Generally the office had a
feeling of collegiality and a grad school environment. It had lots of informal
collaboration or “Tom Sawyering,” with someone proactively setting forth an
idea or project and then convincing others to join to attack it. If the problem
or project got momentum, the ad hoc team could spend 3-6 months on it; if not
everyone dispersed and looked for something else. One project was to make
replicas of the expensive Bose 901 speaker systems, where a set cost $1,100.
They reverse-engineered the speakers and made 40 pairs for the team at a cost
of $125 per set. Alan Kay once said “a
true hacker is not a group person. He’s a person who loves to stay up all
night, he and the machine in a love-hate relationship.” Hackers were nerdy kids who were smart but
uninterested in conventional goals. Computing was ideal because no credential
or PhD was required and coders could be independent artisans, selling directly
to customers based on the quantity and quality of output and not pedigree or
something else.
One PARC institution was
“Dealer,” a weekly meeting in a lounge with sofas and bean bag chairs at lunch
time, usually Tuesdays. Attendance was mandatory for the computer science
researchers. It began with housekeeping, and then one person would be the
“dealer” and take over, to set a topic for discussion and rules of debate.
Topics were unconstrained, like how to take apart and re-assemble a bike, how
programming algorithms are similar to kitchen recipes, or a presentation on the
sociolinguistics of the Nepalese language and culture. Discussion and blunt
talk were common, with people calling each other out with interjections like
“bullshit” and “nonsense,” not to mention denunciations like “That’s the
stupidest thing I’ve heard” or “It’ll never work.” It was a feral seminar, a match of intellects.
By the summer of 1972, Kay and a
hand-picked team completed the first version of their revolutionary
object-oriented programming language, Smalltalk, which would heavily influence
such modern programming systems as C++ and Java. Kay had the idea in
a shower in Building 34 on the Xerox campus for an
entirely new syntax of computer programming based not on data and procedures,
but on “objects” that would be discrete modules of programming. Object-oriented
languages are easier to code in because as a program becomes more complex, much
complexity is kept within an object. So a programmer can manipulate the program
more easily and stick to the big picture rather than getting lost in the
granular code. Because anything could be an object, like a list, word, or
picture, Smalltalk was well suited to a graphical display. It was the language that
made the Alto really useful.
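The idea Kay had in that shower - complexity kept inside discrete “objects” that all respond to the same messages - can be sketched in a few lines. This is a hypothetical illustration in Python (not Smalltalk, and not PARC’s actual code): a document manipulates a list, a word, and a picture without knowing how any of them works inside.

```python
class Word:
    """An object wrapping a piece of text; its drawing logic is private to it."""
    def __init__(self, text):
        self.text = text
    def draw(self):
        return f"text:{self.text}"

class Picture:
    """An object wrapping a bitmap; same 'draw' message, different internals."""
    def __init__(self, w, h):
        self.w, self.h = w, h
    def draw(self):
        return f"bitmap:{self.w}x{self.h}"

class Document:
    """Holds heterogeneous objects and sticks to the big picture:
    it sends each one the same message and never touches its internals."""
    def __init__(self, items):
        self.items = items
    def render(self):
        return [item.draw() for item in self.items]

doc = Document([Word("Alto"), Picture(640, 480)])
print(doc.render())  # ['text:Alto', 'bitmap:640x480']
```

Because any new kind of object only has to answer the same messages, a program can grow without the programmer getting lost in granular code - the property the text attributes to Smalltalk.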
Around that time, Stewart Brand wrote an article about
PARC for Rolling Stone titled
“Spacewar.” It described a game of that name running on the researchers’
computers, one that joined computing with live graphic
displays. It was play, part of no one’s scheme or theory, done purely for
competitive fun. Yet it encompassed many of the things the researchers were
trying to do for computing. As Brand noted, it was interactive in real-time,
used live graphics, served as a human communication device, was on stand-alone
computers, and was quite fun in a way that only games could be.
The PARC researchers
would go on to make numerous devices that lived up to these principles.
The Miraculous Inventions of PARC
In early 1971 Gary Starkweather transferred to PARC from Xerox’s research lab in Webster, New York (near Rochester), bringing with him the concept of the laser printer.
Starkweather had been an outcast at Webster, where he
created a laser technology to “paint” an image onto a xerographic drum with
greater speed and precision than ordinary white light. In November 1971
Starkweather completed work on the world’s first laser computer printer. He had
modulated a laser to create a bit-mapped electronic image on a xerographic
copier drum. The commercial project was approved and killed three times, saved
only by Jack Lewis, a Xerox executive who ran the printing division and ignored
orders. In 1972, the Lawrence Livermore Lab put in an order
for the printers, which Xerox declined to fulfill (the production run was too
small, and the company was unwilling to create an early-adopter market). A corporate committee decided
to delay for three years until a conventional high-speed printer, the 9000
series, was made and sold. The Xerox 9700 laser printer only came out in 1978,
and that was after Burroughs showed it in a demo at the Hannover Messe. The
laser printer and its successors would generate billions in sales.
In September 1972, after MAXC was completed, Thacker
and Lampson invited Kay to join their
project to build a small personal computer. The machine would be known as the
Alto, and have a keyboard, screen, and processor in a
portable, suitcase-sized package (it would later gain a mouse and GUI). The
idea was that processors would be cheap enough in 5-10 years for every person
to have their own “personal computer” instead of sharing time on an office
computer.
In November 1972, Thacker began design work on the
Alto. The original plan was to make 30 units for the PARC computer science
lab. The screen would be 8.5x11” to mimic paper and the projected cost was
$10,500 per machine. In the end, Xerox made 2,000
Altos at a cost of about $18,000 per machine, which fell to $12,000 after a
high-volume program was put in place. There were some technical innovations
like micro-parallel processing (to shift the memory access problem to the
microprocessor) and a new high-performance display that used less memory (and
so allowed the user to actually run apps).
Meanwhile, in June 1972, Bob Metcalfe encountered a
technical paper by Norman Abramson describing Hawaii’s ALOHAnet, a radio
network. Metcalfe would use several principles in that paper while designing
the first Ethernet, a computer networking technology for local area networks
(it’s how most office networks are connected even in 2010). A month later, Metcalfe wrote a patent
memo describing his networking system, using the term “Ethernet” for the first time.
Metcalfe had come from Harvard after they rejected his doctoral thesis on how
networks transmit data in digital packets because it was “insufficiently
theoretical.” He would later use the
concepts in that thesis to build a multi-billion dollar company and transform
the networking industry (he also resubmitted his thesis with more math, and it
was accepted). Metcalfe had a huge advantage over many researchers because he
was the Arpanet liaison or “facilitator” at MIT in 1971, and so saw the early
networking technical issues and he had valuable personal connections with
people on Arpanet. Instead of getting a university position after graduation,
Metcalfe chose Xerox for the high
pay, beautiful weather, and pure research freedom with no teaching
responsibilities or worry about tenure. Metcalfe hooked up MAXC to Arpanet, but
other proprietary local-network systems were too expensive. Taylor had set
specs for a local area network linking the Altos: it should cost no more than
5% of the computers it connected, be simple with no complex hardware, and be
reliable and easily expandable (he didn’t want to splice cable all the
time). Metcalfe used Abramson’s paper, adapting it for Altos and building in
redundancy (a string of verification bits known as “checksum”) and an algorithm
to deal with interference. It would also require a physical line and Metcalfe
called it the Ethernet.
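The two ideas named here - verification bits and an algorithm for interference - can be sketched briefly. This is an illustrative Python toy, not Metcalfe’s design: real Ethernet frames carry a 32-bit CRC rather than a simple sum, and the backoff shown is the truncated binary exponential backoff that later standards codified.

```python
import random

def checksum(frame: bytes) -> int:
    """Toy verification value appended to a frame so the receiver can
    detect corruption. (Real Ethernet uses a stronger CRC, not a sum.)"""
    return sum(frame) % 65536

def backoff_slots(attempt: int) -> int:
    """Dealing with interference on a shared wire: after the n-th collision,
    wait a random number of slot times in [0, 2^n - 1] before retrying,
    so colliding senders are unlikely to collide again."""
    n = min(attempt, 10)  # cap the exponent so waits stay bounded
    return random.randrange(2 ** n)

# Sender appends the checksum; receiver recomputes and compares.
frame = b"hello alto"
tagged = frame + checksum(frame).to_bytes(2, "big")
received, check = tagged[:-2], int.from_bytes(tagged[-2:], "big")
assert checksum(received) == check  # frame arrived intact
```

The randomized, growing wait is what lets many machines share one cable with no central coordinator - the property that made the design cheap enough to meet Taylor’s 5% cost target.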
Initially none of the Alto users wanted to
use Ethernet at a $500 cost, and it competed with “sneakernet,” that is, people
using hard disks and walking between labs with sneakers to transfer data. But
when an early version of Starkweather’s laser printer was connected to the
Ethernet, the resulting “EARS” system became too valuable to pass up:
Ethernet for the network, the Alto for the personal computer, a Research
character generator for early word processing, and SLOT (the name for the
laser printer) to make professional paper documents. On March 31, 1974 Metcalfe filed a patent for
Xerox (awarded two
years later). He then quit for a job at Citibank, where he got higher pay and a
chance to work on its electronic fund transfer system. He was the first top
researcher to leave PARC.
Bob Metcalfe of 3Com (2010)
The Alto is the First
Personal Computer (PC)
In April 1973, the first Alto became
operational, displaying an animated image of Sesame Street’s Cookie Monster.
The Alto was described in a memo in 1972 by Butler Lampson (himself inspired by
the “Mother of All Demos” of Doug Engelbart); Chuck Thacker was the main designer of the Alto.
Lampson’s memo had proposed a system of interacting workstations, files,
printers, and devices linked via one co-axial cable within a local area
network, whose members could join or leave the network without disrupting the
traffic.
The Alto was
revolutionary because it was a personal workstation for one, not a room-sized,
time-sharing computer for many, meant to sit on a single desktop. It is
credited as being the first “personal computer” (PC) in a world of mainframes
(note that some would argue for other PCs being first, like the Olivetti P101).
The Alto had a bit-mapped display, a graphical user interface (GUI) with windows and icons, and a “what you see is what
you get” (WYSIWYG or “wizzy-wig”) editor. It also had file storage, a mouse,
and software to create documents, send e-mails, and edit basic bitmap pictures.
Also in April 1973, Dick Shoup’s “Superpaint” frame buffer recorded and stored
its first video image, showing Shoup holding a sign reading, “It works, sort
of.” It was the first workable paint
program.
The Alto got better as
PARC’s programmers built apps for it. In fall 1974, Dan Ingalls invented
“BitBlt,” a display algorithm that later made possible the development of key
features of the modern computer interface (overlapping screen windows, icons,
and pop-up menus which could be manipulated with a click of the mouse). This
was the desktop metaphor used by 99% of personal computers around the world
even in 2010. At the same time, Charles Simonyi, Tim Mott, and Larry Tesler began work on two
programs which would become the world’s first user-friendly computer word
processing system.
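At its core, BitBlt (“bit block transfer”) copies a rectangular block of pixels from one bitmap into another - which is how windows, icons, and pop-up menus get painted onto the screen. The sketch below is a hypothetical, simplified Python version; Ingalls’ real algorithm also handled clipping, overlapping regions, and logical combining rules (AND, OR, XOR).

```python
def bitblt(dst, src, dx, dy):
    """Copy the src raster into dst at column dx, row dy.
    Bitmaps here are lists of rows of 0/1 pixels."""
    for y, row in enumerate(src):
        for x, pixel in enumerate(row):
            dst[dy + y][dx + x] = pixel
    return dst

screen = [[0] * 6 for _ in range(4)]  # blank 6x4 "screen"
icon = [[1, 1], [1, 1]]               # a 2x2 "icon"
bitblt(screen, icon, 2, 1)            # paint the icon at column 2, row 1
```

Because the same block move draws a menu, reveals a window, or scrolls text, one fast primitive was enough to support the whole desktop metaphor.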
The Alto, BitBlt, and Bravo basically created the modern
industry of desktop publishing, used by office workers around the world.
Ordinary people at home or work could turn out professional quality
newsletters, magazines, books, quarterly letters, and so on, faster and more
easily.
Bravo, the word processor, has a fascinating story.
Charles Simonyi, a Hungarian computer science student who defected to
the US, was a key actor. His defection, as a side note, caused the Hungarian
government to fire his father from a teaching job at a Budapest engineering
institute, showing how the vaunted “Soviet science” system devoured its best
talent for idiotic political reasons. Simonyi built on Butler Lampson’s ideas for
holding an entire document in memory using “piece tables” to create an app
called Bravo. It was the first “what you see is what you get” (WYSIWYG) word
processor to run at a reasonable speed – a genuinely useful application. People
started coming to PARC to use it for
personal stuff like PTA reports, letters to professional bodies, resumes, and
so on. Their friends writing PhD theses wanted to use it. Larry Tesler and Tim
Mott improved the Bravo user interface to create something similar to the
menu-based interface people use in MS Word in 2005. They added features like
“cut” and “paste” after watching how non-engineers actually interacted with
the interface.
In early
1975, Xerox established the
System Development Division, as a stronger attempt to commercialize PARC technology.
More than five years later, SDD would launch the Xerox Star. Meanwhile, an
Albuquerque startup called MITS was selling the Altair 8800, a hobbyist’s personal
computer sold as a mail-order kit. It made the cover of Popular Electronics and
caught the attention of a generation of youthful technology buffs—among them,
Bill Gates and Paul Allen.
In February 1975, PARC engineers
demonstrated for their colleagues a graphical user interface for a personal
computer, including icons and the first use of pop-up menus. This concept would
later be stolen by Steve Jobs and Bill Gates and be
developed into the Windows and Macintosh interfaces of
today. A month later, PARC’s permanent headquarters at 3333 Coyote Hill Road
formally opened.
Others Commercialize PARC Technology
Due to one bad corporate decision, a billion dollar
product was lost. In August 1977, Xerox shelved a plan
to market the Alto as a commercial
project. It closed the door to any possibility that the company would be in the
vanguard of personal computing. If Xerox had followed through with its plan, it
would have released a PC in mid-1978, beating the IBM PC by three
years with a much better machine. The project was killed because Xerox
President, Archie McCardell, was an accountant who didn’t get technology. Also,
because of Xerox’s poor organizational structure, the Altos would have to be
made by a Dallas manufacturing facility that made typewriters. The managers in
Dallas just wanted to keep making the same product and get their highest
short-term bonuses. Xerox’s top execs
just didn’t get the Alto or PCs. They were used to a leasing business model
where customers leased a copy machine and paid annual fees for the copies used
based on the meter. Their fear was that if there was no print copy, “how would
Xerox get paid” over and over again?
However, Xerox did sell some
early Altos running Bravo to the Carter White House in 1978, and eventually to
Congress for their offices. John Ellenby tried to more aggressively push the
sales of Altos. But after senior management interfered too much (over the
course of 3 years), Ellenby quit Xerox in 1980. He started his own company,
Grid Systems, making some of the world’s first laptop computers.
At the same time, during a “Futures Day” at the Xerox World
Conference in Boca Raton, Florida, personal computers, graphic user interfaces,
and other PARC technologies
were introduced to a dazzled sales force. Other than the laser printer,
however, few reached market under the Xerox name.
In June 1978, PARC scientists
completed the Dorado, a high-performance PC, and Notetaker, a suitcase-sized
machine that became the forerunner of a generation of laptops. The next month,
PARC made a mistake by starting a program in silicon-based integrated circuits
and building an expensive fabrication lab. Building a fab lab briefly put Xerox in competition
with Intel, a hardware component company which Xerox had no
business competing against (and Xerox never made money in that business). Xerox
was attempting to do something internally that it could do much better and more
cheaply by sourcing externally.
In December 1979, two key events occurred. First,
Stanford University
Professor James Clark designed the “Geometry Engine,” the first 3-D computer
graphics chip and later the foundation of his billion-dollar company, Silicon
Graphics, Inc. He had used design principles formulated at PARC. The company’s chips allowed the computer-aided
design of cars, aircraft, roller-coasters, and movie graphics like “Jurassic
Park.” Clark’s first test chip was built
by Lynn Conway at PARC, who came from IBM in 1972. She
had written a book with Carver Mead on VLSI chip design (how to pack more
circuits into a microprocessor). PARC then offered professors at a dozen
schools the use of PARC’s lab to create their own specialty microprocessors.
Clark moved to PARC’s offices and focused for 4 months in the summer of 1979 to
create his chip.
At the same time Carver Mead went to Xerox headquarters to
suggest they do a better job of commercializing PARC technology. He
suggested Xerox set up an internal venture capital arm to fund startups with
technology made by their scientists. Xerox would take an equity role and have a
strategic position, while incentivizing entrepreneurial scientists to run
companies. Xerox declined.
The second big event in December 1979 was when Steve
Jobs and a team of
Apple Computer
engineers visited PARC twice and took
copious notes. They came because one of Jobs’ key designers, Jef Raskin, had many relationships with PARC researchers and he
was impressed with their work. Jobs had signed a
deal with Xerox letting Abe
Zarem’s Xerox Development Corporation, a subsidiary, invest in Apple pre-IPO in
exchange for “marketing help.” It turned
out that the technology demos were much more important, and they gave Jobs a demo that no
other outsider had received at that point. After observing its hardware and
software in action, Jobs and his team
took steps to incorporate the Alto’s design principles and the GUI into the Apple
Lisa and Macintosh. Jobs even poached
some PARC talent, like Larry Tesler, who would eventually become Apple’s Chief
Scientist.
In September 1980, PARC finally
released its first invention to the world for commercialization. Along with
Intel and Digital
Equipment, Xerox issued a formal
specification for the Ethernet and made it publicly available for a nominal
licensing fee. Ethernet quickly became
the networking technology of choice. PARC scientists also worked on an Internet
Protocol standard, called PARC Universal Packet, or “Pup,” which eventually
became a crucial part of the Arpanet standard known as TCP/IP. It became the
standard for much of the data passing through the Internet. At the same time,
John Shoch created a “worm,” an early ancestor of the computer virus, which temporarily shut
down the entire network and all the Altos at PARC one day in 1978.
Xerox did have a new
computer product; it just wasn’t a good one for the market. In April 1981, at a Chicago trade show, Xerox unveiled the Star workstation to wide acclaim as the Xerox 8010 Information System, with a beautiful GUI and desktop metaphor. It was the commercial offspring of the Alto and other PARC technology.
However, the Star was slow and cost $16,600. Moreover, customers needed to buy
2 to 10 at a time, and had to install Ethernet and a laser printer. The costs
were daunting. That August IBM unveiled its Personal Computer, forever altering the commercial landscape of office computing and making the Star obsolete. IBM’s machine cost only $5,000 and didn’t have the pretty GUI. It didn’t have icons, windows, a desktop metaphor, e-mail, or Internet access; it crashed randomly. Yet it was good enough for basic business tasks and apps, and
it sold very well. Only 30,000 Stars were sold, compared to millions of IBM
PCs.
PARC’s talent was frustrated and wanted to leave. Earlier
that year, Charles Simonyi was thinking
about next steps. Bob Metcalfe suggested he
talk to a 22-year-old kid named Bill Gates who ran a
startup called Microsoft. Gates and Simonyi hit it off right away with
high-bandwidth conversations on the nature of computing, the role of
technology, and future product ideas. Simonyi felt the Xerox corporate brass
didn’t know much about technology and didn’t care – they were bean-counters,
ex-Ford finance people that McCardell had hired to run the company. Gates on
the other hand was a visionary and a first-rate, cut-throat businessman. As
Simonyi put it, “you could see that Microsoft could do things one hundred times faster, literally.” So
Simonyi left PARC for Microsoft, where he became a “messenger RNA of the PARC
virus.” Within six years, the market
capitalization of Microsoft was higher than
Xerox’s, and Simonyi plotted a strategy to exploit a range of markets that
Xerox fumbled on: word processors,
spreadsheets, e-mail, and voice recognition. Simonyi especially helped on the
project to create Windows, a first-rate GUI operating
system that competed with Apple.
Another dispirited engineer who left PARC in 1981 was
Chuck Geschke, who was frustrated that Xerox wasn’t commercializing their work. He went on to co-found Adobe, a billion-dollar company whose PostScript page-description language helped computer users make crisp, printable, presentable, and professional documents with text and graphics. The company’s technology became the de facto standard of computer typesetting and still held that position in 2010.
In May 1983, Apple introduced the
Lisa, a personal computer with a graphical interface based
on principles developed at PARC. Jobs joked that
Xerox couldn’t
compete with his scrappy startup because Xerox’s cost structure was too high.
The company was fat and bloated. As one Xerox engineer joked, “If we built a
paper clip it would cost three thousand bucks.”
In September 1983, Bob Taylor resigned from
PARC under pressure.
Within a few months many of the center’s top computer engineers and scientists
resigned in sympathy. Many went to Taylor’s new employer, the DEC Systems
Research Center. In January 1984, Apple introduced the
Macintosh, the popular successor to the Lisa and the most
influential embodiment of the PARC personal computer, with a striking
“1984”-style television commercial during the Super Bowl.
Did Xerox PARC Blow It?
Why did PARC invent so many
great technologies and then fail to commercialize them? The first part of this chapter listed factors
leading to success. Now we turn to why Xerox failed at
commercialization. As Steve Jobs said in a
speech in 1996: “Xerox could have owned
the entire computer industry… could have been the IBM of the
nineties… could have been the Microsoft of the
nineties.”
One reason is that the company’s decision-making on
dozens of occasions was not about new technologies and opportunities, but about
personalities, politics, and short-term incentives.
The second was that the company’s managers saw it as a
copier company, not as a computer or a publishing company, let alone an enabler
of the “office of the future.” The
managers were fixated on the leased copier business model, and the sales force
was trained on copiers and typewriters, not new office technology. Also the
purchasing managers for computers were professional IT people, not the managers
who ordered copiers.
A third reason was that Xerox wouldn’t allow
entrepreneurial scientists to do spinouts and avoid the corporate bureaucracy.
New ventures had to be led by people running established divisions, people who
hated risk-taking. So Xerox lost talent like Clark, Simonyi, Geschke, Metcalfe,
and others who did startups that became billion-dollar companies much bigger
than Xerox.
Finally, the fault lay with PARC itself, which
often acted as a pure research center. The scientists were generally far away
from customer development, sales, or intrapreneurial development. The few Xerox execs (not PARC researchers) who tried to commercialize products were crushed by the corporate bureaucracy. So while PARC was a success at innovation, it was mostly a failure at commercialization.
Selling Databases to the CIA
In 1977, three software engineers, Larry Ellison, Bob Miner, and Ed Oates, founded Software Development
Laboratories. They completed their first product, Oracle Version 1, in
less than one year. Their customer was the Central Intelligence Agency (CIA).
The engineers had significant experience designing customized database programs
for government agencies. Miner and Ellison had persuaded
the CIA to let them work on a lapsed $50,000 contract to build a relational
database program, after
they did some consulting work for a company called Omex.
A relational database allows business
users to match data by using common characteristics (a set of relations).
Relational databases, as implemented in relational database management systems
(RDBMS), have become the main place to store digital information used for financial
records, manufacturing and logistical information, personnel data, all Internet
records, and much more. They have become the guts of the global technology
infrastructure. For example, a data set containing all the real-estate
transactions in a town can be grouped in many ways: by the year the transaction occurred; by the
sale price of the transaction; by the buyer’s last name; and so on. An RDBMS allows an organization to store massive amounts of data and retrieve it along any of these dimensions.
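The groupings described above map directly onto SQL’s GROUP BY clause. A minimal sketch using Python’s built-in sqlite3 module (the table and column names are hypothetical, chosen to match the real-estate example):

```python
import sqlite3

# In-memory database with a hypothetical real-estate transactions table.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE transactions (
    year INTEGER, price INTEGER, buyer_last_name TEXT)""")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1982, 90000, "Smith"), (1982, 120000, "Jones"), (1983, 95000, "Smith")],
)

# The same rows, grouped by different common characteristics (relations).
by_year = conn.execute(
    "SELECT year, COUNT(*) FROM transactions GROUP BY year ORDER BY year"
).fetchall()
by_buyer = conn.execute(
    "SELECT buyer_last_name, SUM(price) FROM transactions "
    "GROUP BY buyer_last_name ORDER BY buyer_last_name"
).fetchall()

print(by_year)   # [(1982, 2), (1983, 1)]
print(by_buyer)  # [('Jones', 120000), ('Smith', 185000)]
```

One set of stored rows answers both questions without restructuring the data, which is the point of the relational model.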
Oracle databases were
the guts of the electronic world. To put these databases in context, 98% of the
Fortune 100 companies depended on Oracle software to manage their information
in 2001. Every time someone would use a credit card, buy a plane ticket,
reserve a hotel room, order from any catalogue, surf the Internet, search
Google or Yahoo, get cash from an ATM, or settle a phone bill,
odds were that the person interacted with an Oracle database.
The idea for the product came from IBM research. Ellison and Miner came up with the RDBMS idea after reading about it in the IBM Journal of Research and Development and realizing that no one had commercialized it. Their key insight was that IBM was interested in the RDBMS, which many believed would allow computer users to retrieve corporate data in almost any form. This came from an IBM innovation called the Structured Query Language (SQL), a computer language that would tell a relational database what to retrieve and how to display it. Ellison and Miner had a hunch that IBM would incorporate the new relational database and SQL into future computers, mostly mainframes. So they set out to provide a similar program for Digital Equipment minicomputers and other types of machines, at a time when conventional wisdom held that it wouldn’t work and would be too slow. It would be the first commercial relational database. The founders renamed the company Relational Software, Inc. (RSI).
To start the company, Ellison and Miner pooled $1,500
in savings to rent office space in Belmont, California. Ellison became
President and CEO and took charge of sales and marketing for the new company,
while Miner supervised
software development. They convinced the venture capitalist Donald Lucas to
become chairman of the board, after he stumbled upon their company working late
into the night in its early offices on 3000 Sand Hill Road. While the first version of the program was never
officially released, Version 2 came out in 1979, and was the first commercial
SQL relational database management system
(RDBMS), running on the PDP-11, a popular minicomputer at the time. The product came out two years before IBM’s version and attracted customers who used it for simple business functions.
In the start-up days of Oracle, Bob Miner was the lead
engineer, programming the majority of Oracle Version 3 by himself. As head of engineering, Miner’s management style differed from that of Larry Ellison, who ran Oracle’s hard-driving sales team. While
Miner expected his
engineers to produce, he did not agree with the demands laid upon them by
Ellison. He thought it was wrong for people to work extremely
late hours and he wanted them to see their families.
Ellison just wanted
results. Bruce Scott, an early Oracle database engineer, felt Oracle was
successful mainly because of Ellison’s charisma, vision, and determination. One
example Scott gave was when the engineers in the startup had space allocated to
them and needed to get their terminals strung to the computer room next door.
They didn’t have anywhere to string the wiring. Larry walked in, picked up a
hammer, and slammed a hole in the middle of the wall. He said, “There you go,”
and then walked away.
In 1981 RSI began developing simple reporting tools
after recognizing that customers wanted to write applications to enter and
format data into usable reports. By 1982, RSI was profitable with 24 employees,
75 customers in the mainframe and minicomputer space, and reported annual
revenues of nearly $2.5 million. Ellison was hiring salesmen to aggressively increase revenues while Miner was more
circumspect. Ellison hit the road
and did demos of the product across the intelligence community to the CIA, NSA,
Air Force Intelligence, and so on. Ellison was working 14
hours or more a day, and even had to cut his own salary at one point. His
second wife left him as he focused on making products.
About a quarter
of Oracle’s 1982 revenues were poured back into research and development. This
led in 1983 to an Oracle innovation: the first commercially available portable RDBMS. The portable RDBMS enabled
companies to run their DBMS on a range of hardware and operating systems,
including mainframes, minicomputers, workstations, and personal computers.
Oracle doubled its revenues to over $5 million in 1983. That same year, Miner and Oates
rewrote the database code in the C programming language. After that, their
RDBMS would no longer be bound by any single platform and could be easily
modified for many types of computers. RSI became Oracle Corporation.
The next few
years brought more product innovations and growing revenue. In 1985 Oracle released
versions 5.0 and 5.1 to operate in client/server mode so multiple desktop
business applications could access a single database on a server. Oracle also
began to explore clustering, an early move toward flexible, scalable software.
That year, the company brought in more than $23 million in revenues. This more
than doubled to a record $55.4 million in 1986.
Two important events happened in 1986. First, the database industry adopted SQL as the standard language for relational database management systems. This in turn led to increased market acceptance of Oracle‘s SQL-compatible RDBMS.
Second, on March 15, 1986, a decade after the founding
of the company, Oracle had an initial
public stock offering of 2.1 million shares on the NASDAQ exchange. The company
had a market value of $270 million and Ellison owned 39% of
the stock. At the time, the company had 450 employees and was the
fastest-growing software company in the world. It had recorded 100% or better
growth in revenues in eight of its first nine years. Much of this growth came
from Oracle’s targeted end users: multinational companies with a variety of
previously incompatible computer systems. By 1986 Oracle’s customer base had
grown to include 2,000 mainframe and minicomputer users. These customer firms
operated in the aerospace, automotive, pharmaceutical, and computer manufacturing industries, not to mention government organizations.
The IPO would reward investors well. Twenty years
later, Oracle had a global
workforce of 65,000 and annual revenue topping $15 billion. A $10,000
investment in the IPO of Oracle back in 1986 would by October 2006 be worth
over $4 million. Revenues would go from $20 million in 1986 to $11 billion in
2001, when the company would have operating margins of 35% and a cash pile of $6-8 billion.
In 1986 Oracle expanded its
RDBMS product line and released a distributed DBMS based on the company’s
SQL*Star software. Under the distributed system, computer users could access
data stored on a network of computers in the same way and with the same ease as
if all of a network’s information were stored on one computer.
Getting into the Applications Market
By 1987 Oracle was fairly
successful. It had topped $100 million in sales and had become the world’s
largest database management software company, with more than 4,500 end users in
55 countries. But Ellison was restless,
and he wanted to branch out from databases into applications (computer
programs) that used the information in databases for business purposes. Oracle
created an applications division and began building its own business-management
software, integrated closely with its database.
Ellison had decided by
this point that he was a product guy and didn’t like most of the CEO duties. So
he concentrated on product and delegated all the rest, something Ellison called “closer
to abdication than delegation.” Ellison was also a
brilliant recruiter of programming talent because he knew the product so well.
While the sales and executive team could have a revolving door, there was a
“kernel group” who created the core product and stayed, accumulating knowledge
and experience to continuously make the software better.
A year later Oracle introduced a line of accounting programs for corporate bookkeeping, including a
database for personal computers to work in conjunction with the Lotus
Development Corporation’s popular Lotus 1-2-3 spreadsheet program. The company
also introduced its Oracle Transaction Process Subsystem (TPS), a software
package designed to speed up processing of financial transactions. Oracle’s TPS
opened a new market niche for the company, targeting customers such as banks
needing to process large numbers of financial transactions in a short period of
time.
Meanwhile, hot backup allowed employees to continue working in the system while administrators duplicated and archived data, which reduced overhead costs. Oracle also introduced PL/SQL, a procedural language that let users process data while it remained in the database.
In 1989 Oracle was booming.
The company was added to the S&P 500 index of stocks. Oracle relocated from
Belmont to a new, larger office complex in nearby Redwood Shores, California.
Seeking to break into new markets, Oracle formed a wholly-owned subsidiary,
Oracle Data Publishing, in December 1989 to develop and sell reference material
and other information in electronic form. Oracle closed its books on the 1980s
posting annual revenues of $584 million, netting $82 million in profit.
Times were good but the growth brought trouble. In March 1990 Oracle‘s revenues jumped 54% but net earnings rose only by
1%. Oracle’s first flat earnings quarter, attributed to an accounting glitch,
shook Wall Street out of its long love affair. Oracle had been booking revenues
too aggressively by discounting product and shipping incomplete or buggy
software. The day after the earnings announcement the company’s stock plummeted
$7.88 to $17.50 in record one-day volume with nearly 21 million of the
company’s 129 million shares changing hands.
In April 1990 a dozen shareholders brought suit
against Oracle, charging the company had made false and misleading
earnings forecasts. On the heels of this lawsuit, Oracle announced it would
conduct an internal audit and immediately restructure its management team with
Lawrence Ellison assuming the
additional post of chairman, while Lucas remained a director. Oracle also
formed a separate domestic operating subsidiary, Oracle US, aimed at addressing
its domestic management and financial problems, which the company attributed to
poor earnings.
Part of the problem was Larry Ellison’s management style. He was a sprinter who would work
hard, rest, and then sprint again. Ellison could get bored with the company and then take weeks off to travel the world or spend time on his
yachts. He could listen to his executives with intensity or completely ignore
them. Ellison felt good when
everyone said he was nuts because it was a sign that Oracle was trying to do
something innovative. But Ellison also paradoxically cautioned, “when people
say you’re nuts, you might be nuts. You’ve got to constantly guard against that
possibility. You don’t want people saying you’re nuts too often.”
For the fiscal year ending May 31, 1990, Oracle initially
posted record sales of $970.8 million and profits of $117.4 million; but these
results were below Oracle’s own estimates. The company’s stock price fell from
a high of $28.38 to $19.88 then plunged to $11.62 in August after an internal
audit forced the company to restate earnings for three of its four fiscal
quarters. Jeff Walker, the CFO, had mismanaged receivables and cash, and Ellison had not been a
good supervisor. At one point, Walker sought a cash infusion from the outside,
but Ellison held back as he
didn’t want to dilute his personal equity stake. As a result, Oracle negotiated
a $250 million revolving line of credit from a bank syndicate. A few weeks
later the company reported its first-ever quarterly loss of nearly $36 million
with expenses outpacing revenues by 20%. The corporate bank account had only $3
million at its low point. The stock tumbled once again, hitting a low of $4.88.
Ellison had to make
major changes, which started with changing his management team. Oracle also: moved to
reduce its annual growth rate goals from 50% to 25%; laid off 10% of its
domestic workforce of 4,000; consolidated Oracle US’s financial and
administrative operations; and folded various international units into a single
division. A lot of top talent left at that point, including people like Tom
Siebel. It was a low point for Ellison, as he had borrowed on his Oracle stock and so
received margin calls. Meanwhile, his third wife Barbara left him, and Miner, his co-founder, wanted to sell out. The company’s board wanted to kick Ellison out, but Don Lucas stood by his side. Ellison had to start
paying attention to accounting and legal, not to mention the sales teams.
Ellison started
fighting back and said: “I’ve always
been more motivated by fear of failure than greed. And I hate losing.”
For 1991 Oracle topped the $1
billion sales mark for the first time in history and at the same time posted
its first annual loss of $12.4 million. In October the company secured a new
$100 million revolving line of credit from another bank syndicate. Oracle
negotiated an agreement for $80 million in financing from Nippon Steel
Corporation, which also agreed to sell Oracle products in Japan. In return,
Nippon was given rights to purchase as much as 25% of Oracle’s marketing
subsidiary in Japan, duly named Oracle Japan.
The year 1992 brought much change. Ellison brought in Ray
Lane, a senior partner from the consulting firm Booz Allen
Hamilton, to help turn around the sales team, while his partner Robert Shaw
came to build up Oracle‘s consulting arm. They would be two great hires and
they helped turn Oracle around. They built their own executive teams and forced
Ellison to listen and
respond to criticism and unpleasant facts. In 1992, Oracle7 also came out after
four years of research and development and two more years of customer testing.
It supported a larger number of users than previous versions, handled more
transactions, allowed data to be shared between multiple computers across a
network, and improved application development features. It won industry praise
and emboldened Ellison to talk up
database technology for the Internet. Bob Miner also left in
1992 as he had trouble managing a large team of engineers and soon found out he
had cancer.
By the end of its 1992 fiscal year, Oracle‘s balance sheet had improved as sales inched modestly
upward and earnings rebounded, with the company reaching $1.18 billion in sales
and $61.5 million in profits. Oracle entered 1993 with no bank debt, solid
long-term financing in place, and in an improved financial position controlled
by a revamped management team.
Ellison and Lane came
up with a three-part strategy. First, they wanted to build Oracle7’s database
market share at whatever cost. Second, they had to make sure their database was
the best on the market by beating Sybase and Informix decisively. Third,
Ellison wanted to
branch out of databases into other applications. He was interested in
video-on-demand (which would be a dead end).
By 1993, Lane was doing so well in US sales that he moved to global sales and was on track to become the company’s President. Ellison also made the
mistake of firing the entrepreneurial Geoff Squire, who would go on to build
Veritas into a strong software company.
By mid-1994 Oracle‘s sales had reached $2 billion, and its consulting services accounted for 20% of sales. Consulting became an important part of Oracle’s model. While its competitor, SAP, had a close relationship with Andersen Consulting, the world’s largest IT systems integrator, Oracle was pressing down hard.
The standard in the enterprise software business was
for a large company to buy dozens of “best of breed” applications from a range
of vendors. Then the company would hire an IT consulting company, like
Accenture, to connect all these bits of software, with their multiple databases
and systems. Data would become fragmented, duplicative, and conflicting across
the databases behind the multiple programs. The analogy would be buying dozens
of car parts from different places and then putting together your own car
(without the different parts-makers having a common standard tying them
together).
Because so many of Oracle‘s applications were bad, its consultants pushed a
“best of breed” strategy and then worked to integrate outside applications.
Meanwhile, Ron Wohl would take over application development at Oracle and try
to improve the internal products. The battles between Wohl and the Lane/Shaw duo
would be rough, as the latter didn’t like selling the buggy software coming out
late from Wohl’s team. Meanwhile Oracle also started buying and partnering with
other, smaller software companies.
Fighting Microsoft over Internet
Strategy
By 1995 Larry Ellison had “found
Jesus” and had a specific vision and strategy for the Internet. During a
keynote presentation at a conference in Paris, Ellison introduced his
vision of the network computer, a small, inexpensive device that would run
applications via the Internet. The keynote speech took the technology world by
storm and would pit Ellison against Bill
Gates of Microsoft, who still believed in the PC/server model of
computing.
With Oracle‘s revenues topping $4 billion, in May 1996 Ellison took on the
“Wintel” (Microsoft Windows
software plus Intel‘s processing hardware) monolith by unveiling the
“Network Computer” (NC). It was a kind of stripped-down PC with no hard drive
and therefore no applications. Joining with such partners as Sun Microsystems
and Netscape, Ellison offered to free
corporations from the costly upgrades Intel and Microsoft forced on them
with every new release of Windows and the x86 family of processors. Using
Ellison‘s $500 NC, data and applications could be stored and
accessed as needed via the World Wide Web or remote server computers, equipped,
naturally, with Oracle’s databases. Since corporations would no longer have to
buy storage and applications for each computer, they could save millions with
no loss in functionality, and Oracle would have a vast new market for its
database products. By late 1996 this strategy had evolved into the “Network
Computing Architecture,” a complicated new three-tier world for corporate computing
consisting of a client computer (the computer accessed by the user), an applications server (running software such as word processors), and a database server.
While Ellison battled
Microsoft, he was still practical. In 1996 Oracle ported all of
its development tools, object technology, and modeling and analysis tools to
NT. Recognizing that Microsoft‘s Windows NT operating system was becoming
increasingly popular with small businesses, Oracle delivered a multi-node
scalable database for Windows NT clusters.
In 1997 Ellison unveiled
Oracle8, based on his vision of the Internet and network computing. Meanwhile,
the “best of breed” consulting strategy was falling apart, as it was difficult
and expensive to link many different application programs. That same year, Ray
Lane wanted to leave
for the Novell CEO post, but Ellison bribed him to
stay with $2.5 million in options. Many Oracle applications
weren’t good enough, so Ellison had to fire Ron
Wohl and take over the applications development group himself. He would have to
start making applications for things like order management, tables, and
accounting; all were things Ellison had found
boring before, but he now had to master them. Ellison also started to
feel that Lane wanted him to fail.
Selling its Own E-Business Application Suite
Oracle prepared to
release version 11i of its E-Business Suite in 2000. It would provide the most
substantial integration of CRM and ERP applications to date. It was intended as an entire ecosystem of enterprise computing, and a challenge to the IT consulting industry, as Oracle claimed customers could get all they wanted from one place. There was no
need to find a “best of breed” supplier and then pay expensive consultants for
years to patch together systems. Ellison was changing
the company’s strategy to have a single, global database support a range of
business applications, a suite spanning marketing, sales, supply chain, manufacturing, customer service, accounting, and so on.
It was again an attack on Microsoft‘s client/server model of computing. The difficulty with that model was that any software or hardware update had to be installed across hundreds of computers or more. That required costly IT
labor hours.
Ellison also started in
1999 to get more involved in sales force compensation, an important issue. He
took this responsibility away from Lane and created a better comp plan that was
transparent, had stretch targets, and allowed the best salesmen to make more
money and the worst to clear out. He also corrected dysfunctional incentives,
such as this one: if a salesman sold a
million dollars of Oracle product
directly, he got $100,000; if he sold Oracle product through a partner and made
the company $600,000, he got $120,000. So the sales force pushed the less
profitable deals for large bonuses.
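The dysfunction shows up in a quick calculation; the 10% and 20% rates below are inferred from the figures in the text, not Oracle’s actual plan:

```python
def commission(revenue: float, rate: float) -> float:
    """Commission owed on a deal bringing `revenue` to the company."""
    return revenue * rate

direct_pay = commission(1_000_000, 0.10)   # $1M of product sold directly
partner_pay = commission(600_000, 0.20)    # partner deal netting $600K

print(direct_pay, partner_pay)  # 100000.0 120000.0
# The deal worth $400K less to the company pays the salesman $20K more.
assert partner_pay > direct_pay
```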
Another big event in 1999 was the arrival of Safra
Catz, a former DLJ investment banker, who became Ellison’s chief of staff. As a former lawyer and banker, she
had a forensic approach to digging out facts and then analyzing them. Ellison appreciated her
methodical approach, saying: “In an argument when nobody has any facts… the
person with the strongest personality wins. But when one person has the facts
and the other doesn’t, the one with the facts always wins. When both people
have facts, there’s no argument.” After
Catz arrived, Ellison started
stripping Lane of more responsibility.
Oracle also launched
its rebuilt application server, Oracle9i Application Server. It included Web
caching technology that dramatically increased Web site performance and
scalability and cached dynamically generated as well as static pages.
Oracle shipped Oracle
E-Business Suite Release 11i, the first Internet-enabled suite of business
applications built on a single data model for seamless, real-time business
intelligence. It was a big deal because Ellison was turning
against the client/server model of computing for an Internet “cloud” model. The
hardest part was convincing Oracle’s own engineers that this was the right
direction and to get them to support the new product strategy. Ray Lane fought Ellison‘s technology decision and was aghast at making the
Internet the core of the company’s platform.
Ellison had a brilliant
strategy for selling 11i. He first implemented the entire suite internally at Oracle, and showed that it saved Oracle $1 billion annually.
Oracle then spent $300 million marketing this message: “By using our own E-Business Suite, Oracle
saved $1 billion in one year.” The
message was mostly correct; the new suite required that companies adapt their
business processes to it, but it was cheaper and more powerful, as an
independent Economist Intelligence Unit study showed.
Oracle finished fiscal
2000 with revenues of $10.2 billion and earnings at an all-time high of $6.3
billion due to an extra $4 billion from selling shares in Oracle Japan. By the
following year, Oracle prospered like its former self of the 1980s with soaring
sales, new product releases, and a myriad of new ventures both in the United
States and abroad. The company finished the year with sales close to $11
billion and $2.6 billion in earnings.
In June 2000, Ellison fired Ray Lane, the company’s President and Ellison‘s heir apparent, saying: “It’s like a marriage that went bad. I don’t
know [what went wrong].” Lane had
cleaned up the US sales force and helped the consulting business grow, yet
Ellison didn’t see him
as his replacement anymore. Lane never had a real social relationship with
Ellison, who stated it this way: “Ray’s a duck hunter. I raise mallards every
spring. We couldn’t be more different in personality and pastimes.”
In 2001 Oracle released
Oracle9i Database, with technology supporting software as a service. Oracle
also redesigned its business applications to run on wireless and mobile
devices. Oracle9i Database added Oracle Real Application Clusters, giving
customers the option to run their IT on connected, low-cost servers—expanding
performance, scalability, and availability of the database.
In 2003 Oracle Database 10g was introduced. An early database for grid computing, it allowed groups of
low-cost servers to be connected by Oracle software and run applications faster
than the fastest mainframe, in addition to offering self-management
capabilities.
Meanwhile, Ellison started re-tooling the Oracle sales culture to take creativity out of the process and make it more “engineered.” Ellison wanted a new
process that involved first identifying a customer’s decision maker and
documenting it in their sales system. Then, the salesman could send a set of key customer references and case studies showing how other customers got better performance on a lower-cost system. Third, the salesman
could send a proposal quantifying cost savings in hardware, software, and labor
by implementing a specific Oracle product at that customer’s company. Finally,
the salesman could send a contract with standardized terms and a price quote.
By 2004 Oracle began to offer
easy-to-implement, low-risk, affordable solutions for small and medium-sized
businesses with Oracle E-Business Suite Special Edition and Oracle Database
Standard Edition One. In 2005 Oracle OpenWorld was the biggest event in
Oracle’s history, opening its doors to more than 28,000 attendees, and offering
more than 800 sessions and activities.
Oracle was the king of American IT technology conferences; only Steve Jobs could do
better.
Growth through Targeted Acquisitions
From 2003 onward, Oracle’s growth strategy shifted towards acquiring other
companies making business software. The motivations behind Oracle’s largest
acquisitions were to increase market share in large business software markets,
to expand profitability by consolidating high-margin, customer support revenue
while cutting labor costs, and to offer a complete technology “stack” of
software applications and hardware.
All of Oracle’s large deals over the next six years met these
requirements. PeopleSoft, Siebel, and Hyperion all strengthened Oracle’s market
share position in the applications market while contributing captive customer
bases that paid highly profitable support fees. BEA Systems was important
for its stake in the middleware market, which helped Oracle integrate so many
applications it sold. Finally, SUN Microsystems brought recurring support
revenue. It was also interesting for two other reasons: first, it demonstrated Oracle’s willingness
to move into servers and storage (including hardware); second, Oracle took
control of Java, a key programming language for Web and Internet development.
In 2010, Ellison succinctly
stated: “our strategy is in creating and acquiring intellectual property.”
In mid-2003 Oracle initiated a
hostile takeover of PeopleSoft Inc. for $5.1
billion. The Pleasanton, California-based PeopleSoft was in the process of acquiring J.D. Edwards &
Company and was not amused by Oracle’s takeover bid, no matter how attractive
the offer. For its part, Oracle raised its offer several times in the
succeeding months, to as high as $9.4 billion, only to be met by a storm of
controversy. Few people, save Ellison, were in favor of the takeover. Shareholders of both
firms were unhappy. The US Department of Justice got involved over antitrust
issues. By the end of 2003, Ellison was determined
to win the battle, whatever the cost. Oracle’s year-end revenues fell for the
second year in a row to $9.5 billion.
Oracle announced the
acquisition of PeopleSoft at the end of
2004 and completed the transaction in January 2005, adding PeopleSoft Enterprise, JD
Edwards EnterpriseOne, and JD Edwards World applications to its product lines.
This was followed by the acquisitions of Siebel, Retek, Oblix, and other
strategic companies.
In 2006 Oracle released Oracle
Database 10g Express Edition, its first free database edition for developers
and learning DBAs. Oracle acquired several companies including Sleepycat
Software, the makers of the world’s most popular open-source database, Berkeley DB, and
released Oracle Secure Enterprise Search, a new standalone product that enabled
secure, high-quality, easy-to-use search across all enterprise information
assets.
In July 2007 Oracle bought Hyperion
Solutions Corporation, a global provider of performance-management software
solutions, through a cash tender offer for $52.00 per share, or approximately
$3.3 billion. Earlier that year, Oracle had filed suit in the California
courts against a major competitor, SAP AG, for malpractice and unfair
competition. In October 2007 Oracle announced a bid to buy BEA Systems for a price of
$17 per share, an offer rejected by the BEA board, which felt that it
undervalued their company. In January 2008 Oracle bought BEA Systems for
$19.375 per share in cash for a total of “$7.2 billion net of cash.”
In September 2008 Oracle started
marketing servers and storage in a co-developed and co-branded data warehouse
appliance named the HP Oracle Database
Machine.
In April 2009 Oracle announced its
intention to acquire SUN Microsystems for $7.4 billion ($9.50 per share), after
SUN’s earlier acquisition talks with IBM had collapsed; the deal closed in January 2010. SUN Chairman
Scott McNealy had to leave the company he co-founded 28 years earlier. He wrote
in a bittersweet memo, “My hat is off to one of the greatest capitalists I have
ever met, Larry Ellison.” McNealy preferred that SUN would be the great and
surviving consolidator, but he was happy with the sale and his payout.
Sun’s technology gave Oracle a place in the
server, storage, and processor domains. Oracle became a direct competitor to
more companies, even to hardware makers to whom Oracle had been selling its database and
other software for use on their servers. Now IBM, Hewlett-Packard, Cisco Systems, and
EMC would be direct competitors. SUN made most of its revenue from selling
computers but Oracle executives said they didn’t regard SUN as a hardware
company. As Oracle President Safra Catz stated,
hardware would mean factory ownership and large capital investments. However,
SUN outsourced nearly all the manufacturing, assembly, and servicing of its
hardware. It was more of a hardware design company.
Oracle’s sales pitch was one of integrated products:
hardware and software built to work together so that customers didn’t have to
do the integration work themselves or pay an expensive third-party consulting
firm, like Accenture, to do it. Integration would also reduce software development costs
and bugs, and improve security. Finally, Sun’s computer designers could tailor
hardware to the combined company’s software, promising further gains in
efficiency (something that Oracle was very good at advertising).
Beyond hardware, Ellison said that Sun’s
Java programming language and its Solaris operating system were the main
attractions, calling the highly popular programming language Java “the single
most important software asset we have ever acquired.” Oracle could offer a
more complete set of corporate software, ranging from Sun’s hardware, operating
system, and programming tools to Oracle’s existing database and business
applications. The end goal would be to help companies automate operations like
finance and customer relations management.
Oracle was always a hard-charging company, and it got
into trouble again in late 2010. In July 2010 the US Department of Justice sued Oracle for fraud,
joining a whistleblower lawsuit that followed an employee tip-off and investigation
starting in 2007. The government accused Oracle of defrauding the US General
Services Administration (GSA), which negotiated contracts for the government,
on a software contract running from 1998 to 2006, involving more than $1
billion in sales. According to the filings, Oracle agreed to give federal
buyers discounts of up to 40%, which Oracle said was steeper than the discounts
it gave similarly sized corporate customers. In reality, Oracle’s sales force
was allegedly authorized to give similar-sized customers discounts ranging from
40% to 70%, and 90% of other corporate deals carried discounts larger than
what Oracle gave the government. The government also claimed Oracle failed
to inform it that other customers were getting better deals, and that Oracle
went out of its way to structure deals so that it would not have to
report them to the government.
Prequels: Early Personal Computers
Before Steve
Jobs and Apple Computer, there
was Douglas Engelbart, Xerox PARC, and Jef Raskin. Doug Engelbart was a Berkeley PhD and
professor who took a position at Stanford Research
Institute (SRI) to study next generation computing. He was influenced by
Vannevar Bush’s article “As We May Think” on using next generation
computing to enhance human ability. Engelbart got ARPA
funding to start an Augmentation Research Center at SRI. Engelbart was credited
with inventing the computer mouse and getting a patent in 1967 with Bill
English. His lab at SRI was one of the first two nodes
connected to the ARPAnet (the predecessor of the Internet). Engelbart was well-known
in computing circles for his “Mother of All Demos” at the Fall Joint Computer Conference in San
Francisco on December 6, 1968. During that conference, he publicly showed for
the first time ever the computer mouse, as well as interactive text, video
conferencing, teleconferencing, email, hypertext, and a collaborative real-time
editor. Engelbart was a master of
presentation but a hard person to work for. He was a spiritual precursor to
Steve Jobs, though Jobs commercialized
his innovations and Engelbart never shipped a commercial product.
In 1970, Xerox opened its Palo
Alto Research Center
(PARC), and employed a star team of computer scientists,
including key members who left Engelbart’s lab. Raskin began taking trips to PARC as
a visiting scholar from the Stanford Artificial
Intelligence Laboratory. In 1973, PARC completed the Alto, the first true PC
and the first computer with a graphical user interface (GUI). Both the machine and its interface were major influences on Steve Jobs at Apple and Bill Gates at Microsoft, who openly stole PARC’s ideas. PARC also had the
first laser printer, which was connected to a group of Altos using the first
Ethernet network.
In 1967, Jef Raskin (later
co-creator of the Macintosh) wrote his PhD thesis on the Graphical User Interface
at Penn State University. In his thesis he first coined the term “QuickDraw,”
which would eventually become the name of the Mac’s graphics routine 17 years
later. Raskin was later responsible for connecting Jobs to Xerox PARC and the
innovations happening there.
The relationship between Steve Jobs and Steve
Wozniak (“Woz”) began
in 1968, when Bill Fernandez introduced his high school buddy Jobs to his neighbor
Wozniak. Wozniak was a dropout
and garage tinkerer who had an uncanny ability to focus and could spend days
designing circuit boards and tinkering. Jobs was also a
dreamer and a dropout, who had once toured India as a teenager, only to see
extreme poverty and realize spiritual enlightenment was overrated. As Jobs said of India
and its ever-present poverty: “It was
one of the first times that I started to realize that maybe Thomas Edison did a
lot more to improve the world than Karl Marx and Neem Karoli Baba put
together.” At one point, Jobs and Wozniak worked as Alice
in Wonderland characters at a shopping mall in San Jose.
In 1974, Jobs became one of
the first 50 employees at Atari, a high-flying Valley startup founded and run by the
entrepreneur Nolan K. Bushnell. Jobs later asked
Wozniak for help in
creating the sequel to the smash hit “Pong,” called “Breakout.” In 1975 Wozniak began attending
meetings of the Homebrew Computer Club. Wozniak was intrigued
by the Altair 8800, but could not afford one. He decided to build his own
microcomputer and began work on what would become the Apple I. Meanwhile, Jobs was attending
meditation retreats and studying Zen Buddhism with Kobun Chino, a major
influence on Jobs’s life, encouraging the spontaneous, the intuitive,
and the simple.
Apple the Startup
Apple the startup
happened quickly and in the disorganized way typical of startups. In March 1976,
Wozniak finished work
on a microcomputer kit. Wozniak first asked his
employer, Hewlett Packard, if they were interested in an $800 machine that ran
BASIC; management was not interested. The next month, Wozniak teamed up with
Steve Jobs and Ron Wayne
to form Apple Computer Company. Wozniak and Jobs funded their
company with $1,000; Wozniak sold his prized
HP 65 calculator
for $500, and Jobs sold his VW bus
for the same amount. In May, they introduced the Apple I at the Home Brew
Computer Club meeting for $666.66. Most members showed little interest, but
Paul Terrell, President of the Byte Shop chain, ordered 50 units at
$500 a unit. Since they didn’t have cash, Jobs badgered a
local supply house, Kierulff Electronics, to give them 30-day credit terms for
$20,000 of goods.
In June, Apple finished and
delivered part of the Byte Shop order one day before deadline, with 12 units
for $6,000. The two kids made a profit of $3,000 (Ron Wayne had left the
company). Later in the fall, Jobs and Wozniak showed an Apple
II prototype to Commodore representatives and asked for $100,000, some
Commodore stock, and salaries of $36,000. Commodore turned the ragged-looking
pair down. The Apple II was a major innovation not for its color screen, but
for its expansion slots (which made upgrading easy) and its operating system
(which was free and preloaded, making the machine plug-and-play). The
machine was also quiet with no fan (Jobs found that a
fan distracted him in his meditation practice). Finally, the pair convinced Rod
Holt, an Atari associate, to
design a neat switching power supply that was lighter, cooler, and smaller than
any other on the market. They went to the first “Personal Computer Festival” in
Atlantic City on Labor Day that year to show their product. The feedback
they received was that the computer had to be a completed, real product,
not just a hobbyist kit.
The two founders needed more money for the company. In
August, Jobs was pestering
Frank Burge of the major national ad agency Regis McKenna to work with them.
Burge met the team working in their garage but was unconvinced. Jobs then pestered Burge’s
boss, agency founder Regis McKenna, with three or four calls a day until McKenna’s secretary gave in;
McKenna took the call and granted them an interview. When they met and
McKenna wouldn’t take the account, Jobs refused to leave his office. McKenna finally took the account and decided
to advertise for them in Playboy magazine, a publication for young men who were
the target customers. As Apple didn’t have any
money, McKenna recommended that Steve contact Don Valentine, a venture capitalist. Jobs wore him down
with calls. Valentine in turn pointed
Jobs to Mike
Markkula, a marketing expert.
Markkula was a retired techie and marketing executive
who was rich because of Intel stock options.
Like Wozniak and Jobs, Markkula was a loner, but he was also a professional
and looking for the next big thing. He took a look at Apple and soon decided
the company could join the Fortune 500 in less than 5 years (he was right).
On January 3, 1977 Apple Computer, Inc.
was officially incorporated. Mike Markkula invested
$91,000 in Apple, with the intent to invest $250,000. The company also secured
loans of $250,000 on Markkula’s credit. Jobs, Markkula, and Wozniak took about 30%
each and Holt got 10% for his work. Markkula would serve as Jobs’ management
mentor at Apple, teaching him how to run a business, and then eventually firing
him. Markkula also recruited Mike Scott as President to manage Jobs. An early Apple marketing executive, Floyd Kvamme,
recalled Markkula’s attention to user experience. On Kvamme’s first day at the
company, Markkula told him to go out and buy an Apple Computer, then take it
home and set it up, to better understand the customer’s needs. Meanwhile, the
friendship between Jobs and Wozniak started to
erode, mostly because of Jobs’s “holier-than-thou” attitude which came from his
deep, intuitive connection with the end user (he was both right and
insufferable).
In April, the Apple II was publicly
introduced for $1295. Within months Apple sold 300 computers. That year Jobs’ girlfriend Chrisann gave birth to
his daughter, and he mostly gave up drugs (but stayed vegetarian). Jobs started hiring
“A-players” to join the company and in the summer of 1979 Apple sold $7.3
million worth of private stock to 16 buyers, including Xerox and some
venture capital firms. Jobs was a
millionaire on paper at 24. He bought a house in Los Gatos with no furniture
other than cushions and a bedroom mattress, plus a Mercedes coupe. He also
donated money to a Nepalese charity.
In January 1978, 34-year-old Jef Raskin joined Apple Computer as
employee #31. He eventually became the Manager of Advanced Systems, where he
created what became known as the Macintosh Project for a
$500 portable computer. He worked on it from 1980 until his departure in 1982.
Raskin focused on designing computers from the user interface out. Most other
manufacturers tended to provide the latest and most powerful hardware, and let
the users and third-party software vendors figure out how to make it usable.
While Raskin worked on the Macintosh and its
graphics-based system, Jobs worked on an
alternate project, the Lisa Project, a
character-generator-based machine.
A New Hope: The Darling of Silicon Valley
In December 1979, Jobs took his first
visit to PARC in exchange for
allowing Xerox to invest $1
million in Apple. The same month, Jobs returned to
PARC with several vice presidents and management heads to see a demo of the
wonderful PARC Alto personal
computer and its features like windows, menus, and so on in a graphical user
interface (More on this in the PARC chapter). By March 1980, the Lisa project was
revamped to include all the features of the Alto, with several more. Later that
summer, Jobs hired 15 Xerox
employees to work on the Lisa Project.
The year ended
with a bang. On December 12, 1980, Apple went public.
Apple’s share rose 32% that day, making 40 employees instant millionaires (the
most for any IPO in history up to that point). Jobs, the largest shareholder, made $218 million
alone. Markkula made $203 million that day, a 220,700% return on investment.
Jobs supposedly
said: “When I was 23, I had a net worth
of a million dollars. At 24, it was over $10 million. At 25, it was
over $100 million.” However, neither
Jef Raskin nor Daniel Kottke (one of the original Apple employees) was
allowed to buy stock, and so neither made money during this time. In 1980 Wozniak started
a plan to distribute his stock to other key employees and friends, but Jobs never did.
In January 1981, Jobs forced himself
into the Macintosh Project, after
earlier dismissing and often trying to cancel it. He saw greater potential
there and pushed Raskin aside, hiring his old Apple II partners
like Wozniak and Jerry
Manock. The team moved into a separate space and Jobs set a goal for
a machine to get to market in a year, a ridiculous timetable. Jobs wanted the core
computer to be the size of a telephone book, and rejecting the conventional
wisdom, did not want it to be expandable. Eventually Jobs’s plan was for the Macintosh to come out at the
same time as the Lisa at a price of
$1,500, including software. He made up the estimated sales number of 500,000
units in year 1.
Jobs was a
micromanager who mixed catch phrases, fast comebacks, and some original insight.
Jobs often took
credit for others’ ideas, and his group joked that he had a “reality distortion
field” around him. Jobs thought about
design and usability problems non-stop and he often denigrated the ideas of
those who worked for him (and then later came back claiming they were his own
while proposing them anew). Jobs’ team tended to be bright men, yet homogenous, and
his interviewing questions were along the lines of “How many times have you
taken acid?” and “When did you lose your virginity?” Jobs generally
didn’t like the wives or girlfriends of his all-male team and was difficult to
interact with in the real world. He
often sent dishes back in restaurants, only to not have cash at the meal’s end
and so someone else had to pay. Jobs was basically
an egotistical, unfeeling wunderkind at that point in his life. He was also
brilliant.
At the same time Mike Markkula became
President of Apple and promoted
the Macintosh from
experimental project to real product. Raskin left soon after and would go on
to release the Canon Cat, a beautiful PC product that won several design awards
but failed to become popular due to lack of production by Canon.
Competitors were not far behind. In June, Xerox introduced an
improved variation of the Alto, the $16,595 Xerox Star. It included icon dragging and
double-clicking. In August, IBM introduced the
IBM PC for $1565. With 16K RAM and a 5.25” floppy drive, running the first
version of MS-DOS, it was a poor computer (barely reaching the efficiency of
the Apple II released 4
years earlier) but it sold well. Yet, 1981 was a big year for Apple in terms of
name recognition; at the start of the year only 10% of Americans knew of the company, and by the
end about 80% did.
Apple continued
developing the Mac and Lisa in 1982. In
January 1983, the Lisa was introduced
for $9998, but it was a big flop compared to the much cheaper IBM PC. Apple’s
stock sank. The Apple IIe was introduced for $1395. It became the most
successful and most popular Apple computer and would be produced for 10 more
years.
1983 was a
big year, in which Apple hit three major
milestones. In the spring, its ad company, Chiat/Day, created a “1984” ad for
use in Super Bowl XVIII in January 1984. The 30-second version of “1984”
appeared in theater previews across the country. In the ad, an athletic
young woman breaks into a drab hall of gray drones watching a Big Brother figure lecture from a
large screen. She dashes to the screen and spins around with a huge
sledgehammer, and then lets it go to smash the screen, which explodes in a
blinding flash of light. A voiceover says:
“On January 24th Apple Computer will introduce the Macintosh. And you’ll understand why 1984 won’t be like
1984.” The ad was so admired that it was
often replayed for free. It also temporarily boosted the company’s sales,
employee morale, and stock price.
Second, in April 1983 Jobs convinced John
Sculley, then President of PepsiCo, to become President and
CEO of Apple. The two lines he used to recruit Sculley were: “Do you want to spend the rest of your life
selling sugared water or do you want a chance to change the world?” and “It’s better to be a pirate than to join the
Navy.”
Third, in May 1983 Apple entered the
Fortune 500 at #411 after only five years of existence. It had become the
fastest growing startup company in US corporate history.
In January
1984 a $2495 Macintosh and $3495 Lisa 2 were
introduced; the Macintosh did well
initially, after the 1984 ad, but failed to meet Jobs’s target of 500,000 units. Internally the Macintosh team was angry
about how underpaid they were and demanded raises. Meanwhile the Apple II team, whose
products sold the most, was angry that they were being displaced and that the
Macintosh team was
getting so many perks (like massages, free food, and outings). Trouble was
brewing. In April, the Apple IIc was introduced at the Apple Forever Conference
in San Diego. Later that year, the Apple IIc won an Industrial Design
Excellence Award. However sales for all the products were below targets, and
inventories were rising.
The Empire Strikes Back: Jobs Forced Out and Apple Loses its Direction
Jobs’s time at Apple was perhaps
destined to be short. The board was frustrated with the company’s performance,
and board member Arthur Rock told Sculley to
take charge and correct the problems of rising inventories and unhappy
employees. Meanwhile, board Chairman Jobs was plotting to
get rid of Sculley. On May 24, 1985, Jobs tried to force
Sculley out of Apple by mounting a coup against him. On May 28th, the board,
including Markkula, sided with Sculley and stripped Jobs of all his
duties. “I felt betrayed by Mike,”
Jobs later said,
“but I still have a very warm spot in my heart for him.” Markkula expressed similar ambivalence: “I
thought the way Steve left was at best ungentlemanly,” he said, referring to
Jobs’ angry departure. Jobs’ title became “global thinker”
after the board vote and his remote office became “Siberia” (in a remote Apple
building).
To process the events of his dismissal, Jobs went home and
listened to Bob Dylan in the dark. He then went on a trip to
Paris and then the Tuscan hills outside Florence. He told a journalist: “You can’t always get what you want;
sometimes you get what you need.” With
Jobs out, Bill Gates in July sent
Sculley a proposal suggesting he license the Mac OS to companies that might create
Mac clones. After Jobs returned,
Sculley and the management team ignored Jobs, who would go home depressed. Jobs stopped coming
in to work. The stock price dropped, and for the quarter ending in June Apple
announced its first loss ever of about $17 million, with sales dropping 11%.
Sculley told the press: “There is no
role for Steve Jobs in the
operations of this company either now or in the future.”
Jobs didn’t know
what to do. He thought about politics but then decided that creating new,
innovative products was what he loved doing. By September, Jobs announced his
intent to create a new computer company with other “lower-level” employees,
creating a new computer for the university market. He distributed his
resignation letter to Apple and to several
news media figures, after initially (falsely) telling the board he
wouldn’t compete with Apple and that he would let Apple invest in the company.
On September 23 Apple filed suit
against Jobs. Apple claimed Jobs knew sensitive
technology secrets that he might use in his new company, and that he was
poaching key Apple employees, which violated Jobs’s duty of loyalty as an ex-Chairman. Apple later
dropped the lawsuit when Jobs shamed the
company by saying: “It’s hard to think a
$2 billion company with 4,300 plus people couldn’t compete with six people in
blue jeans.” Jobs left in
September 1985, after selling nearly all his stock, about $90 million worth,
except one share, which he kept to get Apple’s annual reports. He still
professed a love for the company he founded.
Jobs reflected on
these experiences with maturity in a Stanford University
Commencement Speech in June 2005. He was fired by his hand-picked board and CEO
at age 30 after his garage startup grew from two people into a $2 billion
company with over 4,000 employees. Jobs felt “the focus of my entire adult life
was gone, and it was devastating.” He
met with David Packard of HP and Bob
Noyce of Intel, entrepreneurs of an earlier generation, and “tried to
apologize for screwing up so badly.”
A public failure, Jobs even thought about running away from
the Valley or becoming a professor. Yet he still loved what he did and felt
he’d been rejected but was still in love. He decided to start over and later
realized that “getting fired from Apple was the best
thing that could have ever happened to me. The heaviness of being successful
was replaced by the lightness of being a beginner again, less sure about
everything.” Jobs could enter another
creative period, founding NeXT Computer and building up the animation company
Pixar. Jobs also met his wife Laurene, started a family, and grew up a bit. He
said “It was awful-tasting medicine but I guess the patient needed it.
Sometimes life’s going to hit you in the head with a brick.”
After Jobs left, Apple started on a
path toward stagnation, losses, and a brush with bankruptcy. The competition
was close behind. Compaq introduced the first Intel 386 PC,
replacing IBM as the PC
technology leader. In January 1987, Apple renamed the Lisa 2/10 the
Macintosh XL, and
discontinued all other Lisa configurations.
Meanwhile Jobs was doing well
at NeXT. Ross Perot invested $20 million in NeXT for 16% of
the stock. However the NeXT Computer would be a year and a half late to market.
In October 1988, the NeXT Computer was released for $6500. It included a 25 MHz
processor, 8 MB RAM, 250 MB optical disk drive, math co-processor, digital
processor for real-time sound, fax modem, and a 17” monitor. Apple’s newest Mac was half as fast, with no peripherals,
for $1000 more. In September 1989, the NeXTstep OS was introduced, followed a
year later by the NeXTstation, which was released for $4995. However, by June
1991, Ross Perot resigned saying his investment was one of his biggest
mistakes.
Jobs also lost a big
opportunity at NeXT. In 1987 he met with John Akers, the CEO of IBM, who wanted a new operating system for the IBM PC.
Jobs fought over
contract terms and negotiated hard. IBM became frustrated and its lead champion
of the deal (Bill Lowe) left. So IBM paid for NeXT’s NeXTstep operating system
but didn’t use it. Instead IBM promoted Microsoft Windows and its
own program, OS/2. Jobs had missed his
chance to usurp Microsoft’s key OS product, as other PC manufacturers would
likely have followed IBM. NeXT, and not Microsoft, could have collected a license fee for operating system
software on every computer, a solid monopoly. NeXT had lost an important battle
to be the next platform of the PC.
Apple had some
superficial success, but it was rotting within. In March 1987, Apple had 6
different Mac Pluses, though its product pipeline was dry. The Macintosh personal
computer had moved Apple into the business office market. Corporations saw its
ease of use as a distinct advantage. It was far cheaper than the Lisa and had the
necessary software to link office computers. By 1988, over one million
Macintosh computers had
been sold, with 70% of the sales to corporations. Vendors created software
connecting Macintoshes to IBM-based systems. Apple grew rapidly: its 1986 sales of
$1.9 billion and income of $217 million grew in 1988 to sales of $4.1 billion
and income of $400 million.
1988 to 1994 were slog years for Apple as it stagnated
and Microsoft moved ahead. Back in
December 1987 Microsoft had released the second
version of Windows, version 2.03. Compared to Windows 1.01, which was almost
unusable, Microsoft made many
improvements, many of which were taken from the Mac. These included Mac-like
icons and overlapping instead of tiling windows. Even so, Windows was still not
up to par with the first Alto OS, written 15
years before.
Sculley and his management team made a string of bad
decisions over the next few years. First in 1988, the executives thought a
worldwide shortage of memory chips would get worse, so they bought millions
when prices were high. The shortage ended quickly and prices fell. Second, Sculley reorganized Apple again in August
1988 into four operating divisions: Apple US, Apple Europe, Apple Pacific, and
Apple Products. Many longtime Apple executives were frustrated with the changes
and left. Third, Apple fended off a lawsuit in December 1989, where Xerox Corp. claimed
that Apple was unlawfully using Xerox technology for its Macintosh software. Apple
won this lawsuit in 1990, but had started its own lawsuit against Microsoft and Hewlett-Packard, charging copyright infringement over its graphical
user interface (GUI). In the spring of 1992, Apple lost its case when a
court decided copyright protection cannot be based on “look and feel”
(appearance) alone. Instead, developers would have to come up with specific features
to protect. Finally, headcount became bloated. Apple had 5,500 employees in
1986 and over 14,600 by the early 1990s.
Apple had soared
through the 1980s due to large, expensive computers based on Jobs’ innovative earlier designs. The company had a committed,
yet relatively small following. At a time when the industry was seeing slow
unit sales, the numbers at Apple were rising because of smaller and cheaper
desktop computers. In 1990, desktop Macs accounted for 11% of the PCs sold
through American computer dealers. A year later, the figure was 19%. The notebook PowerBook series, released in
1991, found a 21% market share in less than six months. However, profit margins
were lower and the company was not coming up with innovative products anymore.
After another reorganization and massive set of layoffs in 1990, profits fell
by 35% the next year. Apple’s board of directors fired Sculley in 1993 after
Apple’s PC market share had shrunk from 20% to 8% under his watch.
The next two CEOs, Michael Spindler and Gil Amelio, didn’t last long. Spindler broke tradition by
licensing Apple technology to
outside firms. A group of Apple clones hit the market, diluted the brand, and
even hurt Apple’s profits. Spindler did introduce the Power Macintosh line in 1994,
but the company underestimated demand and produced too few (after
overestimating demand for an earlier release of its PowerBook laptops).
Spindler,
unfortunately, wanted to compete against Microsoft in the business
and office machine market, not realizing that only price and performance
mattered there. Apple’s edge in style and design counted for little, and
his strategy would fail. Later in 1994 Apple released the first PowerMacs using
the PowerPC 601, along with System 7.5, whose “new” features many users
already had as shareware.
The next two years were bad ones. In 1995, Power
Computing released the first Mac clones, including the very successful Power 100.
In 1996, Apple licensed the
Mac OS to Motorola, granting it the authority to sub-license for the first time.
Then Apple licensed the Mac OS to IBM. In early 1996 a deal to sell Apple to Sun
Microsystems failed, as Apple’s revenue was falling. As Spindler tried to cut
costs, Jobs reportedly
said: “The cure for Apple is not
cost-cutting. The cure for Apple is to innovate its way out of its current
predicament.” By 1995, Apple had $1
billion worth of unfilled orders. Customers and investors were angry. The Apple
board replaced Spindler with Gil Amelio in February
1996. Amelio was a former Rockwell executive and Apple fan who had turned
around National Semiconductor. He had no experience selling goods in the
consumer marketplace.
Amelio’s tenure was short but forceful. He cut Apple’s payroll by a third and slashed operating costs.
However, he also couldn’t oversee the creation of beautiful, desired products
and he took a fat paycheck. The company’s financial losses grew to $816 million
in 1996 and $1 billion in 1997. The stock, which had traded at more than $70
per share in 1991, fell to $14 per share. Apple’s market share in PCs was 16%
in the late 1980s, but it had fallen to less than 4%.
Meanwhile not all was well at NeXT either. In January 1992, Steve
Jobs announced
NeXTstep 3.0, a version of NeXTstep that could run on an Intel 486
simultaneously with MS-DOS. NeXT would eventually move its OS entirely to the
Intel x86 platform.
In February 1993, Jobs laid off 280 of
his 530 NeXT employees on “Black Tuesday,” sold his hardware line to Canon, and
tried to become a Microsoft-like company by concentrating only on the NeXTstep OS
for the Intel x86 platform.
Microsoft was doing well
but Apple wasn’t. In
spring 1992 Microsoft introduced
Windows 3.1, a big success. Microsoft did not make
another update (besides 3.11) for three years. Meanwhile, in 1993, Motorola
shipped the first 50 MHz and 66 MHz PowerPC 601 chips.
Amelio had a hard time at Apple. Bill Gates called often to
offer Windows NT as a replacement for the Apple operating system. Then Steve
Jobs started calling
and offering the NeXTstep operating system, as did former Apple executive
Jean-Louis Gassée, who pitched his new Be operating system. Amelio had a team
evaluate them all technically, and Apple chose NeXTstep. Amelio also improved
inventory management, converting piles of unsold products into cash, a lesson
Steve Jobs would later thank him for.
Amelio was ousted from the company in July 1997,
having spent less than a year and a half there. Yet before his departure, he made a
significant deal that brought Apple’s savior to Cupertino. In December 1996, Apple paid
$377 million for NeXT. It was a small, $50-million-in-sales company still
run by Steve Jobs. Concurrent with the acquisition, Amelio hired Jobs as his special
advisor, marking the return of Apple’s visionary 12 years after he had left. In
September 1997, two months after Amelio’s exit, Apple’s board of directors
named Jobs interim chief
executive officer. Apple’s recovery occurred during the ensuing months.
The Return of Steve Jobs: Apple’s Rise on the Back of the iMac
By August, former “advisor” Steve Jobs became “de
facto head” and in September he became “interim CEO” of Apple at a salary of
$1. Jobs was back home
at the place he truly loved as a more mature leader-designer-CEO. As of 2010,
Jobs was still
Apple’s CEO. Jobs immediately
discontinued the licensing agreement that spawned Apple clones. Jobs eliminated 15
of the company’s 19 products, withdrawing Apple’s involvement in printers,
scanners, personal digital assistants, and other peripherals. From 1997 on,
Apple would focus exclusively on desktop and portable Macintoshes for the
professional and consumer markets. Jobs closed plants,
laid off thousands of workers, and sold stock to rival Microsoft Corporation.
Jobs replaced the
board with his own picks and even worked on changing Apple’s culture. He
proclaimed no more pets at work and no business-class travel; he also banned
talking to the press without a PR official present.
In January 1997, Apple launched a new
operating system strategy with Mac OS 7.6.
Soon after, Mac OS 8 was finally released, selling 1.25 million
copies in less than two weeks. Jobs announced an
alliance with Microsoft at the Macworld
Expo in Boston. Among the agreements were: a cross-platform patent license
under which Microsoft could use design elements from Apple’s operating system;
a commitment that Apple would ship Internet Explorer as its default browser
while Microsoft kept developing Office for the Mac; and a Microsoft investment of
$150 million in Apple. As the saying went, keep your friends close, but your
enemies closer.
Jobs introduced new
products and dabbled in retail. At the January 1998 Macworld Expo, Jobs
announced a projected $47 million profit for the first quarter. Apple had
returned to profitability; it would be Apple’s first profitable year since 1995. That month,
Mac-clone maker Power Computing went out of business for good. In February, after a
little over five years on the market, Apple discontinued the Newton/eMate line.
The next month, Apple unveiled a
retail strategy. Jobs opened 149
Apple “stores within stores” in CompUSA locations across the country. This
helped many Mac users who hated the small, incomplete, and out-of-stock Apple
sections most retail computer stores provided.
In May, Apple announced the
iMac and new PowerBook G3 models. By August, Apple had 150,000 preorders for
the iMac. Apple’s stock went over $40/share, the highest stock market price in
three years. In August 1998, Apple finally introduced its new all-in-one
computer reminiscent of the Macintosh 128K: the iMac
(Amelio claimed most of the project had been completed under his watch). The
iMac design team was led by Jonathan Ive, who would later design the iPod and the iPhone. The iMac featured modern technology and a unique
design, with the monitor and computer in the same box. Unlike other computers,
it had no floppy disk drive, a risk that paid off. The iMac sold
close to 800,000 units in its first five months. Later that year Mac OS 8.5 was
released to an ecstatic audience. Surveys showed that 43% of all iMac buyers were
new to the Macintosh platform.
During the next few years, Apple purchased a few
software companies to create a portfolio of professional and consumer-oriented
digital production software. In 1998, Apple purchased Macromedia’s Final Cut
software for digital video editing. The following year, Apple released two
video editing products: iMovie for consumers and Final Cut Pro for
professionals. In 2002 Apple purchased Nothing Real for their advanced digital
compositing application Shake, as well as Emagic for their music productivity
application Logic, which led to the development of their consumer-level
GarageBand application. iPhoto’s release the same year completed the iLife
suite. Jobs also tried to
buy the hardware maker Palm, but its CEO Donna Dubinsky, a former Apple exec,
refused on the grounds that she never wanted to work with or for Jobs again.
As the Internet boom went on in the late 1990s,
thousands of would-be entrepreneurs flocked to Silicon Valley to start Internet
companies they could flip quickly for millions. Jobs said “the
rewarding thing isn’t merely to start a company or to take it public.” Instead, he felt it was like parenting, where
after the miracle of a birth, the more rewarding thing is to help your child
grow up. Many entrepreneurs wanted to start companies but not stick with them
because of the many moments filled with despair and agony (firing people,
cancelling products, etc.). But for Jobs, “that’s when you find out who you are
and what your values are.”
Meanwhile, in a big move Apple expanded its
retail strategy further. On May 19, 2001, Apple opened the first official Apple
retail stores in Virginia and California. Apple would go on to create memorable
stores. For example, the entrance of the Apple Store on Fifth Avenue in New
York City was a glass cube. It had a cylindrical elevator and a spiral
staircase that went into the subterranean store.
In March 2001, Jobs released Apple’s new operating system, Mac OS X, based on his work
at NeXT. Mac OS X combined the stability, reliability and
security of Unix with the ease of use afforded by an overhauled, beautiful user
interface. As Jobs said of
it: “We made the buttons on the screen
look so good you’ll want to lick them.”
To help users migrate from Mac OS 9, the new operating system allowed
the use of OS 9 applications through Mac OS X’s Classic environment.
That same year, Apple introduced the
iPod portable
digital audio player, its first game-changing device after the PC. The iPod was
a masterpiece of design: simple and striking in its white case. Jobs was a master of
framing; instead of saying it had a 5-gigabyte hard drive, he said it was large
enough to hold 1,000 songs. To downplay the iPod’s $399 price tag, Jobs said, “There
are sneakers that cost more than an iPod.”
The product was phenomenally successful; it sold over 100 million units
within six years.
In 2003, Apple created its
iTunes Store, offering online music downloads for 99 cents a song and
integration with the iPod. Within a year it had 70% of the download market and
had sold 85 million songs. iTunes quickly became the market leader in
online music, with over 5 billion downloads by June 19, 2008. The
difficulty in creating the store wasn’t technical, but rather in getting the
music companies to sign on and sell music on a medium they were hostile to.
Jobs managed to
convince them by sheer force of personality and by his celebrity.
Both the operating system and the iPod were launched
from secrecy at Apple events where
Steve Jobs, the master showman, spoke. Jobs would work with
the technical crew for weeks on end before the launch and he would be
intimately familiar with the product as its champion and visionary. Jobs would avoid
rehearsals or scripts, but would work closely with the show’s producer and
lighting and visual people to get the right effects. By doing so, Apple
launches drew hundreds of press people covering the magical act for free,
giving the company tens of millions of dollars’ worth of advertising.
At the Worldwide Developers Conference keynote address
on June 6, 2005, Steve Jobs announced that
Apple would begin
producing Intel-based Mac computers in 2006. On January 10, 2006, the
new MacBook Pro and iMac became the first Apple computers to use Intel’s Core Duo CPU. By August 7, 2006 Apple had
transitioned the entire Mac product line to Intel chips, a year
sooner than announced. The MacBook Pro (15.4” widescreen) was Apple’s first
laptop with an Intel microprocessor.
It was announced in January 2006 and was aimed at the professional market. The
Power Mac, iBook, and PowerBook brands were retired during the transition; the
Mac Pro, MacBook, and MacBook Pro became their respective successors.
Apple’s success during this period was evident in its stock
price. Between early 2003 and 2006, the price of Apple’s stock increased more
than tenfold, from around $6 per share (split-adjusted) to over $80. In January
2006, Apple’s market cap surpassed that of Dell. This was sweet, as nine years
before, Dell’s CEO Michael Dell had said that if he ran Apple he would “shut it
down and give the money back to the shareholders.” Although Apple’s market share in computers
grew, it remained far behind competitors using Microsoft Windows, with
only about 8% of desktops and laptops in the US.
Game Changers: Apple Moves beyond PCs
to Phones, Pads, and Music/Movies/TV
Delivering his keynote at the Macworld Expo on January
9, 2007, Jobs announced that
Apple Computer, Inc.
would from that point on be known as Apple Inc. because computers were no
longer the singular focus of the company. This change reflected the company’s
shift of emphasis to mobile electronic devices from personal computers. Jobs also announced
the iPhone and the Apple
TV. The following day, Apple shares hit $97.80, an all-time high at that point.
In May, Apple’s share price passed the $100 mark. iTunes was also evolving in
response to buyers’ dislike of digital rights management (DRM) software that made
sharing difficult. In February 2007, Jobs defied the
music industry and said he would sell music on the iTunes Store without DRM if
record labels would agree. Two months later Apple and EMI, a large music label,
jointly announced the removal of DRM technology from EMI’s catalog in the
iTunes Store. Other record labels followed later that year.
The Mac, iPod, iTunes, iPhone, and iPad became the five pillars of Apple’s business. In July 2008 Apple launched the App Store
within iTunes to sell third-party applications for the iPhone and iPod Touch.
Within a month, the store sold 60 million applications and brought in $1
million daily on average. Jobs speculated that
the App Store would become a billion-dollar business for Apple, with margins
above 80%. The sixth pillar was raised.
These products were successful because of marketing
and Jobs’ genius at design. Unlike most marketers, Jobs hated focus
groups, saying: “It’s really hard to
design products by focus groups. A lot of times, people don’t know what they
want until you show it to them.”
Instead, Jobs had a different
philosophy, more suited for a designer-king leading his people. For Jobs,
design was a funny word. Most people thought design meant how a product looked.
But for Jobs, if you dug deeper, design was really how a product worked, so to
design something well you had to take the time to study it and get it. You had
to thoroughly understand a product and see how it affected and improved
someone’s life.
As an example, Jobs talked about
how he disliked most consumer devices, except for a new washing machine and
dryer his family got. His family spent two weeks at the dinner table talking
about options and arguing about design. They then chose a European washer,
Miele, that was slow but used less water and treated clothes more gently. Jobs said of
it: “They did such a great job designing
these washers and dryers. I got more thrill out of them than I have out of any
piece of high-tech in years.”
On December 16, 2008, Apple announced that
after over 20 years of attending Macworld, 2009 would be the last year Apple
would be attending the Macworld Expo. Almost exactly one month later, on
January 14, 2009, an internal Apple memo from Jobs announced that
he would be taking a six-month leave of absence, until the end of June 2009, to
allow him to better focus on his health. Apple’s talented COO, Tim Cook, took
over until Jobs returned.
After years
of speculation and multiple rumored “leaks,” Apple announced a
large screen, tablet-like media device known as the iPad on January 27, 2010.
Jobs claimed the
idea for the iPad came before the iPhone: the notion of ditching the keyboard
for a “multi-touch display” dated to the early 2000s. When a prototype with the
device’s now-famous scrolling mechanism came to him, Jobs thought: “My
God, we can build a phone out of this.”
The tablet product was put on the shelf and the iPhone went into
development for several years before making its debut in 2007. Apple started
selling the iPad tablet computer in April 2010.
The iPad ran the same touch-based operating system
as the iPhone and so
also ran many of the same iPhone apps. This gave the iPad a large app catalog
at launch even though developers had very little time before the release. On
its US launch day, April 3, 2010, the iPad sold more than
300,000 units, reaching 500,000 by the end of the first
week. It hit one million iPads sold in 28 days,
less than half of the 74 days the
iPhone took to reach that milestone. Walt Mossberg of The Wall Street Journal, perhaps the premier gadget
analyst in the US, called the iPad a “pretty close” laptop killer. In May 2010,
Apple’s market cap exceeded that of competitor Microsoft for the first
time since 1989.
In June 2010, Apple released its
fourth generation iPhone. It introduced video calling, multitasking, and a new
stainless-steel frame, which doubled as the phone’s antenna. Yet the design
caused weak-signal problems at times.
Apple at this point
was competing head-on with Microsoft and Google. Its operating system on computers was preferred over
Microsoft Windows for
ease-of-use (not for the applications, which Microsoft had locked in).
The Apple iPod easily defeated Microsoft’s clunky Zune player, and its phone was much more
successful than any phone with a Windows Mobile operating system.
More importantly, Apple competed
against Google on two major
fronts. First, the Google Android mobile operating system was the best
competitor Apple had. Android was “open” and so gave application developers
more freedom and creative control to make things. Apple was “closed” and so
disliked by many app developers. Jobs had
characterized Apple’s system as “integrated,” meaning the user experience was
consistent and safe, like a gated community. In contrast, Android was
“fragmented,” meaning the users had more freedom, but also had to worry about
security on the Internet, viruses, and malware. Second and more broadly, both
were competing to be next generation content delivery systems. Apple had the
iTunes store and Google had the Android store and Google TV.
With the iPad, a consumer had the anti-Internet in her
hands, offering major media companies the ability to essentially re-create the
old, closed business model. Big Media could push content to users on their
terms rather than users going out and finding content, or using a search engine
to find content, via the Google model.
The battle in 2010 between Apple and Google over the mobile
operating system, and in general for all media devices, came down to an age old
computer battle for the application programming interface (API), the platform
on which applications (apps) are built. Every operating system had its APIs;
these defined what the system did, how it looked to the user, and how
programmers at other companies could build applications on top of it.
As Jerry Kaplan of Go Corp. memorably explained, when a firm
created an API, it was like trying to start a city on a tract of land. First
the firm tried to persuade other programmers to build their businesses on it.
This attracted customer/users, who wanted to live there because of all the
“shops” the programmers had built on the land. This drew still more programmers
to come build apps and rent space near the customers. Eventually the
process gathered momentum and the city grew faster than its competitors. Once the
city reached a high point, the owner of the API became a king: making rules,
collecting tolls, taxing programmers and users, and holding back
prime real estate (keeping confidential APIs for its own use). As Jobs said in 2004:
“I’ve always wanted to own and control the primary technology in
everything we do.” By 2010, Jobs was
closer than ever to making that dream a reality, though he would retire the
next year for health reasons.
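Kaplan’s “city” analogy describes a two-sided feedback loop: developers attract users, and users attract developers. The toy simulation below is a minimal sketch of that loop; every name and coefficient is hypothetical, invented only for illustration, not drawn from the case study:

```python
# Toy sketch of Kaplan's "city on a tract of land" analogy.
# Developers ("shops") attract users, and users attract more developers.
# All starting values and pull coefficients are hypothetical.

def simulate_platform(steps, devs=10.0, users=100.0,
                      dev_pull=0.05, user_pull=0.5):
    """Run `steps` rounds of mutual attraction; return (devs, users)."""
    for _ in range(steps):
        new_devs = dev_pull * users    # developers follow the customer base
        new_users = user_pull * devs   # users follow the catalog of "shops"
        devs += new_devs
        users += new_users
    return devs, users

if __name__ == "__main__":
    for steps in (5, 10, 20):
        d, u = simulate_platform(steps)
        print(f"after {steps:2d} rounds: {d:7.0f} developers, {u:8.0f} users")
```

Because each side’s growth feeds the other, even a small early lead compounds over time, which is why the owner of a mature platform can set the rules and “collect tolls” as Kaplan describes.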
(Copyright © 2013 Arun Rao)