A History of Silicon Valley


These are excerpts from Piero Scaruffi's book
"A History of Silicon Valley"


(Copyright © 2010 Piero Scaruffi)

15. The Survivors (1999-2002)

by Piero Scaruffi

Bursting

Between 1998 and 1999 venture-capital investments in Silicon Valley firms increased more than 90%, from $3.2 billion to $6.1 billion. In 1999 there were 457 IPOs in the USA. The vast majority of the companies that went public were high-tech start-ups, about 100 were directly related to the Internet, and an impressive number were based in Silicon Valley. In 2000 the number of public companies in Silicon Valley reached 417. In 2000 venture-capital investment in the USA peaked at $99.72 billion, or 1% of GDP, mostly going to software (17.4%), telecommunications (15.4%), networking (10.0%) and media (9.1%). By the end of 1999 the USA had 250 billionaires, and thousands of new millionaires had been created in just one year. Microsoft, worth $450 billion, was the most highly valued company in the world, even though its revenues were still many times smaller than General Motors'. Bill Gates was the world's richest man, with a fortune of $85 billion.

One of the worst deals ever in the history of Silicon Valley took place in January 1999, when @Home acquired a struggling Excite for $6.7 billion, at the time the largest Internet-related merger yet. (Later that year Excite refused to buy, for less than one million dollars, the technology of a new search engine developed by two Stanford students: Google.)

A symbolic event took place in January 2000 when America Online, the pioneer of dial-up Internet access, acquired Time Warner, the world's largest media company. A humble start-up of the "net economy" had just bought a much larger company of the old "brick and mortar" economy. At one point or another in the early months of 2000 Microsoft, Cisco and Intel all passed the $400 billion mark in market valuation, and the only company of the old economy that could compete with them was General Electric.

Then came the financial crash of March 2000. The dotcom bubble burst even faster than it had expanded. Within 30 months (between March 2000 and October 2002) the technology-heavy Nasdaq lost 78% of its value, erasing $4.2 trillion of wealth. The losses in Silicon Valley were astronomical. In 2001 there were only 76 IPOs.

The telecommunications bubble of the 1990s ended in a massive rout, with dozens of fiber-optic startups slaughtered and about two trillion dollars of value wiped out of the stock market. In reality, though, that "bubble" created an infrastructure of high-speed fiber-optic networks that a few years later would enable the boom of a new generation of dotcoms, the Googles and Facebooks of the world: data-hungry services that would not have been possible without those networks.

There were multiple causes for the crash, or rather for the inflated valuations of dotcom stocks that preceded it. One was certainly the gullible and inexperienced "day traders" who enthusiastically purchased worthless stocks, and another was the incompetent Wall Street analysts who wrote ad-hoc reports to justify the aberrant prices of those worthless stocks. A final boost may have come from the US central bank, the Fed, which pumped more money into the system in late 1999 so that people had cash and would not need to stockpile more for a Y2K Armageddon.

If this were not enough, the large IT companies based on the East Coast were probably hurt more by the end of the Y2K panic than by the dotcom crash. The first of January of 2000 came and went without any apocalypse. The Y2K paranoia was rapidly forgotten, as if it had never existed. The last day of December of 1999 would remain the best day ever to fly, because planes were nearly empty: that is how many people feared that airplanes would crash all over the world. Unfortunately, the Y2K paranoia had created an easily predictable "boom and bust" situation: billions of dollars had been spent on new hardware and software in the run-up to 2000, but all of this spending, by definition, came to an end one minute after midnight. It was one of the few cases in which a "bust" was widely advertised before it happened.

There is no question that the dotcom bubble had gone out of control, but the drop in IT investment after the Y2K scare exacerbated the problem.

The direct impact of the stock-market crash was on jobs. Half of all dotcoms shut down. The other half had to restructure themselves to live in a new age, an age in which growth was proportional to (not independent of) profits. They needed to make money, and, failing real revenues, the only solution was to cut costs. Silicon Valley truly learned how to trim costs in the early 2000s. On top of the layoffs due to cost cutting, there were three additional problems. First of all, the number of software engineers coming out of universities had massively increased to keep up with demand, but now there were no jobs for these young graduates. Secondly, Silicon Valley companies had begun outsourcing jobs to India: 62% of India's software exports in 2000 went to the USA. Thirdly, the US government had just bent to the demands of the IT industry to increase the number of visas for foreign IT workers, causing a flood of immigrants: 32% of Silicon Valley's high-skilled workers were foreign-born in 2000, mostly from Asia. These combined factors caused the first massive decline in employment in the Bay Area since the end of the Gold Rush era.

The 2001 recession was significant because California, and the Bay Area in particular, had been largely immune from recessions since the Great Depression. Recessions in California tended to be milder, and recoveries faster and stronger. In 2001 the opposite happened: California fared a lot worse than the rest of the nation.

Out of the Ruins

The crash of the Nasdaq did not mean that the Internet was dying. On the contrary, in 2000 it was estimated that 460 million people in the world were connected to the Internet, and that 10 billion e-mail messages a day were exchanged over it. In 2001 alone 42 million users traded $9.3 billion worth of goods on eBay. For the first time even a small business in a remote town could reach a market of millions of people. To mention just one emblematic statistic, Merrill Lynch reported that trades by institutional clients over its e-commerce platforms amounted to $1.9 trillion in 2000. According to the US Census Bureau, the grand total of e-commerce was just short of one trillion dollars in 2000. 94% of e-commerce was B2B ("business to business", basically the Internet-based version of the decades-old Electronic Data Interchange) and not yet B2C ("business to consumer"). Retail e-sales (B2C) were only $29 billion in 2000, but in the following years they would increase rapidly, with double-digit year-over-year growth rates.

Some of the most innovative ideas for the Web emerged out of the Bay Area right in the middle of the crisis. In February 1999 Marc Benioff founded Salesforce.com to move business applications to the Internet, pioneering "cloud" computing (one no longer needed to own a computer in order to run a software application). Salesforce launched the "software-as-a-service" (SaaS) revolution: corporations were no longer required to purchase, install and run business software in house, but could instead pay for it only when using it. This would become a trillion-dollar market within less than two decades. The term "cloud computing" had first been used to refer to web-based services by Compaq executive George Favaloro and Sean O'Sullivan's startup NetCentric in 1996. Oracle's Network Computer had come out at about the same time, and General Magic had, of course, already pioneered the concept. Salesforce was the first major company to bet its business plan on it. In 2006 Google's CEO Eric Schmidt would introduce the term to the media, and the following year IBM, Amazon and Microsoft would re-brand their web-based services as "cloud computing".

eHow.com was founded in March 1999 to provide a practical encyclopedia for solving problems in all sorts of fields, via articles written by experts in those fields. Friendster, launched in 2002 in Morgan Hill (south of San Jose) by Jonathan Abrams, Peter Chin and Dave Lee, allowed people to create "social networks". Blogger.com, founded in August 1999 by Evan Williams and Meg Hourihan, enabled ordinary Internet users to create their own "blogs", or personal journals. Dave Winer, a blogger whose blog Scripting News (1997) was influential in Silicon Valley, pioneered audioblogging with "Radio Userland" (2000).

Tim Westergren, an alumnus of Stanford's Center for Computer Research in Music and Acoustics (CCRMA), had devised a search engine for music called Savage Beast and had launched the Music Genome Project out of his Menlo Park home to archive songs based on their musical "genes" and calculate musical proximity (the algorithm was largely the work of co-founder Will Glaser): the search engine simply looked for songs whose genome was similar to a given song's. In January 2000 that project evolved into Pandora, an Internet-based streaming radio simulator that "broadcast" music based on the listener's preferences: given a song, Pandora produced a customized radio program of similar songs.
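To make the idea concrete, here is a minimal Python sketch of genome-based similarity: each song is reduced to a handful of hypothetical numeric "genes" (the real Music Genome Project scored hundreds of hand-annotated attributes per song), and a "station" is simply the set of songs whose gene vectors lie closest to the seed song's.

```python
import math

# hypothetical genomes: each song is a vector of musical "genes" in [0, 1]
songs = {
    "Song A": {"tempo": 0.8, "distortion": 0.9, "vocals": 0.3, "acoustic": 0.1},
    "Song B": {"tempo": 0.7, "distortion": 0.8, "vocals": 0.4, "acoustic": 0.2},
    "Song C": {"tempo": 0.2, "distortion": 0.1, "vocals": 0.9, "acoustic": 0.9},
}

def distance(a, b):
    # Euclidean distance between two genomes: smaller means more similar
    return math.sqrt(sum((a[g] - b[g]) ** 2 for g in a))

def station(seed, library, size=2):
    # a "radio station": the songs whose genome is closest to the seed's
    others = {name: genes for name, genes in library.items() if name != seed}
    return sorted(others, key=lambda n: distance(library[seed], others[n]))[:size]

print(station("Song A", songs))   # -> ['Song B', 'Song C']
```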

Shazam, whose song-recognition algorithm was designed by Stanford alumnus Avery Wang, launched in Britain in 2002 as the dial-in service "2580" (and later became a smartphone application), allowed users to identify songs; it would be acquired by Apple in 2018.

When in 2000 Yahoo opted for Google's search engine, Inktomi read the writing on the wall (that Google was going to wipe out the competition) and decided to invest in a new field: streaming media. Inktomi paid $1.3 billion for FastForward Networks, which specialized in large-scale delivery of radio and television broadcasting over the Web, in the wake of Seattle's RealNetworks. In December 2001 Listen.com launched Rhapsody, a service that provided streaming on-demand access to a library of digital music (RealNetworks acquired Listen.com in August 2003).

Relatively few businesses were accepting credit cards for transactions on the Internet. Billpoint, founded in 1998 in Redwood City by Jason May and Jay Shen, tried to change that by offering a simple method to transfer money between individuals. It was used for purchases on websites such as Excite@Home and eBay, and it was acquired in 1999 by eBay, which re-launched it in 2000; but a more formidable rival was going to dominate that sector.

German-born Peter Thiel, founder in 1987 of the conservative student magazine Stanford Review and a successful currency trader, founded Confinity in December 1998 in Palo Alto with two editors of the Stanford Review, Luke Nosek and Ken Howery. The company was the brainchild of cryptography expert Max Levchin, a Ukrainian Jew from Chicago who brought with him a group of University of Illinois alumni, including Russel Simmons and Jeremy Stoppelman (all indirect beneficiaries of the National Center for Supercomputing Applications' Mosaic project). Their goal was to develop a system for Palm Pilot users to send ("beam") money to other Palm Pilot users, i.e. to make payments without using cash, checks or credit cards. The first entities to be impressed by Confinity were European: Nokia and Deutsche Bank used Confinity software to "beam" their $3 million investment in the company from a Palm Pilot to Thiel. Meanwhile, X.com had also been founded in Palo Alto, by South African-born Elon Musk in March 1999, after he had sold his first company Zip2 (whose software powered websites for news media companies). X.com offered online banking services, including a way to email money. In 2000 Confinity and X.com merged to form PayPal, and Confinity's original concept evolved into a web-based service to send money over the Internet to an e-mail address, thereby bypassing banks and even borders. Thiel's utopian vision of a universal currency was embedded in much anti-government rhetoric that reflected the traditional anti-establishment mood of the Bay Area, but from a right-wing perspective. Ironically, however, PayPal quickly had to devote most of its efforts to fighting fraud. For example, to make sure that the user was a human being and not a program, Dave Gausebeck and Levchin resurrected a technique invented by AltaVista in 1997: display blurred and distorted characters and ask the user to type them on the keyboard; basically a reverse Turing test (a machine that tries to figure out if it is talking to a human), which became popularly known as CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart). PayPal's success was immediate, beating all the competitors that had preceded it in trying to help consumers sell and buy over the Internet. PayPal was another case, like Netscape before it, of the public choosing a standard before either government or corporations could do so.
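The technique is easy to illustrate. The following Python sketch (using the Pillow imaging library) generates a distorted-character challenge in the general spirit of the Gausebeck-Levchin test; the specific distortions (jitter, noise lines, blur) are illustrative assumptions, not PayPal's actual implementation.

```python
import random, string
from PIL import Image, ImageDraw, ImageFilter   # pip install pillow

def make_captcha(length=5):
    # the secret string the human must read back
    text = "".join(random.choices(string.ascii_uppercase + string.digits, k=length))
    img = Image.new("L", (24 * length + 20, 60), color=255)   # white canvas
    draw = ImageDraw.Draw(img)
    for i, ch in enumerate(text):
        # jitter each character's position to frustrate template-matching OCR
        x = 10 + 24 * i + random.randint(-3, 3)
        y = 20 + random.randint(-8, 8)
        draw.text((x, y), ch, fill=0)
    for _ in range(8):
        # random crossing lines as visual noise
        draw.line([(random.randint(0, img.width), random.randint(0, img.height)),
                   (random.randint(0, img.width), random.randint(0, img.height))],
                  fill=0)
    return text, img.filter(ImageFilter.GaussianBlur(1))   # final blur

secret, challenge = make_captcha()
challenge.save("captcha.png")   # show this image; accept the session only
                                # if the typed answer equals `secret`
```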

In October 2001 PayPal already boasted 12 million registered users. Its IPO in early 2002 netted $1.2 billion. The establishment, however, struck back: both banks and local governments tried in every legal way to derail PayPal. Eventually, PayPal found that the only way to survive was to sell itself to eBay (in July 2002, for $1.5 billion). The company had only 200 employees, but it was an impressive nest of talents, and extremely young ones (Levchin was 26 at the IPO, Musk was 31, and Thiel was the oldest at 35). Half of those 200 would quit by 2006 to found or staff new start-ups. In December 2002 Reid Hoffman of PayPal launched LinkedIn in Mountain View, which became the main business-oriented social networking site. In 2002 PayPal's co-founder Elon Musk founded Space Exploration Technologies, or SpaceX, to develop space transportation. Roelof Botha became a partner at Sequoia Capital, and Thiel started his own venture-capital fund, Clarium Capital. In the following years former PayPal employees would found Yelp (Jeremy Stoppelman and Russel Simmons in 2004), YouTube (Chad Hurley, Steve Chen and Jawed Karim in 2005), Slide (Max Levchin in 2005) and Halcyon Molecular (Luke Nosek in 2009). It was not just a mafia (as it was widely nicknamed in Silicon Valley), but a self-sustaining mafia, because it included venture capitalists, entrepreneurs, managers and engineers. PayPal was neither the only one, nor the most advanced, method of online payment. For example, Pay By Touch, founded in 2002 in San Francisco by John Rogers, allowed users to pay with a swipe of their finger on a biometric sensor.

In June 2000 Google achieved the feat of indexing one billion pages, a world record. Google's technology was clearly superior in many ways to that of the other web-search contenders. In January 2001 Google hired Wayne Rosing, a Silicon Valley veteran who had overseen the Lisa at Apple and Java at SUN. In February Google completed its first acquisition (an archive of the old Usenet, dating back to 1995) to create an extra application, Google Groups: it was the same tactic used in the past by Microsoft to create its portfolio of applications. Venture capitalists John Doerr of Kleiner-Perkins and Michael Moritz of Sequoia Capital became more involved in steering the business of the company, which eventually led to hiring another Silicon Valley veteran, Eric Schmidt (Zilog, Xerox PARC, SUN), as chairman. In 2002 Google got the support of AOL (the new owner of Netscape and a rival of Microsoft). The Faustian bargain behind Google's rapid success was AdWords, a pay-per-click advertising system, by far its main source of revenues. Google had started selling "sponsored links" in 2000, a practice already followed by its rivals; but that was a manual process involving a salesperson, and it mainly targeted large corporations. AdWords, introduced in 2002, was instead mostly automated and, because it slashed the price of posting an advert on the Web, it targeted the medium and small businesses that had been reluctant to advertise on the Web. The days of a commercial-free Web were not only over: Google de-facto turned the Web into an advertising tool that incidentally also contained information. The business model was the ultimate in cynicism: millions of website editors spread all over the world added content to the Web on a daily basis, and Google used that colossal amount of free content as a vehicle to sell advertising services to businesses. Web surfers used Google to search for information, but Google "used" them to create the audience that justified the amount it charged for advertising. Neither the producers of content nor the consumers of content were getting any money out of this splendid business model. Intermediaries had always made money in business, but this case was different: Google was an intermediary of sorts in the flow of content from producer to consumer, and it made money even though there was no monetary transaction between producer and consumer. The money came from an external entity that wanted to sell its products to the consumer. Every webpage added to the Web made Google more powerful. Unlike traditional intermediaries, which made money by charging a fee per transaction, Google never charged the user anything for searching. Yahoo and Excite had already understood the power of this business plan, but Google was the one that implemented it to perfection.
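A toy sketch of the pay-per-click mechanism may clarify why it suited small advertisers: ads are ranked by bid, displaying an ad costs nothing, and the advertiser pays only when a user clicks. All names and numbers below are hypothetical, and Google's real auction was far more sophisticated (later weighting bids by click-through rates).

```python
from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    bid_per_click: float   # price the advertiser agreed to pay per click
    budget: float
    spent: float = 0.0

    def affordable(self):
        return self.spent + self.bid_per_click <= self.budget

class PayPerClickEngine:
    def __init__(self):
        self.ads = {}   # keyword -> list of Ad

    def place_ad(self, keyword, ad):
        self.ads.setdefault(keyword, []).append(ad)

    def search(self, keyword):
        # displaying an ad is free; ads are ranked by bid
        live = [a for a in self.ads.get(keyword, []) if a.affordable()]
        return sorted(live, key=lambda a: a.bid_per_click, reverse=True)

    def click(self, ad):
        # the advertiser is charged only when a user actually clicks
        ad.spent += ad.bid_per_click

engine = PayPerClickEngine()
engine.place_ad("flowers", Ad("tiny-florist.example", 0.05, 10.0))
engine.place_ad("flowers", Ad("big-chain.example", 0.50, 1000.0))
for ad in engine.search("flowers"):
    print(ad.advertiser, ad.bid_per_click)
```

Because posting an ad required no salesperson and a bid could be as low as a few cents, even a tiny business could participate.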

Except for the few headline stories, the new Silicon Valley start-up was very different from the exuberant ones of the 1990s. The term "ramen profitable", later coined by investor and essayist Paul Graham, describes this kind of start-up: one that makes just enough money to pay the bills.

Winners and Losers

Many of the established Silicon Valley companies did well through the recession: for example Oracle, which in 2000 abandoned the client-server architecture in favor of a browser-based architecture and in the first quarter of 2001 posted growing revenues of $2.3 billion; and Siebel, which owned almost 50% of the Customer Relationship Management (CRM) market in 1999.

Advanced Micro Devices (AMD) beat Intel to a historic milestone: in February 2000 its Athlon microprocessor broke the 1000-megahertz (1 gigahertz) barrier. Intel's Pentium III running at the same speed came out a few months later. However, 2001 decimated the sector: revenues for the semiconductor industry plunged more than 30% in 2001, with Intel alone declining 21%, from $33.7 billion in 2000 to $26.5 billion.

British chip designer ARM had been selling embeddable RISC processor designs since 1991, and in 1998 its technology was mature enough to be licensed by Qualcomm for its cell-phone technology. By 2001 ARM dominated the market for embedded RISC chips, particularly for cell-phone applications. Only Intel, IBM, AMD and Taiwan-based fabless VIA owned a license for Intel's x86 technology, whereas ARM had made it very easy for anyone to license its technology. Besides the merits of its chips, its business model was friendlier to manufacturers interested in developing their own custom processors. No surprise, then, that more than a dozen companies had done so.

Palm was in troubled waters: by the end of 2001 its revenues had collapsed 44%.

Progress in smartphones was accelerating. In 2001 Nokia introduced the 5510, which featured a QWERTY keyboard, SMS, a digital music player, a game console, a calculator and an FM radio. In March 2002 the Canadian company Research In Motion introduced the BlackBerry, a hand-held device with a real keyboard that allowed users to check e-mail, make phone calls, send text messages and browse the Web. The telephone had just been turned into a wireless email terminal, and email had become a mobile service. Silicon Valley took notice, and in October 2002 Danger, founded in Palo Alto by former WebTV employee Andy Rubin, released the Hiptop mobile phone, later renamed T-Mobile Sidekick.

Meanwhile, in October 2001 the Japanese switched to 3G cellular technology. 3G enabled phone users to watch videos on demand and to participate in video conferencing. The world split again into two camps: Qualcomm unveiled a 3G technology called CDMA2000, while Nortel and AT&T launched their own "wireless Internet" project and eventually assembled a group called 3GPP (3rd Generation Partnership Project) that defined its own standard, WCDMA (wideband CDMA), an evolution of GSM (using Qualcomm technology) mainly adopted in Europe and Japan. The 3GPP chose the "turbo code" (developed in 1991 by Claude Berrou in France) to replace the glorious Viterbi decoding algorithm of 1967 that had been used in GSM. Qualcomm made money both by selling its own CDMA2000 chips and by licensing its CDMA patents to 3GPP members. China didn't choose a winner: each of its three carriers was assigned one standard, with China Unicom adopting WCDMA, China Telecom adopting CDMA2000, and China Mobile adopting a Chinese-developed hybrid called TD-SCDMA.

However, Silicon Valley scarcely cared when in 1998 Ericsson, Nokia, Toshiba, IBM and Intel joined together to form the Bluetooth Special Interest Group to promote the short-range wireless technology invented by Ericsson, which would become the standard for mobile-phone headsets and for many other devices. In 2000 Ericsson introduced the first mobile phone with built-in Bluetooth, the T36, and a few weeks later IBM introduced the first computer with integrated Bluetooth, the ThinkPad A30.

In July 1999 Hewlett-Packard appointed Carly Fiorina as CEO: she became the first female CEO of a Dow Jones company, another tribute to the Bay Area's propensity for diversity. In May 2002 Hewlett-Packard acquired Compaq, becoming the largest manufacturer of servers, the second largest computer company in the world after IBM, and the only serious contender against Dell in the personal-computer market. It looked like, after the breathtaking ups and downs of the personal-computer market, the company still standing was one of the old generation. Because Compaq had purchased DEC, HP now contained a division that was its old rival DEC. Symbolically, this represented the end of the war between Silicon Valley and Boston. At the peak of that war nobody would have imagined that some day DEC would end up being just a small division within a Silicon Valley company. And DEC had been the very originator of the "Route 128" boom in the Boston area.

The other surviving giant of the old generation, IBM, had pretty much left the personal-computer market, but dominated software services: in 2000 software and services accounted for 50% of IBM's business. The history of computer manufacturing looked like a vast graveyard of distinguished names, from Univac to DEC to Compaq.

Meanwhile, a new discipline was being born, or, at least, named. In 1999 a panel on "Big Data" was held at the Visualization Conference in San Francisco, featuring, among others, presentations by Steve Bryson and David Kenwright.

Consumer Multimedia

A sector that showed promise was the whole consumer multimedia business. Photography had gone digital thanks to ever-cheaper digital cameras. Music had gone digital (especially now that free software allowed music fans to "rip" CDs into mp3 files). And digital formats for videos were beginning to spread. Consumers needed two things: applications to display and play these digital files, and storage to save them. In 1999 IBM released a 37.5-gigabyte hard-disk drive, at the time the world's largest. In November 2000 Seagate Technology, which had been purchased for $3 billion by Veritas Software, a Mountain View company specializing in storage-management software, smashed that record with the 180-gigabyte Barracuda hard drive. 3PAR was founded in 1999 in Fremont by former SUN executive Jeffrey Price and Indian-born former SUN chief architect Ashok Singhal to deliver shared storage devices, utilizing "just-enough" and "just-in-time" allocation strategies for increased efficiency.

In 1995 the Israeli company M-Systems, founded in 1989 by Dov Moran, had introduced the first flash-memory drive, and in 1999 it introduced the first USB flash drive, marketed as "a hard disk on a keychain". That was the birth of flash-based solid-state drives, an alternative with no moving parts to the electromechanical hard-disk drives (whose moving parts include a spinning disk) that was going to revolutionize the industry. M-Systems was acquired by Milpitas-based flash-memory pioneer SanDisk in 2006.

Meanwhile, in 2000 Microsoft demonstrated the Windows Media Player to play both music and videos under Windows. In January 2001 Apple responded with its iTunes software (available also on Windows from 2003). iTunes was simply a repackaging of a product acquired by Apple in 2000, the digital jukebox SoundJam MP, which had been developed in 1998 by two former Apple engineers, Jeff Robbin and Bill Kincaid, whose project at Apple had ironically been terminated in 1996 when Apple bought Steve Jobs' NeXT. In October 2001 Apple chose a route completely different from its past: it launched a consumer device, the iPod (designed by Tony Fadell), to play music files, basically a "walkman" for mp3 files with a five-gigabyte internal hard disk. (That market had been created in 1998 by South Korea's SaeHan with the world's first MP3 player, the MPMan, and until then was dominated by Singapore's Creative Technology with its Nomad digital music players.)

Apple also defied common wisdom by launching a chain of fashionable Apple Stores, an idea perhaps modeled on what Olivetti did in the 1950s (Olivetti stores, such as the one opened on New York's Fifth Avenue in 1954 and especially the one in Venice's St Mark's Square of 1959, were created by celebrated architects, the latter by Carlo Scarpa).

The history of P2P, one of the big innovations of the era, mostly unfolded outside the Bay Area. Boston student Shawn Fanning came up with the idea of a Web-based service to distribute mp3 files, i.e. music, over the Internet. His Napster, which went online in June 1999, allowed consumers all over the world to share music files, thus circumventing the entire music industry. The music industry reacted with a lawsuit that eventually shut down Napster in July 2001. It was too late to stop the avalanche, though. Napster inspired a new generation of similar services, except that the new generation improved on Napster's model by using Peer-to-Peer (P2P) architectures. A P2P service basically facilitates the transfer of files between two computers, but does not itself store the file anywhere in between. Kazaa, for example, was developed in Estonia by Ahti Heinla, Priit Kasesalu and Jaan Tallinn, and introduced in March 2001 by the Dutch company Consumer Empowerment. In July 2001 San Francisco resident Bram Cohen unveiled the P2P file-sharing protocol BitTorrent, soon to become the most popular service of this kind. It was faster than previous P2P services because it downloaded different parts of a file from many different sources at the same time (if copies were available on multiple peers). These whiz kids became heroes of the counterculture for defying the music industry.
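The speed advantage is easy to see in a toy sketch: split the file into pieces, fetch different pieces from different peers concurrently, and reassemble them in order. This Python sketch is purely illustrative (the peer names are hypothetical, and the real protocol adds trackers, per-piece hash verification and a "tit-for-tat" exchange strategy).

```python
from concurrent.futures import ThreadPoolExecutor
import hashlib

def fetch_piece(peer, index):
    # stand-in for a network request asking one peer for one piece
    return f"piece-{index}-from-{peer}".encode()

def download(peers, n_pieces):
    with ThreadPoolExecutor(max_workers=len(peers)) as pool:
        # assign pieces to peers round-robin and fetch them in parallel
        futures = [pool.submit(fetch_piece, peers[i % len(peers)], i)
                   for i in range(n_pieces)]
        pieces = [f.result() for f in futures]   # reassemble in order
    return b"".join(pieces)

data = download(["peer1.example", "peer2.example", "peer3.example"], 6)
print(hashlib.sha1(data).hexdigest())   # real clients hash-check every piece
```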

In 2000 the former Yahoo scientist Jim McCoy started Evil Geniuses for a Better Tomorrow (EGBT) to provide a P2P platform, MojoNation, inspired by videogames. The Mojo model came out of the debate on "agoric computing": how to exploit concepts of free-market economics to solve problems in large-scale computation. In fact, the "mojo" was a cybercurrency, used to keep the network's computation balanced and secure. EGBT thus pioneered a new model of P2P. In 2001 SUN would introduce a similar open-source project, JXTA (Juxtapose). In 2001 a Jim McCoy associate, Bram Cohen, created BitTorrent, while another EGBT alumnus, Zooko Wilcox-O'Hearn, turned MojoNation into Mnet.

Napster and BitTorrent relied on a central server. Justin Frankel's and Tom Pepper's Gnutella (2000), conceived in Arizona, relied instead on a fully decentralized peer-to-peer network. Another decentralized network was Freenet, launched in London a few months later in 2000 by Irish-born Ian Clarke. Four employees of Microsoft published "The Darknet and the Future of Content Distribution" (2002), acknowledging the increasing power of inscrutable password-protected networks within the Internet. Anonymous peer-to-peer networks started using onion routing, the outcome of a military research project launched in 1995. One of the original scientists, Paul Syverson, helped Roger Dingledine and Nick Mathewson develop the onion router TOR, which became operational in 2002. Such dark nets would become very popular with political dissidents in places where the Internet was massively censored, e.g. in Syria before the civil war, but also with child pornographers, drug dealers, counterfeiters and terrorists. Freenet itself would become a dark net in 2008.
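The essence of onion routing can be conveyed in a few lines of Python: the sender wraps the message in one layer of encryption per relay, and each relay can peel off only its own layer, learning nothing but the next hop. This is a conceptual sketch only, here using the "cryptography" package's Fernet cipher; the real TOR design negotiates keys per circuit and adds padding and routing headers.

```python
from cryptography.fernet import Fernet   # pip install cryptography

relays = ["relay-a", "relay-b", "relay-c"]             # hypothetical path
keys = {r: Fernet(Fernet.generate_key()) for r in relays}

def wrap(message: bytes, path):
    # the sender adds layers from the inside out: last relay's layer first
    for relay in reversed(path):
        message = keys[relay].encrypt(message)
    return message

def route(onion: bytes, path):
    # each hop removes exactly one layer with its own key
    for relay in path:
        onion = keys[relay].decrypt(onion)
    return onion

packet = wrap(b"hello from a dissident", relays)
assert route(packet, relays) == b"hello from a dissident"
```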

Gaming was also undergoing a dramatic transformation. In 1996 the San Francisco Chronicle's website had introduced "Dreadnot", a free game built around a fictional mystery that took place in real locations in San Francisco and featured real people, and that used phone numbers, voice mailboxes, email addresses and other websites (in other words, an interactive multiplatform narrative). It was the first "alternate reality game". A few years later, in nearby Redwood City, a team of game designers at Electronic Arts began working on "Majestic", which eventually debuted in July 2001: the first "alternate reality game" for sale. In keeping with the theme of the genre, it was credited to two fictional game designers, Brian Cale and Mike Griffin. These games involved the real life of the players, and therefore Electronic Arts marketed it with the motto "It plays you". The game that launched the genre on a planetary scale was Microsoft's "The Beast", which debuted a few weeks before "Majestic". It was the brainchild of Jordan Weisman.

Unbeknownst to Silicon Valley, a major revolution was taking place in the Far East. Among the pioneers of smartwatch technology had been Seiko's Data 2000 watch (1983), the first in a line of smartwatches that would last for decades, and Casio had been making smartwatches since at least the VDB-1000 (1991). Computerized watches became serious business at the turn of the decade with Samsung's SPH-WP10 (1999), the world's first watch phone, which would be followed by several other models over the years; Casio's WMP-1 (2000), the world's first watch capable of playing MP3 files; Casio's WQV (2000), the world's first watch to include a camera; and later Casio's GWS-900 G-Shock (2004), the world's first watch capable of mobile payment. In the USA interest in smartwatches had always been tepid, with half-hearted experiments such as IBM's WatchPad (2001), jointly developed by IBM and Citizen Watch and running Linux, and later Microsoft's SPOT (2004).

Ultraviolet Microlithography

The emphasis on software pushed hardware into the background, and Silicon Valley was becoming less "silicon" than other regions of the world. Silicon Valley Group Lithography (SVGL) was a San Jose company which in 1990 had acquired Perkin-Elmer's lithography business (with financial help from IBM) and in 1993 (with financial help from Sematech) had built the world's first step-and-scan system (for MIT), the Micrascan, operating at 193 nanometers. Microlithography had reached 193 nanometers, but it soon became necessary to go even lower. Extreme UV (EUV) lithography was developed during the 1990s at several laboratories (Livermore, Berkeley and Sandia), funded by the US government plus Intel, Motorola and AMD. EUV machines required an entirely different technology. Only two companies licensed the technology: the Dutch company ASML and SVGL. Due to protectionist restrictions by the US government, Nikon and Canon of Japan were not allowed to license the system. But in 2001 SVGL was acquired by ASML, leaving ASML as the sole beneficiary of the huge investment in EUV.

Wireless

Intel, Siemens, Motorola and Philips had formed an alliance for wireless networking called HomeRF, which was also supported by AT&T; but in 1999 Richard van Nee of Lucent and Mark Webster of Harris Semiconductor (later renamed Intersil) debuted their 802.11b standard, operating in the 2.4GHz band. These two companies, together with 3Com, Nokia and others, formed the Wireless Ethernet Compatibility Alliance, which competed with HomeRF. In 1999 Lucent delivered the world's first 802.11b card to Apple, and Apple incorporated it into its iBook laptop. In 2000 IBM debuted 802.11b on its notebooks, Intel switched to Wi-Fi in 2001, and in 2002 the group changed its name to the "Wi-Fi Alliance". The final blow to HomeRF came in 2001 when the coffee-house chain Starbucks chose Wi-Fi for its stores. Apple and IBM had understood that Wi-Fi would soon become a household term, allowing millions of home computers to communicate wirelessly with the modem that connected them to the Internet.

Radio Frequency Identification (RFID), a wireless technology that uses radio waves to track items, had been invented in the 1970s but largely forgotten. In 1998 MIT professors David Brock and Sanjay Sarma developed Internet-based UHF RFID, which made it feasible for top retailers such as Wal-Mart (which had pioneered bar coding in the 1980s) to deploy RFID technology extensively (typically for inventory purposes within supply-chain management). Wal-Mart eventually mandated RFID for all its suppliers by 2005. Among the early manufacturers of RFID products was Alien Technology, founded in 1999 in Morgan Hill (south of San Jose) by Stephen Smith, a professor of electrical engineering at UC Berkeley.

RFID found another application in contactless credit cards (or "blink" technology): credit cards with an embedded RFID microchip that didn't need to be swiped but simply waved. The idea was pioneered by Korea's Upass card (1996), based on Mifare technology, and by Hong Kong's Octopus card (1997), based on the FeliCa standard, both used to pay for public transportation, as well as by oil company Mobil's Speedpass keychain (1997) for gasoline pumps.

The leaders were in Europe and in Japan. The European market was dominated by Mifare, developed in 1994 in Austria by Mikron (Mifare meaning "MIkron Fare Collection System") and acquired by Dutch conglomerate Philips in 1998, while Sony's FeliCa, introduced in 1996, ruled in Japan. Both were proprietary technologies because they had been introduced before the international standard was decided.

Encryption for digital communications had been born in the Bay Area (public-key cryptography, in 1976) but owed its improvement to Israeli scientists. Adi Shamir (who had co-invented the crucial RSA algorithm) proposed a simpler form of encryption in 1984: Identity-Based Encryption (IBE), in which the public key is some unique information about the identity of the recipient (typically, the person's email address). The first practical implementation of IBE was the work of Stanford's Israeli-born Computer Science professor Dan Boneh in 2001. Two of his students, Rishi Kacker and Matt Pauker, started Voltage in 2002 in Palo Alto (or, rather, in a Stanford dormitory) to develop security software for corporate customers.

Meanwhile, the Europeans and the Japanese kept improving the mobile phone, once purely a voice-communication device. In 1999 Japan's Kyocera introduced the first mobile phone that contained a video recorder and player, the VP-210 Visual Phone: the first phone that could make videos. In 1999 Finland's Benefon introduced the first mobile phone that contained a GPS unit, the first phone that knew its location: the Esc. But the GPS boom started in earnest in May 2000 when the US government removed the deliberate degradation of the civilian GPS signal and provided full commercial access to the satellites, thus opening up GPS for use by hikers and motorists. In November 2000 Japan's Sharp introduced the first mobile phone that contained a digital camera, the J-SH04. (Five months earlier South Korea's Samsung had already released the camera phone SCH-V200, but its camera was not truly integrated with the telephone function. Olympus' 1994 Deltis VC-1100 was a camera that allowed users to transmit digital photos over cellular and analog phone lines, but it wasn't quite a camera phone; and Philippe Kahn's 1997 home-made prototype was never commercialized, although it served as the blueprint for the service launched in Japan by J-Phone in November 2000 with Sharp's camera phone.)

Wikipedia

The website that would have the biggest impact on society originated from the Midwest: Wikipedia. Chicago day trader Jimmy Wales had co-founded in 1996 Bomis, a website of pornographic content for a male audience. At the same time he was preaching the vision of a free encyclopedia and, using Bomis as his venture capitalist, he had hired Ohio State University philosopher Larry Sanger as the editor-in-chief of this encyclopedia, Nupedia, which debuted in March 2000. The concept was aligned with Richard Stallman's Free Software Foundation, except that it was not about software but about world knowledge. In January 2001 Sanger decided to add a "wiki" feature to let contributors enter their texts. Wikis had become popular in corporate intranets as a way to share knowledge, basically replacing the old concept of "groupware". This method proved a lot more efficient than the traditional process of peer review, and therefore "Wikipedia" (as Sanger named it) was already surpassing Nupedia in popularity when Bomis decided to pull the plug. Wales realized that Wikipedia was the way to go, abolished Nupedia and opened Wikipedia to everybody. Formally established in 2003 as a non-profit foundation based in San Francisco, Wikipedia became a free multilingual encyclopedia edited collaboratively by the Internet community. Within a few years it would contain more information than the Encyclopedia Britannica had ever dreamed of collecting. It was another example of how utopian ideals percolated into the Internet world.

Larry Sanger, instead, joined the Digital Universe Foundation based in Scotts Valley. It was founded in 2002 by Utah-based entrepreneur Joe Firmage (a former Novell executive) and by German-born astrophysicist Bernard Haisch, who in 1999 had founded the California Institute for Physics and Astrophysics in Scotts Valley. Its mission was to create a more reliable web-based encyclopedia, Digital Universe (originally called OneCosmos).

A new intellectual trend, the "copyleft" movement, was taking shape on the Web, rejecting the practice of big media corporations of retaining all rights and thus stifling creativity. Larry Lessig, a professor of law at Stanford Law School, founded Creative Commons in 2001 in San Francisco to promote the sharing and diffusion of creative works through less binding licenses than the traditional "all rights reserved" copyright. Lessig went on to found at Stanford in 2008 the Center for Internet and Society (CIS) "to improve Internet privacy practices".

In the second half of the 1990s the Zapatistas, a small rebel group in Mexico, had staged a publicity coup by using a network of alternative media to publicize their political agenda on the Internet. Following their lead, in late 1999 political activists spread around the world started the Indymedia project to provide and distribute alternative coverage of the protests against the World Trade Organization that were taking place in Seattle. Indymedia used software originally developed by hackers in Sydney that allowed individuals to update the website in real time. In 2001 violent anti-G8 protests erupted in the Italian city of Genoa, and Indymedia played an even bigger role in disseminating agit-prop news. Indymedia would die out after Facebook and other social media began providing more professional platforms.

In 1996 Brewster Kahle, the founder of Alexa (and previously the first man to get married at the Burning Man festival, in 1992), had founded the Internet Archive to archive the World-wide Web. In 2001 he launched the Wayback Machine, which initially simply allowed the public to browse the archived pages; the Internet Archive then rapidly expanded to offering free digital copies of texts, images, films, etc.

Uploading

The key difference between this generation of the Web and the previous generation was not so much in the number of people who were browsing it but in the number of people who were uploading content to it.

Digital content had been around for decades. The vast majority of archives in the developed world had already been converted to databases. Large amounts of text had been scanned and digitized. New text was almost all in digital form. No other appliance since the ice box had disappeared from households as rapidly as the typewriter did in the 1990s. The telex had been replaced by e-mail. Newspapers and magazines had converted to digital composition. And now an increasing number of individuals were producing digital texts at an exponential pace, whether students writing their essays for school or adults writing letters to friends. Digital cameras and digital recorders were flooding personal computers with digital images and sounds. Napster-like services were popularizing the idea that a song is a file.

All this digital material was available, but it had been largely kept private on someone's home or work computer. The browser and the search engine, by definition, had encouraged people to "download" information from the Web, not to "upload" information to it. Wikipedia, blogs, P2P tools, social-networking sites and soon YouTube and Flickr heralded an era in which the rate of uploading was going to almost match the rate of downloading. In fact, uploading was becoming a form of entertainment in itself. The 1990s had been the age of democratizing the Internet. The 2000s witnessed the democratization of "uploading": more and more individuals began to upload their digital content to the Web in what became one of the most sensational processes of collective knowledge creation in the history of humankind.

In this scenario the business model of America OnLine was terribly outdated. Warner, originally a film production company, had merged in 1990 with publisher Time. Then Time Warner had entered the cable-television business by acquiring Ted Turner's TBS in 1996. The media conglomerate now owned films, TV shows and articles. Then in 2000 America OnLine (AOL) purchased Time Warner, and AOL Time Warner was born, the first media conglomerate of the Internet age, making everything from cinema to email. The idea was to couple content and online distribution. It failed because AOL was rapidly losing its grip on the World-wide Web as the era of dial-up access gave way to the era of the Digital Subscriber Line (DSL) and cable broadband. There was no need anymore for AOL's dial-up service (which came with the limitation of being able to see only the part of the World-wide Web that AOL owned).

Biotech

Unfortunately the dotcom crash also affected the biotech industry. Funding for biotech start-ups collapsed after reaching a peak of $33 billion in 2000. It didn't help that Bill Haseltine's Human Genome Sciences (one of the most hyped start-ups on the East Coast) turned out to be an embarrassing bluff: it had raised a huge amount of money before the dotcom crash, but it had not yet introduced any pharmaceutical product. Luckily, philanthropy offset the retreat of the venture capitalists. In 2000 Victoria Hale, a former Genentech scientist, started the first non-profit pharmaceutical company, the Institute for OneWorld Health, in San Francisco. In 2000 the Bill and Melinda Gates Foundation became the world's largest foundation, and it specifically addressed biotechnology. Biofuels too (not only medicines) began to attract capital: in 2002 Codexis was founded as a spin-off from drug developer Maxygen, a biotech company founded in 1997 by Uruguayan-born Alejandro Zaffaroni, a former biochemist at Syntex; and Amyris Biotechnologies, founded in 2003 by Berkeley scientists including Jay Keasling and Kinkead Reiling, raised over $120 million in venture capital in a few years, thanks in particular to the Institute for OneWorld Health. Solazyme, based in South San Francisco and founded in 2003 by Jonathan Wolfson and Harrison Dillon, specialized in fuel derived from algae.

In 2000 the government-funded Human Genome Project and the privately funded Celera made peace and jointly announced that they had completed a working draft of the entire human genome.

Enumerating the genes of human DNA enabled a new discipline: genomics, the study of genomes. In particular, biologists were interested in finding out which genes cause which diseases, and how to create predictive medicine, or simply how to develop smarter diagnoses. In other words, Celera and the HGP had produced billions of bits of information, but now the task was to interpret that information and apply it to understanding human diseases. This implied that someone had to comb through that mass of information looking for useful bits. Silicon Genetics was founded in 2000 in Redwood City by Australian-born mathematician Andrew Conway of Stanford's Biochemistry Department to focus on "expression software", software tools to investigate gene expression. DoubleTwist, Human Genome Sciences and Invitron were examples of companies that revised their business plans accordingly: biomedical companies trying to sell not pharmaceutical drugs but information and analysis. That was the bioinformatics side of the biotech business. Then there was the genomics side of the business, which analyzed a specific person's genome against that mass of annotated information.

An important step towards personal genomics was the establishment in October 2002 of the International HapMap Consortium, a collaboration among research centers (in Canada, China, Japan, Nigeria, Britain, and four in the USA, including UC San Francisco) to create a "haplotype map" of the human genome. This map was about SNPs (Single-Nucleotide Polymorphisms): an SNP is a position in the DNA where a single nucleotide can have different values in different individuals of the same species. The 10 million or so SNPs that exist in human populations help explain why some people are more likely to develop a certain disease than others.
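In computational terms an SNP profile is just a mapping from positions to nucleotides, and a first-order analysis is a lookup against annotated variants, as in this Python sketch (the positions, alleles and disease associations below are invented for illustration).

```python
# annotated SNPs: the reference allele and the hypothetical "risk" variant
snps = {
    "rs0001": {"reference": "A", "risk": "G"},
    "rs0002": {"reference": "C", "risk": "T"},
    "rs0003": {"reference": "T", "risk": "C"},
}

# two individuals genotyped at the same positions
alice = {"rs0001": "G", "rs0002": "C", "rs0003": "C"}
bob   = {"rs0001": "A", "rs0002": "C", "rs0003": "T"}

def risk_alleles(person):
    # list the positions where this individual carries the risk variant
    return [rs for rs, allele in person.items() if allele == snps[rs]["risk"]]

print("Alice:", risk_alleles(alice))   # -> ['rs0001', 'rs0003']
print("Bob:  ", risk_alleles(bob))     # -> []
```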

In 2003 MIT's Tom Knight envisioned a catalog of standardized "biobricks" (biological parts) that synthetic biologists could use to create living organisms. His model clearly mirrored the way the personal-computer industry got started, with hobbyists ordering kits from catalogs advertised in magazines and then assembling the computer in their garage.

In 2002 Jeffrey Trent of the National Human Genome Research Institute established the non-profit Translational Genomics Research Institute in Arizona.

The sequencing of the human genome made the database of Palo Alto-based pioneer Incyte partially irrelevant, but Incyte continued to acquire biotech start-ups and in 2000 launched an online version of LifeSeq that offered information about gene functions at a more affordable price. DoubleTwist, instead, in 2002 succumbed to the double whammy of the dotcom crash and the sequencing of the human genome.

The media seized on the announcement (in October 2001) by Advanced Cell Technology (ACT) that it had cloned the world's first human embryo, drawing the ire of right-wing president George W Bush, who opposed human cloning on religious grounds. ACT was a spin-off of the University of Massachusetts founded by James Robl, whose lab there had been the first to clone calves from somatic cells. The team included Jose Cibelli, a pupil of Robl who had experimented with nuclear transfer to rejuvenate cells; Robert Lanza, who was working on clones of endangered species; and even Michael West of Geron fame (who had joined ACT in 1998). Their goal was actually to generate the embryonic stem cells that were needed for controversial medical research.

Meanwhile, an important milestone was achieved by synthetic biology: in July 2002 Eckard Wimmer's team at the State University of New York at Stony Brook created the first synthetic virus, recreating the polio virus from its chemical code, which they had simply downloaded from the Web.

The next step in biotech automation after the DNA chip/microarray was the "lab on a chip". Since the 1960s there had been a lot of progress in "micro-electro-mechanical systems" (MEMS); these devices were already around before the invention of the microprocessor. In 1964 Harvey Nathanson at Westinghouse made the first MEMS device, and the first success story of MEMS was the "thermal inkjet" technology that Canon and Hewlett-Packard pioneered in the late 1970s, followed in 1993 by Analog Devices' micro-accelerometer (widely used in many industries today, for example in airbags). Initially MEMS simply exploited the fabrication technologies of the semiconductor industry, but in 1999 Lucent introduced the all-optical router and triggered the boom of optical MEMS of the 2000s. The enabling technology, however, was "microfluidics", the ability to make millions of microchannels ("micro" in the sense that they measure micrometers in diameter) that handle very tiny quantities of fluids. This was the result of a US military program: the Defense Advanced Research Projects Agency (DARPA) wanted a system to quickly detect biological and chemical weapons, and so in 1997 it created a program called "Microflumes" to fund research in microfluidics. As early as 1978 James Angell at Stanford had been working on "micromachines", and in 1979 one of his students, Stephen Terry, had unveiled what can be considered the first "lab on a chip", a device for separating, identifying and analyzing the components of a gas (originally commissioned by NASA to analyze the atmosphere of Mars). Progress in MEMS and in microfluidics led to today's "lab-on-a-chip" products. In 1999 Hewlett-Packard's spinoff Agilent introduced the first commercial "lab-on-a-chip" product, the 2100 Bioanalyzer; even more important was the Agilent 5100 of 2004. These constituted the vanguard of systems that enabled biotech startups to analyze thousands of DNA and protein samples per day. After the success of the Human Genome Project, the goal shifted to putting the whole human genome on a microarray: in 2002 Wilhelm Ansorge at the European Molecular Biology Laboratory (EMBL) in Germany succeeded.

The pharmaceutical industry kept growing in the area around South San Francisco, fueled in no small part by Stanford's School of Medicine. For example, in 2002 Israeli-born professor Daria Mochly-Rosen started KAI Pharmaceuticals.

Plexxikon, founded in December 2000 in Berkeley by Croatian-born Israeli biochemist Joseph Schlessinger and Korean-born structural biologist Sung-Hou Kim, was one of the many "drug discovery" startups, i.e. companies that were not focused on one specific biotech experiment but on discovering new methods of accelerating drug discovery in general.

In 2000 UC Berkeley chemist Henry Rapoport and others founded Oncologic (later renamed Aduro Biotech) to treat cancer, one of the first startups to marry nanotechnology and biotechnology.

Neurotech

There was also renewed interest in Artificial Intelligence, a field that had markedly declined since its heyday in the 1980s. However, the action came mostly from futurists and intellectuals rather than from practitioners, and the funding came from philanthropists rather than academia or government. The Singularity Institute for Artificial Intelligence (SIAI), devoted to super-human intelligence, was founded in 2000 in San Francisco by Eliezer Yudkowsky, while in 2002 Jeff Hawkins (of Palm Computing fame) founded the Redwood Neuroscience Institute in Menlo Park.

The Stanford Artificial Intelligence Laboratory was quietly experimenting with robots, and in 2005 it would win the "Grand Challenge", a race of driver-less cars funded by DARPA in the Nevada desert, with Stanley, a modified Volkswagen Touareg.

Greentech

Times of economic crisis had always been good for imagining completely new business sectors. An emerging theme in Silicon Valley was energy, especially after the 2001 terrorist attacks highlighted how vulnerable the USA was to the whims of oil-producing countries. In 2001 KR Sridhar, an alumnus of India's prestigious National Institute of Technology who had worked on NASA's Mars program, founded Ion America (renamed Bloom Energy in 2006) in Sunnyvale to develop fuel-cell technology for generating environmentally-friendly electricity. Within a few years it had raised $400 million in venture capital. Its fuel cells (eventually unveiled in February 2010) were based on beach sand. Each unit cost between $700,000 and $800,000. Google and eBay were among the early adopters of these fuel-cell power generators. Fuel cells opened up the possibility of liberating individuals from the slavery of power plants and transmission grids.

Founded in 2002 in San Jose by Stanford engineering student Brian Sager and Martin Roscheisen of FindLaw fame, Nanosolar (the first solar-energy company based in Silicon Valley) collaborated with Stanford University and the Lawrence Berkeley National Laboratory on the development of an ink capable of converting sunlight into electricity, the foundation for its family of flexible, low-cost and light-weight solar cells.

Nanotech

Nanotechnology, on the other hand, seemed to benefit from the dotcom crash, as venture capitalists looked elsewhere to invest their money. The US government enacted a National Nanotechnology Initiative in 2001, which helped fuel the sector. New start-ups in Silicon Valley included Nanosys, founded in 2001 to produce "architected" materials, and Innovalight, founded in 2002 and specializing in solar-cell technology. Venture capitalists saw opportunities in the convergence of biotech, infotech and nanotech. However, the nano hype mainly resulted in books and organizations with pompous titles, such as "Kinematic Self-Replicating Machines" (2004) by Robert Freitas and Ralph Merkle, and the MIT Stanford Berkeley Nanotechnology Forum (2003).

In the old tradition of government intervention to boost high-tech investments, in 1999 the Central Intelligence Agency (CIA) set up an odd not-for-profit venture-capital firm in Menlo Park, In-Q-Tel, to invest in leading-edge technologies such as nanotech and biotech. One of its investments in Silicon Valley was Keyhole, founded in 2001 by Brian McClendon and John Hanke, which developed a geospatial data-visualization tool (software to display three-dimensional representations of satellite images).

Creative Destruction

By the year 2000 something monumental had happened in the USA, despite the "dotcom crash". The high-tech economy of computers, biotech and telecommunications had surpassed in size the industries that had dominated manufacturing for almost a century: auto, oil, steel and aircraft. In fact, by 2000 the semiconductor industry had become the largest manufacturing industry in the USA. The USA was no longer the leading country in autos and steel, but it was in hardware, software and biotech. In these fields a revolution within the revolution had taken place: small young companies (such as Compaq, Dell, Cisco, SUN, Gateway, EMC, Apple, and Microsoft) had successfully eroded the market shares of the big older companies (such as IBM, HP and Cullinet) that had dominated in the previous decades. Many of the dominant high-tech companies of the 1970s and 1980s had actually disappeared, notably DEC. Something similar had happened within the semiconductor industry when Intel went past the old dominant players, Texas Instruments and Motorola. Pundits misquoted Schumpeter's "creative destruction" (1942), when in fact Schumpeter never praised the small startups (he thought that the big companies were "the most powerful engine of economic progress"). The "creative destruction" of the high-tech industry was almost the antithesis of Schumpeter's "creative destruction": it destroyed the very "engine of economic progress" while creating a bigger engine. Schumpeter had not foreseen that a proliferation of small dynamic firms could provide more momentum to progress than a group of large multinationals, and even displace the latter.

Innovation was transformed into an erotic fetish. Richard Florida's book "The Rise of the Creative Class" (2002) created the myth of Silicon Valley. Clayton Christensen's "The Innovator's Dilemma" (1997) had turned Joseph Schumpeter's concept of "creative destruction" into "disruptive technologies" to explain Silicon Valley's boom. The word "innovation" spread from US publishing houses to the most remote corners of the world, becoming a quasi-religious mantra that was supposed to solve all problems.

Bionics

The Bay Area's passion for creating a society as inclusive as possible had always made it a friendly place for disabled people, particularly around the Berkeley campus. In 2000 DARPA decided to fund a project at UC Berkeley's Robotics and Human Engineering Laboratory to build technology capable of mobilizing paralyzed people. These "exoskeletons", named BLEEXes (Berkeley Lower Extremity Exoskeletons), were lightweight battery-powered artificial legs that helped a disabled person not only to walk but also to carry a heavy load. The first BLEEX was introduced in 2003. In 2005 the director of that laboratory, Homayoon Kazerooni, founded a company, Berkeley ExoWorks (later renamed Berkeley Bionics), to commercialize the devices. In 2010 Berkeley Bionics would introduce eLEGS, a computer-controlled exoskeleton to make paraplegics walk. Berkeley Bionics would later be renamed one more time, to Ekso Bionics, and in 2012 Homayoon Kazerooni would found another startup of the same kind, SuitX.

This opened the era of "wearables", which had been pioneered at the MIT Media Lab by the likes of Thad Starner and Steve Mann, although most of the progress took place in Europe. In 1998 Sundaresan Jayaraman's team at Georgia Tech developed the first version of the "Wearable Motherboard", a smart shirt. In 2000 a collaboration between Philips Electronics and Levi Strauss resulted in the first commercially available wearable electronic garment, the ICD+ jacket, incorporating a GSM mobile phone and an MP3 player. In 1998 Finland's Clothing+ (founded by Mikko Malmivaara) developed a heart-rate sensing shirt, and in 2000 Reima commercialized Clothing+'s "Smart Shout", a body belt for hands-free mobile-phone communications. In 2000 SoftSwitch (England) introduced a fabric-control keypad to incorporate audio communications and heating systems into jackets for winter sports. In 2002 an MIT Media Lab alumna, the artist Maggie Orth, founded International Fashion Machines (IFM) to make digital interactive textiles.

Anthropology of E-socializing

Email became pervasive in the 1990s. Email and the web rapidly came to be used not only for business but also for personal matters, fun and family. High-tech tools were not social in themselves, but the people who produced and used them daily discovered their "social" value. They soon came to be used as social tools to compensate for Silicon Valley's lacking social life: to create networks of soccer players, hikers, music listeners and so on. This "discovery" that high-tech tools were valuable for building social networks would have widespread implications. It also meant that individuals came to be plugged into the network progressively more often.

Email fostered a new mindset of interaction with the community: it allowed people to delay a response, to filter incoming messages, and to assign priorities. Of course, answering machines had already made this possible; but email was written and was stored on the computer. Email also allowed a message to have multiple recipients, which led to the creation of mailing lists. In other words, email increased control over personal communications.

Electronic socializing also helped remove some of the ethnic barriers that still existed. After all, in 1997 more than 80% of newborn babies had parents of the same ethnic group. For a region with such a large percentage of foreign-born people, it was surprising that ethnic background still mattered so much. The advent of e-socializing removed some of the formalities that one expected from friends.

High-tech also fostered mobility of information at the expense of mobility of bodies. Telecommuting became more common, if not widespread. Telecommuters lived and worked in the same place, the home office, and worked when inspired: they had no fixed hours, as long as they delivered their task by the deadline. Teleconferencing and telecommuting further reduced personal interactions. Co-workers became invisible although present 24 hours a day. On one hand, high-tech tools made one reachable anywhere at any time; on the other hand, they also estranged one from the community.

Silicon Valley engineers were the first users of their own technologies. But the opposite was also true: Silicon Valley engineers could be the last users of technologies developed in other regions. For example, the rate of adoption of cellular phones was much slower than in Europe.

One reason why the virtual community was so successful was that the physical community was so unsuccessful: Silicon Valley was a sprawling expanse of low-rise buildings that did not have an identity (other than the identity of not having one). San Jose, in particular, had become one of the richest cities in the nation. It was, however, not really a city: just an aggregate of nondescript neighborhoods that had grown independently over the years. In true Silicon Valley fashion, the city decided to create a downtown area overnight, one that would introduce a flavor of European lifestyle. It spent almost one billion dollars on a large shopping, dining and entertainment complex in the middle of the city: in 2002 Santana Row opened for business. It was totally artificial, of course.

The Myth

Throughout the 1990s, thanks to the dotcom boom and to the favorable global political environment (the Cold War had ended in 1991 with a complete conversion of the communist world to capitalism), there was much discussion about how to export Silicon Valley's model to other regions of the world, notably Singapore, India and China. Nobody found quite the right formula, although they all took inspiration from many aspects of Silicon Valley. In the USA itself there were different opinions on how to "create" innovation. For example, in 2000 Microsoft's former research boss Nathan Myhrvold founded Intellectual Ventures, a secretive enterprise whose business model was to purchase as many patents as possible in just about every imaginable field; the emphasis was more on filing patents than on incubating viable companies. Owning a huge portfolio of patents is mainly a legal business, involving armies of intellectual-property attorneys, and it can be lucrative because any other company trying to develop a product in the same sector will have to license the related patents. Of course, this "patent farming" business hampers rather than fosters innovation, because many smaller companies will never even think of venturing into a field in which a billion-dollar venture fund owns a patent. Regardless of the philanthropic merits of encouraging inventors from all over the world, Silicon Valley looked down on this kind of approach: it was artificial and it did not create a real integrated networked economy.

This business was so lucrative that soon other "patent trolls" would appear, notably VirnetX, founded in 2006 near Santa Cruz by Kendall Larsen, which sued Microsoft in 2010 and Apple in 2012, and Virginia-based Straight Path IP, which sued Sony, LG, Toshiba, Apple and Verizon within one year of its founding in 2013. These "non-practicing entities" used questionable tactics to extort licensing fees from rich corporations, wielding ambiguous patents as their legal weapons. Sued by another such entity, Apple wrote: "Smartflash makes no products, has no employees, creates no jobs, has no US presence, and is exploiting our patent system to seek royalties for technology Apple invented." This was actually the one case that had some merit (Smartflash was founded by British inventor Patrick Racz, who had really invented the technology he was defending), but the description of the patent troll well reflected the perception in Silicon Valley of these lawyer-dominated companies (or non-companies). The fact that it was legal to be a "patent troll" did not make it moral. Raping women was legal in prehistoric times, and slavery was legal not long ago; that does not mean they were moral. It simply means that governments took a long time to enact appropriate laws to punish those practices. Straight Path's website claimed that "Straight Path, its executives, directors, and employees operate with the highest levels of personal and professional integrity." Not many were impressed by the "personal integrity" of patent trolls.


(Copyright © 2010 Piero Scaruffi)
