A History of Silicon Valley

(Copyright © 2014 Piero Scaruffi)

17. The Sharks (2007-10)

by Piero Scaruffi

The Decline of the Computer

The late 2000s were the age of streaming media, smart phones and cloud computing: all of them trends pointing towards the demise of the "computer" as it had been known for 50 years.

In 2008 digital music downloads grew by 25% to $3.7 billion (including 1.4 billion songs), accounting for 20% of all music sales; but the music industry estimated that over 40 billion songs were illegally file-shared, which meant that 95% of the market for digital music downloads was underground. In 2009 file-sharing services ranked among the Internet's most popular websites: Rapidshare was 26th and Mediafire was 63rd. In 2009 BitTorrent accounted for at least 20% of all Internet traffic. As images, music and videos proliferated (at increasingly higher resolutions), the demand for storage became prohibitive. At the same time that the demand for downloads increased, network bandwidth was increasing rapidly. It then became sensible to keep the files on the Internet instead of downloading them onto the home computer. In the past the main reason to download them before viewing them had simply been that the network was "slow". Once the network became fast enough, using a home computer to store multimedia files became redundant, and consumers started playing them directly from the Internet. For example, Netflix (which had become the main rental-by-mail movie service) launched its streaming feature in january 2007. In just two years its catalog grew to more than 12,000 titles (movies and television episodes). In march 2010 YouTube broadcast the Indian Premier League of cricket live worldwide, the first major sports competition to be broadcast live on the Internet.

Smart phones that allowed browsing the Web and exchanging e-mail had become very popular, notably the BlackBerry and the Razr, but the brutally competitive field was not for the faint of heart. In june 2007 Apple, which after the iPod had become the master of fashionable devices, introduced the iPhone, which immediately captured the imagination of the young generation. In 2008 Apple also introduced the "App Store" where independent software developers could sell their applications for the iPhone (by april 2010 there were more than 180,000). In 2007 Google started distributing for free Android, a Linux-based open-source operating system for mobile phones that had originally been developed by a Palo Alto stealth start-up, also named Android, founded by Andy Rubin and others. Google also created the Android Marketplace to compete with the App Store (by april 2010 it had 38,000 applications for Android-based devices). Motorola was the first company (in late 2009) to deliver an Android-based smart phone, the Droid, and reaped the benefits: it shipped 2.3 million units in the first quarter of 2010, thus resurrecting itself after the collapse of the Razr. In 2010 Taiwanese cell-phone manufacturer HTC entered the Android market with its Incredible, and so did Samsung with its Galaxy S. Every Android manufacturer had little control over its future, though, because the success of its device depended on Google's whims and on the whims of the carrier (Verizon, AT&T and Sprint being the main ones). At the beginning of 2010 Research In Motion was estimated to have 42% of the market, followed by Apple with 25%, Microsoft with 15% and Android-based devices with 9%. Android sales in the first three months of 2010 accounted for 28% of all smart phone sales, ahead of iPhone's 21% but still behind BlackBerry's 36%. This would change more rapidly than anybody predicted. The loser was Palm, which released its Pre at the same time as Apple released the iPhone: it didn't have the charisma to compete with Google and Apple, and in april 2010 was sold to HP. In march 2008 John Doerr of Kleiner-Perkins launched a $100 million venture-capital fund (the iFund) for iPhone applications, crediting the iPhone as an invention more important than the personal computer because it knew who the user was and where s/he was.

Apple's iOS was a hit because of its multi-touch technology, but that was hardly an Apple invention. Touch-screen technology had come from Europe, and it had been in Britain (at Xerox's EuroPARC) that sixteen years earlier (in 1991) Pierre Wellner had designed his multi-touch "Digital Desk" with multi-finger and pinching motions. Then Wayne Westerman, a student at the University of Delaware, and his professor John Elias had founded Fingerworks (in 1998) to commercialize a technology to help people with finger injuries use a computer. Fingerworks had gone on to sell a full line of multi-touch products, notably the iGesture Pad in 2003. Apple had acquired Fingerworks' multi-touch technology in 2005. The iOS, unveiled in January 2007 (although it was not named that way until March 2008), was a Unix-like operating system that incorporated that technology: a one-finger swipe to scroll horizontally, a finger tap to select an object, a reverse two-finger pinch (an "unpinch") to enlarge an image, and so forth. Incidentally, in 2001 Paul Dietz and Darren Leigh at Mitsubishi Electric Research Labs (MERL) in Boston had developed a multi-touch interface ("DiamondTouch") that could even recognize which person was touching where.
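
The mapping from raw touch events to gestures can be illustrated with a toy classifier (a minimal sketch in Python, not Apple's or Fingerworks' actual code): it looks only at how far each finger moved and whether two fingers moved apart or together.

```python
import math

def classify_gesture(strokes, tap_threshold=10.0):
    """Classify a toy multi-touch gesture.

    strokes: list of (start, end) points, one per finger,
             where start and end are (x, y) tuples in pixels.
    Returns "tap", "swipe", "pinch", "unpinch" or "unknown".
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    if len(strokes) == 1:
        (start, end), = strokes
        # A short movement is a tap, a longer one a swipe.
        return "tap" if dist(start, end) < tap_threshold else "swipe"

    if len(strokes) == 2:
        (s1, e1), (s2, e2) = strokes
        before = dist(s1, s2)   # finger separation at the start
        after = dist(e1, e2)    # finger separation at the end
        if after > before:
            return "unpinch"    # fingers moved apart: enlarge the image
        if after < before:
            return "pinch"      # fingers moved together: shrink the image

    return "unknown"

# Example: two fingers moving apart is recognized as an "unpinch".
print(classify_gesture([((100, 100), (60, 100)), ((200, 100), (240, 100))]))
```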

It was easy to predict that soon more users would access the Internet from mobile devices than from desktop computers (and probably many of them would be unaware of being on the Internet). In 2009 almost 90% of households in the USA owned a cell phone, but the average length of a voice call (1.81 minutes) was lower than in 2008, and this despite the fact that millions of households were disconnecting their land phone lines. At the beginning, voice conversation had been the only application for cell phones. Then came text messaging, then Web browsing, navigation and so forth. Text messaging, in particular, proved to resonate with the psychology of the digital age: it was less time-consuming and less disruptive than a voice call, even if it took longer to type a message than to say it. Voice communication was rapidly becoming an afterthought, while a myriad of applications (most of which had nothing to do with telephone conversations) were becoming the real reason to purchase a "phone". The purpose of a cell phone was now more data than voice. Handset design was less and less "cheek-friendly" and more and more palm-friendly, because people were not supposed to "hear" but to "see" with their cell phone. In 2010 the cell phones from Nokia (which were truly meant to be telephones) were to the Apple and Android smart phones what typewriters had been in the age of the personal computer.

In 2008 Chrysler pioneered an in-car router system that brought the Internet even into cars (by connecting a cellular device to a wireless local-area network). At the end of 2009 there were already 970,000 cars equipped with Internet access.

In may 2010 a symbolic event took place when Apple's market capitalization ($227 billion) passed Microsoft's ($226 billion). In august 2011 Apple passed ExxonMobil to become the most valuable company in the world based on market capitalization.

Towards the Computing Utility

At the same time the traditional computer was under attack from the paradigm of cloud computing, i.e. Internet-based computing in which the computers are hidden from the users and computing power is delivered on demand, just like electricity is delivered to homes when and in the amount needed.

A classic example was Spotify, a service launched in 2008 by Swedish serial entrepreneur Daniel Ek that allowed subscribers to stream music. Napster had figured out a way for Internet users to simply gift each other music (alas, this was deemed illegal). Apple's iTunes had created the virtual music store, from which customers could buy (and download) their music. Spotify did not sell music, and its subscribers did not pay to "own" the music: they paid to listen to it. The music remained on the cloud, owned by the music companies or by the musicians. Spotify did not require any dedicated hardware to play music, just a regular computer or smartphone (all of which were by now capable of playing the most popular music formats).

Public cloud storage had been legitimized in 2006 when Amazon had introduced its Simple Storage Service (S3), a service that anyone could use. Box, founded near Seattle in 2005 by Aaron Levie and Dylan Smith but soon relocated to Silicon Valley (Los Altos), decided to use its own servers (not Amazon's) and to target the corporate world. Dropbox, founded in 2007 by MIT students, created a friendlier service built on top of S3.
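
The appeal of S3 was that storing and retrieving a file on Amazon's cloud required only a couple of API calls. A minimal sketch using the boto3 Python library (the bucket name and file are hypothetical, and valid AWS credentials are assumed):

```python
import boto3

# Connect to S3 using credentials from the environment or AWS config.
s3 = boto3.client("s3")

# Upload a local file to a (hypothetical) bucket.
with open("vacation.jpg", "rb") as f:
    s3.put_object(Bucket="my-example-bucket", Key="photos/vacation.jpg", Body=f)

# Later, from any machine with access, download it back.
obj = s3.get_object(Bucket="my-example-bucket", Key="photos/vacation.jpg")
data = obj["Body"].read()
print(len(data), "bytes retrieved from the cloud")
```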

Cloud computing was typically built on "virtual" infrastructures provided by "hypervisors", or virtual-machine monitors, that allowed multiple operating systems to run concurrently on a single computer, enabling any application to run anywhere at any time. The pioneer and leader of virtualization had been VMware, whose 2008 revenues increased 42% to $1.9 billion, and new hypervisors were offered by Oracle (VM, introduced in 2007 and based on open-source Xen technology), Microsoft (Hyper-V, introduced in 2008), and Red Hat (Enterprise Virtualization, introduced in 2009). The virtual-machine monitor Xen was developed in 2003 at the University of Cambridge by Ian Pratt's team and acquired by Florida-based Citrix Systems in october 2007. In august 2010 Hewlett-Packard acquired 3PAR, which specialized in data storage for cloud computing shared by multiple companies ("utility storage"). There were only four companies selling the kind of storage required by cloud computing: IBM, Hitachi, 3PAR and EMC. EMC was based in Boston, but in 2003 it had acquired three Silicon Valley success stories: Documentum, Legato and VMware.
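
From a programmer's point of view, a hypervisor exposes each guest operating system as a "domain" that can be enumerated and controlled through an API. A minimal sketch, assuming a Linux host running a KVM/QEMU hypervisor with the libvirt-python bindings installed (not tied to any of the commercial products named above):

```python
import libvirt

# Connect to the local hypervisor (KVM/QEMU via libvirt).
conn = libvirt.open("qemu:///system")
print("Host:", conn.getHostname())

# Each guest operating system is a "domain" managed by the hypervisor.
for dom in conn.listAllDomains():
    state = "running" if dom.isActive() else "stopped"
    print("domain", dom.name(), ":", state)

conn.close()
```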

The next step after virtualizing the operating system and the databases was to virtualize the network, which was precisely the mission of Palo Alto-based Nicira Networks, a 2007 spin-off of a joint Stanford-Berkeley project around the dissertation of Stanford student Martin Casado supervised by British-born Stanford professor Nick McKeown and UC Berkeley professor Scott Shenker.

In 2011 Oracle acquired Montana-based RightNow Technologies, a provider of cloud-based Customer Relationship Management (CRM) applications, to compete directly with Salesforce. Locally, similar services were offered on the cloud by Pleasanton-based Workday, the new company (started in 2005) of PeopleSoft's founder Dave Duffield.

Virtualization allowed a "farm" of computing power to create a virtual machine for a customer, no matter where the customer was located. A computing environment (made of disparate software and hardware components) could be dynamically configured to represent several different machines, each assigned to a different customer. This "multi-tenant system" was conceptually similar to a power plant: it supplied computing power over the Internet to multiple customers the way a power plant supplied electricity over the electric grid to multiple customers.

For decades software and hardware manufacturers had relied on business plans that envisioned selling the very same system to multiple customers who were using it to perform identical tasks. Ubiquitous and cheap broadband was fostering an era in which a "computing utility" could provide that service (hardware and software) to all those customers over the Internet, with no need for those customers to purchase any of the components (just pay a monthly fee). Knowledge production was becoming centralized the way energy production had become centralized with the invention of the electrical grid (spreading the power created by a power plant) and the way food production had become centralized 5000 years earlier after the invention of the irrigation network. Each "network" had created a new kind of economy, society and, ultimately, civilization.

Social networking too was destined to become a "utility" of sorts for businesses. The idea of collaboration software over the Web had been pioneered by startups such as Jive Software, founded in 2001 by two students from the University of Iowa, Matt Tucker and Bill Lynch, and originally based in New York. In the age of Facebook it was renamed "enterprise social network service" and embraced by startups such as Yammer, originally developed in 2007 by Adam Pisoni and David Sacks (a PayPal alumnus) at Geni in San Francisco, spun off in september 2008 (and bought by Microsoft in 2012). This platform hinted at the first major revolution in ERP since SAP's glorious R/3.

Google vs Microsoft

The 1990s had been the decade of Microsoft. Microsoft owned the operating system, and it owned the most popular applications. Microsoft's success relied on the concept of the personal computer: one user, one application, one computer. Web-based computing represented the first serious challenge to Microsoft's business model. Thanks to increased bandwidth and more sophisticated Internet software, it had become possible to create applications that ran on websites and that users could access via their Web browser. It was basically a client-server architecture, except that the "server" was potentially the entire Internet, and the client was potentially the entire human population. Cloud computing was, in essence, on-demand Web-based computing. The concept was pioneered by start-ups such as Salesforce.com. In february 2007 Google targeted Microsoft's core business when it disclosed a humbler version of cloud computing, Google Docs. That suite could be accessed from any computer via a Web browser, and included a word processor and a spreadsheet program (both acquired from independent companies, respectively Upstartle and 2Web Technologies). One could already imagine the end of the era of the operating system, as the Web replaced the need for one. A computer only needed a browser to access the Web, where all other resources and applications were located.

In 2008 Microsoft Windows owned almost 90% of the operating-system market for personal computers, while Google owned almost 70% of the Internet search market. The future, though, seemed to be on Google's side. In april 2010 Microsoft's IE commanded 59.9% of the browser market, followed by Firefox with 24.5% and Google's own Chrome (first released in september 2008, and based on Apple's WebKit) with 6.7%. That was a steep decline from the days when IE was ubiquitous.

On the other hand, Google's revenues depended almost entirely on third-party advertising. The "war" between Google and Microsoft was well publicized, but the war between Google and Yahoo was probably more serious for Google's immediate future. In 2007 Google paid $3.1 billion for DoubleClick, the New York-based company that dominated "display advertising". This was the method favored by Yahoo when Google was piling up millions with its keyword ads. As the quality of browsers and computers improved, Yahoo's glamorous ads became more and more popular, and they were served by DoubleClick. By acquiring it, Google struck at the heart of Yahoo's business. And now Google had all the pieces to create an "advertising operating system". In 2009 Yahoo owned 17% of the market for display ads, followed by Microsoft at 11% and AOL at 7%.

Google's strategy became even more aggressive in 2010 when it launched into an acquisition spree. Notably, it purchased BumpTop, the three-dimensional GUI developed since 2006 by University of Toronto student Anand Agarawala, and Plink, a visual search engine developed by two Oxford University students, Mark Cummins and James Philbin. Chrome OS, first released in november 2009, was basically a variant of the Linux-based operating system Ubuntu, developed in Britain by Mark Shuttleworth's Canonical. At the same time, the products that Google engineers truly invented and that the company widely hyped, such as the social networking platform Buzz and the groupware Wave, were embarrassing failures.

Google's technology was consistently inferior to what the Silicon Valley landscape was producing. For example, Superfish, started in 2006 in Palo Alto by Israeli-born semiconductor-industry veteran Adi Pinhas and A.I. guru Michael Chertok, launched a visual search tool in 2010 that, unlike Google Goggles, was able to recognize pictures regardless of perspective, lighting, distance and so forth.

The effect of Google's domination of the search-engine market was the same as the effect of Microsoft's domination of the personal-computer operating-system market: to stifle innovation. Google had built its reputation on returning webpages based on their "relevance". Now it was mostly returning commercial websites that had little relevance to the search string, and the ubiquitous Wikipedia (for which one was better off just searching within Wikipedia itself). A very relevant article written by a scholar was very unlikely to show up in the first page of results. The "irrelevance" of Google's searches was mainly driving an entire economy based on advertising products and services. And, of course, the fact that the Web was being littered with the ubiquitous Google AdWords hardly represented a welcome change. Most of the text and images that flooded a user's screen constituted commercial ads (in ever more creative and invasive fashions). Thanks to the proliferation of Google AdWords, the Web was becoming not only unsearchable but also unreadable. Alas, Google had a virtual monopoly on web search: any search engine that was truly committed to "relevance" had slim chances of success against Google, just like any operating system for personal computers against Microsoft.

In 2009 Microsoft was still the largest software company in the world with revenues of $50 billion, while Google's revenues were "only" $22.8 billion. Google had already passed the software revenues of IBM ($22 billion), Oracle ($17.5 billion) and SAP ($15.3 billion, and the most troubled of all of these). (By comparison, the wildly popular Facebook made only $550 million in 2009). At one point in march 2010 Microsoft was the second largest company in the USA, with a market capitalization of $256 billion, trailing only oil producer Exxon ($315 billion) and followed by a fast-growing Apple ($205 billion) that had just passed the retail giant Walmart ($205 billion). Google's market capitalization was $184 billion. Those numbers sent two messages, both adverse to Microsoft: the personal computer was being attacked simultaneously on two fronts, by the smart phone and by web-based computing. In april 2010 Apple further weakened the personal computer when it introduced the tablet computer iPad, which sold one million units in less than one month.

In 2010 Google debuted an open platform for television sets that was the home-video equivalent of its mobile-phone platform Android. It was another attempt to marry content and hardware, something that only Apple had successfully achieved so far, but with a proprietary platform. Not being a maker of hardware, Google chose again to offer an open platform. The first hardware partner to sign up was Japanese conglomerate Sony. Sony had a history of championing networked television. An early adopter of WebTV, Sony had already introduced in 2000 a product called AirBoard, a tablet computer that let users watch television, surf the Internet, view photos and wirelessly control several gadgets. Sony wasn't interested as much in TV-based web surfing (the feature provided by the alliance with Google) as in video and music streaming, i.e. selling its content (not just a piece of hardware) to the consumers. Sony owned a vast library of films and music. Google did not make any hardware and did not own any content. Google had become a giant by simply facilitating the connection between the hardware and the content, but now it was debatable who owned the future: the company that laid the cable connecting a device to some content, or the companies making the device and providing the content?

The wild card in this scenario was Facebook, which began the year 2009 with 150 million users and grew by about one million users a day; no product had ever reached that many users in just five years. In 2010 it was approaching half a billion registered users. In june it was being valued at $23 billion, despite not being profitable yet. In may 2007 Facebook had announced an open platform for third parties to develop applications. This amounted to a Copernican revolution: applications were no longer written for an operating system but for a social network. This event spawned a new industry of widget makers for the Facebook platform, notably RockYou (originally RockMySpace), founded in Redwood City in 2006 by Lance Tokuda and Jia Shen. However, Facebook was also being widely criticized for its ever-changing privacy policies after having contributed to spreading the sensitive personal information of millions of people all over the world. It had basically become the premier stalking tool in the world.

The price paid by users of social-networking platforms was a massive dissemination of private information. By the late 2000s there was so much information available on the Web about individuals that it was natural to piece together a person's lifestyle just by searching her name. A new kind of application, the social-network aggregator, was soon born to help that process. In 2006 a group of Stanford students based in Mountain View, including Harrison Tang, came up with such an idea and developed Spokeo. In october 2007 in Mountain View former Google employees, including Paul Buchheit (the creator of Gmail and AdSense) and Bret Taylor (the creator of Google Maps), launched FriendFeed, capable of integrating in real time information posted on social media. In july 2009 it was acquired by Facebook.

Facebook indirectly also set a new (low) standard for creating a startup. In the fall of 2007 B. J. Fogg, an experimental psychologist who was running the Persuasive Technology Lab at Stanford, instructed his students to create Facebook applications with the sole goal of having as many people as possible use them in as short a period of time as possible. The students were forced to create no-frills applications whose main asset was that they were easy to use and easy to spread around. That class alone created a number of millionaires, because many of those applications became hits of the Facebook world and their authors went on to join successful companies. These Stanford students had just hit on a new formula for creating a successful product: make it easy to use and quick to spread virally, and refine it later. In april 2012 Facebook paid $1 billion for tiny San Francisco-based startup Instagram, a mobile photo-sharing service that had been introduced in october 2010 by Kevin Systrom (formerly at Google) and Brazilian-born Mike Krieger, both Stanford graduates, for the iPhone. Another social-networking platform for photo sharing and messaging, specifically designed for the iPhone (and later for mobile devices in general), was Path, launched in november 2010 in San Francisco by Shawn Fanning of Napster fame and former Facebook executive Dave Morin. What initially distinguished Path from Facebook was that it limited the number of friends to 50.

Location was becoming important again. Craigslist had been the last major Web-based service to address the geographic community, while almost every other major service addressed the virtual community of the whole Web. The trend was reversed by the end of the decade. In 2010 Facebook belatedly added Places, a location-based service of the kind pioneered by Foursquare, founded in 2007 in New York by Dennis Crowley and Naveen Selvadurai, and Gowalla, launched in 2009 from Austin (Texas) by Josh Williams and Scott Raymond, which basically let friends equipped with mobile devices know each other's location. Meanwhile, Google rolled out Realtime Search, which performed location-based filtering of status updates, for example to find out what was going on in a town. In 2011 Google tried to acquire discount-coupon site Groupon, launched in november 2008 in Chicago by Andrew Mason. Groupon brokered deals between consumers and local stores. Maybe it was not "Google vs Microsoft" after all, but "Google vs Facebook".

Incidentally, in 2010 EBay acquired shopping engine Milo, founded in 2008 in Philadelphia by Jack Abraham. Milo kept track of which goods were available in neighborhood stores.

Because of Google and Facebook the way people used the Internet had changed dramatically since the day that Marc Andreessen had created the Netscape browser. Nonetheless, the browser had not changed much since those days. Microsoft's Internet Explorer, Mozilla's Firefox, Google's Chrome and Apple's Safari had simply copied the concept, the look and the buttons of the original, barely introducing collateral features. A startup that tried to "upgrade" the browser to the age of Facebook was Mountain View-based RockMelt, founded in november 2008 by Eric Vishria and Tim Howes, both former employees of networking company Opsware before it was acquired by Hewlett-Packard. It also marked Marc Andreessen's return to his roots, since he was the main financial backer of RockMelt. RockMelt represented the typical Copernican change that periodically shook Silicon Valley. In this case the victim was Facebook. Instead of having the Facebook user look at the Internet through the filter of his Facebook page and his friends' Facebook pages, RockMelt allowed the user to view the Facebook world and many other popular services (e.g., real-time news) as an extension of the browser.

The other thing that had not changed much since the invention of the Web was the search engine. While Google dominated the field, its search engine was largely agnostic about the contemporary boom of social networks. The emergence of "social search" technology was well represented by Blekko, founded in Redwood Shores in June 2007 by Rich Skrenta (the high-school hacker who had created the first personal-computer virus in 1982, the SUN guru who had created the Open Directory Project, the entrepreneur who had created Topix). It was basically a hybrid approach that mixed the traditional machine-powered search engine with the human-powered wiki.

Nor was Facebook a dogma in its space. In march 2010 former Google employee Ben Silbermann launched the image-bookmarking system Pinterest out of Palo Alto. Within two years it was second only to Facebook and Twitter among social-networking platforms. The key difference was that it organized networks of people around shared interests, not social connections.

For the record, in 2007 independent consultant Chris Messina (cofounder of the idealistic Citizen Agency) invented the Twitter hashtag, a way to group conversations on Twitter. It became even more useful in 2008 after Twitter acquired the search tool Summize (a two-year-old Washington-based startup founded by Eric Jensen, Abdur Chowdhury and Ajaipal Virdy).

Bay Area Giants

The other giants of Silicon Valley were left scrambling for an exit strategy during these times of upheaval. In 2008 Hewlett-Packard purchased Texas-based giant Electronic Data Systems (EDS) in a shift towards services, and in 2009 Xerox followed suit by purchasing Affiliated Computer Services. HP also purchased Palm, by now a struggling smart-phone maker (april 2010). The value of Palm was twofold: an operating system designed for cloud computing, and an application-development environment based on a drag-and-drop metaphor. This was also a last-ditch attempt to enter the market for mobile operating systems, which was rapidly becoming a duel between Apple (iOS) and Google (Android). After all, Palm's WebOS had pioneered the phone software platform based on Web technology. In 2006 HP had finally passed Dell in worldwide personal-computer shipments (16.5% market share versus Dell's 16.3%), while Gateway had disappeared, having been bought in 2007 by Acer of Taiwan: for the first time a Silicon Valley company ruled the personal-computer market. However, the glory of being the biggest personal-computer maker in the world did not mean much at a time when Apple's tablet was crippling personal-computer sales, and in august 2011 HP announced that it was going to spin off (i.e. exit) its personal-computer business. The announcement came just as HP dumped the recently acquired Palm (and its WebOS mobile platform introduced in january 2009) and acquired British database-application company Autonomy. Autonomy and EDS represented the future of HP, not personal digital assistants and personal computers.

In 2010 a new player emerged in the battle for the mobile operating system, and, while it was not based in the Bay Area, it relied on technology developed by a Bay Area startup. Long ridiculed for its lack of design style and for its long list of flops (the Bob desktop metaphor, the Spot wristwatch, the Tablet PC, the Ultimate TV, the Zune music player, the Kin smartphone), Microsoft (partnering with Nokia) finally got it right with the mobile operating system Windows Phone: it was derived from the Danger technology that it had acquired from the same man who sold Android to Google, i.e. Andy Rubin.

Oracle, instead, still aiming mainly at the corporate world, acquired middleware expert BEA in 2008 and decided to enter the hardware market by purchasing a struggling SUN in 2009. In 2009 Cisco partnered with Massachusetts-based EMC, the largest provider of networked storage in the world, to found Acadia and convert the old data centers of the corporate world to cloud computing. In the same year Cisco introduced its first line of servers, the Unified Computing System, competing directly with HP and IBM while upping the ante by integrating VMware's virtualization software (which now made it possible to move applications from one server to another at the click of a mouse). At the beginning of 2010 Cisco posted its best quarter ever, with sales of $10.4 billion. Intel was largely immune to the turmoil: in 2010 it posted record sales again. In march 2009 AMD spun off its manufacturing unit to create GlobalFoundries. Within a year it became the world's second largest silicon-chip foundry after Taiwan's TSMC.

British chip designer ARM still dominated cellular-phone applications, which required all-day battery life. Third-generation cellular phones were being integrated with video, music and gaming, and this triggered a boom in chips for mobile devices. Ironically, both Intel and AMD had exited that market in june 2006: Intel had sold its ARM-based technology to Marvell, and AMD had sold its MIPS-based technology to Raza Microelectronics. In 2010 Intel reentered the market with both an in-house project (Moorestown) and the acquisition of the wireless unit of Germany-based Infineon. ARM chips were already powering devices such as Sony television sets, the Amazon e-reader Kindle, and hotel keycard locks, and analysts foresaw a future in which thousands of small devices would need ARM chips to interact among themselves and to retrieve information from the Internet. In 2009 ARM was a $0.5 billion corporation versus Intel's $35 billion, but the trend was towards ARM's small low-power chips. Intel took action in november 2010: for the first time it agreed to manufacture someone else's chips (those of San Jose-based fab-less Achronix, a competitor of Xilinx and Altera in the high-end FPGA market), entering for the first time a market traditionally dominated by Taiwanese and Chinese factories. This was unlikely to be due to a desire to enter the brutal business of contract manufacturing and more likely reflected a desire to learn new markets. And, when in september 2011 Microsoft announced that the Windows operating system would for the first time also run on the ARM platform, Intel signed a deal with Google to use Intel chips in Android phones. Intel had been incapable of adapting to ARM's licensing business model.

The boom of smartphones also benefited makers of Wi-Fi chips, such as Atheros and Marvell: 56 million Wi-Fi-enabled cellular phones shipped in 2008 (a 52% increase over the previous year). More than 54 million smartphones were sold in the first quarter of 2010, an increase of 56% over the previous year.

It was not unusual for Silicon Valley to be late in catching the train, but this was embarrassing for a region named after silicon: all the big providers of chips for smartphones (San Diego-based Qualcomm, Texas Instruments, Korea-based Samsung, Irvine-based Broadcom) except for Marvell were based outside the Bay Area. Even the iPhone used a chip manufactured by Samsung.

Ditto for all the providers of wireless telecommunications networks: Verizon (started in 1983 as Bell Atlantic) was based in Philadelphia, AT&T was based in New Jersey, Sprint was based in Kansas (although its origins were in the Bay Area), T-Mobile was the subsidiary of Deutsche Telekom, and Clearwire was founded in 2003 in Seattle.

Beyond Moore's Law

A bigger problem lay ahead for the semiconductor industry in general. In the final analysis, much of computer technology's accelerating progress was due to its fundamental component: the transistor. Transistors were expensive in the 1960s. In 1982 they were already cheap enough that $1 could buy a few thousand of them. Twenty years later, in 2002, the same dollar bill could buy more than two million of them. Thirty years later, in 2013, the same dollar bill could buy 20 million transistors. The semiconductor industry was very aware that Moore's law was rapidly coming to an end. As transistors get smaller, they get harder to make. For the first time in 50 years there was a real chance that the price of transistors would stall or even increase. The industry began experimenting with new materials, such as carbon nanotubes and graphene, and even with quantum computing.
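
The implied price per transistor makes the trend, and its recent flattening, concrete. A back-of-the-envelope calculation using only the figures above (3,000 stands in for "a few thousand" in 1982):

```python
# Rough transistors-per-dollar figures quoted above.
transistors_per_dollar = {1982: 3_000, 2002: 2_000_000, 2013: 20_000_000}

for year, n in transistors_per_dollar.items():
    print(f"{year}: ~{1.0 / n:.2e} dollars per transistor")

# Improvement factors between the data points:
print("1982->2002:", 2_000_000 / 3_000, "x cheaper over 20 years")
print("2002->2013:", 20_000_000 / 2_000_000, "x cheaper over 11 years")
# The second factor (10x in 11 years) is far smaller than the ~667x of the
# previous 20 years: per-transistor cost was no longer falling at the
# historical Moore's-law pace.
```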

Social Games

The audience for pioneering virtual worlds such as Gaia Online and Habbo had now passed five million monthly active users. The next front was the social network. Launched in may 2008 and only accessible as an application on Facebook, the virtual world YoVille passed five million monthly active users in march 2009. By allowing people to create "second-life" avatars and interact with each other, YoVille de facto created a new concept: a virtual social network within the real social network. The success of YoVille spawned a generation of browser-based "social games" running on Facebook. YoVille itself was acquired in july 2008 by Zynga, founded in july 2007 in San Francisco by serial entrepreneur Mark Pincus. In june 2009 Zynga released FarmVille, a shameless clone of the Facebook-based social game Farm Town that had been introduced by Florida-based Slashkey a few months earlier. By 2010 FarmVille had become the most popular Facebook game, boasting more than 50 million users. Zynga also introduced Mafia Wars (2009), a shameless copy of another pioneering social game, David Maestri's Mob Wars, which had debuted on Facebook in january 2008 (Maestri was still an employee at Freewebs). Six months after the release of Playfish's Restaurant City from Britain, Zynga released Cafe World. Soon competitors started retaliating by copying Zynga's games too. While Zynga kept churning out clones of popular games, the founder of Freewebs (now renamed Webs), Shervin Pishevar, opened Social Gaming Network (SGN) in january 2008 in Palo Alto to develop original social games for the Facebook platform. Whatever the strategy, the Bay Area was again becoming the epicenter of videogame evolution.

By 2010 Apple was promoting the iPhone itself as a gaming platform, further eroding Nintendo's console-based empire.

Meanwhile, Activision continued setting one record after another, selling 4.7 million copies of "Call of Duty - Modern Warfare 2" on its first day in 2009 and 5.6 million copies of "Call of Duty - Black Ops" on its first day in 2010. In december 2010 Activision Blizzard's "World of Warcraft - Cataclysm" sold 3.3 million copies in the first 24 hours. Note that the "Call of Duty" games were developed by Infinity Ward in southern California (Encino) using the id Tech 3 engine developed in Texas by PC game pioneer John Carmack and others at Id Software. The technological breakthroughs were coming from somewhere else.

Videogames had become a huge industry, a phenomenal economic success. However, the state of the videogame craft was not all that advanced if one considered that videogames still relied on the original idea: create something that stimulates interest via the two primal impulses (kill and make money, given that sex was off limits) and make it as addictive as possible so that people would come back for the sequel. Zynga's "FarmVille" was addictive because it exploited the impulse to get rich, and hits like "Call of Duty" were addictive because they exploited the impulse to kill. Not exactly high art. In august 2008 San Francisco-based game designer Jonathan Blow released "Braid", a videogame for Microsoft's Xbox. The concept was completely different. Blow focused on his own aesthetic values, on sophisticated scripting and subtle psychology. Blow conceived the videogame as an art (not "art" as in "craft" but "art" as in "Michelangelo" and "Van Gogh"). He then invested the profits into producing an even more "artistic" game, "The Witness", introduced in march 2012.

The very notion of how to play a videogame was changing. OnLive was launched in march 2009 in Palo Alto by Apple alumnus and WebTV founder Steve Perlman after seven years of work in stealth mode. OnLive, which went live in june 2010, was out to kill the videogame console by providing videogames on demand to any computer: the videogames were hosted on the "cloud" of the Internet instead of requiring the user to purchase them on physical media. This was advertised as "5G wireless": 2G (1992) marked the transition from analog to digital voice; 3G (2001) enabled Internet browsing; 4G provided basic video streaming; and 5G was meant to provide high-definition video streaming. OnLive didn't succeed, but Perlman continued the experiment within his incubator Rearden, eventually spawning Artemis in 2014, just when mobile video began to account for more than 50% of all mobile data. The big service providers were trying to increase capacity by adding more antennas (i.e. by shrinking the size of the wireless cells), but there was a physical limit due to interference, and Perlman was instead focusing on exploiting that very interference.

In 2010 the first blockbuster of iPhone-based videogaming took off: Angry Birds, developed by Finnish game developer Rovio Mobile, founded in 2003 by three students from Helsinki University of Technology (Niklas Hed, Jarno Vakevainen and Kim Dikert). Introduced in December 2009, it sold over 12 million copies in just one year.

User Experience

As 3D motion-control and motion-sensing products started to become affordable and therefore popular (e.g., Microsoft's Kinect, a motion-sensing input device introduced in 2010 for the Xbox videogame console, based on technology originally developed by PrimeSense in Israel), the idea was copied in the Bay Area by a new generation of startups, notably San Francisco-based Leap Motion, founded in 2010 as OcuSpec by David Holz and Michael Buckwald. Leap Motion introduced motion-controlled input devices that, unlike Microsoft's ruling Kinect, reacted to natural hand gestures to manipulate objects on the screen. In 2013 Hewlett-Packard was the first major manufacturer to embed that user interface in its computers. Meanwhile, Apple acquired PrimeSense, the company that had developed the Kinect's sensing technology, and Google acquired gesture-recognition startup Flutter, also founded in San Francisco in 2010 (by Navneet Dalal and Mehul Nariyawala).

The most significant improvement in the human-computer interface of the time (as far as smartphones and tablets were concerned) was the Swype keyboard, originally developed for the Windows Mobile operating system by Seattle's Swype, founded in 2003 by Cliff Kushler (co-inventor with Dale Grover and Martin King of the T9 predictive-text system used on cell phones of the 2000s) and by Randy Marsden (developer of the Windows Mobile touchscreen keyboard). Swype was capable of guessing the intended word when the user slid a finger on a touchscreen from the first letter to the last letter of the word. First introduced by Samsung in late 2009 and acquired by Nuance in 2011, Swype spread like wildfire on Android devices. In 2010 SwiftKey, started in London in 2008 by Jon Reynolds and Ben Medlock, debuted a system capable of predicting the next word the user intended to type. These were all part of a trend to improve the speed of typing on ridiculously small devices. As the keyboards got smaller and smaller, the speed of the skilled typist had been lost. People were writing more but much more slowly than the older generations. This opened opportunities for a deluge of startups offering systems to speed up typing.
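
Next-word prediction of the kind SwiftKey popularized can be approximated, in its crudest form, by word-pair frequencies learned from text (a toy sketch, not SwiftKey's actual algorithm, trained here on a made-up corpus):

```python
from collections import Counter, defaultdict

def train_bigrams(corpus):
    """Count how often each word follows another in the training text."""
    following = defaultdict(Counter)
    words = corpus.lower().split()
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict_next(following, word, k=3):
    """Return the k most likely next words after `word`."""
    return [w for w, _ in following[word.lower()].most_common(k)]

# A tiny made-up corpus standing in for the user's typing history.
corpus = "see you soon . see you tomorrow . see you at the office . call you tomorrow"
model = train_bigrams(corpus)
print(predict_next(model, "you"))   # ['tomorrow', 'soon', 'at'] (most frequent first)
```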

In 2010 Steve Jobs of Apple largely decreed the demise of Adobe's multimedia platform Flash when he hailed HTML5 as the standard of the future for video on the Web (and one year later Adobe itself adopted HTML5 instead of Flash on mobile devices). HTML5 had been formally introduced in 2008 by the Web Hypertext Application Technology Working Group (WHATWG), a group formed by Apple, Mozilla and Opera Software (a Norwegian maker of Web browsers since 1997).

Improvements in "user experience" had a split-personality kind of effect. On one hand they were truly improving the life of the user, while on the other hand they were turning that improved experience into an increasingly lucrative opportunity for the business world. The proliferation of sensors, location-based applications and social media (continuously capturing information about the user) was leading to increasingly more sophisticated, personalized (and intrusive) personal assistants, capable of "anticipating" the user's behavior and the user's needs.

Cyber-currency

In 2009 a person hiding behind the moniker Satoshi Nakamoto introduced the digital currency Bitcoin. This became the first successful currency not issued by a government. From an engineering point of view, its main achievement was that digital tokens, which by nature can be copied at will, could no longer be duplicated or spent twice, and this without relying on any central authority. Bitcoin seemed to prove again the theorem that major progress in peer-to-peer technology only came from independents. The previous proof had come from Napster.
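
At the heart of that trick is "proof of work": each block of transactions must be hashed together with a nonce until the hash falls below a difficulty target, which makes rewriting the transaction history prohibitively expensive. A highly simplified sketch of the idea (real Bitcoin block headers and difficulty encoding are more involved):

```python
import hashlib

def proof_of_work(block_data: str, difficulty: int = 4):
    """Find a nonce such that the double-SHA256 hash of the block
    starts with `difficulty` zero hex digits (a simplified target)."""
    nonce = 0
    target_prefix = "0" * difficulty
    while True:
        payload = f"{block_data}|{nonce}".encode()
        digest = hashlib.sha256(hashlib.sha256(payload).digest()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest
        nonce += 1

# Example: "mine" a toy block. Raising `difficulty` by one multiplies
# the expected work by 16, which is what makes tampering costly.
nonce, digest = proof_of_work("alice pays bob 1 BTC; prev_hash=00ab...")
print(nonce, digest)
```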

The Empire

Looking at the numbers, Silicon Valley had never been healthier than it was in 2008, before the big worldwide financial crisis struck: it had 261 public companies and countless start-ups. eBay was selling $60 billion worth of goods in 2007, a figure higher than the GDP of 120 countries of the world. In 2007 venture capitalists invested $7.6 billion in Silicon Valley and an additional $2.5 billion in the rest of the Bay Area. The Bay Area boasted the world's highest concentration of venture capitalists. Silicon Valley's share of venture-capital investment in the USA reached 37.5% at the end of 2009 (compared, for example, with New York's 9.2%). Silicon Valley had 2.4 million people (less than 1% of the USA's population) generating more than 2% of the USA's GDP, with a GDP per person of $83,000. The rest of the Bay was equally stunning: by 2009 the Lawrence Berkeley Labs alone boasted 11 Nobel Prize winners, more than India or China, and U.C. Berkeley boasted 20, so the tiny town of Berkeley alone had 31, more than any country in the world except the USA, Britain, Germany and France. Add Stanford's 16 and UCSF's 3 for a grand total of 50 in a region of about 19,000 square kilometers, smaller than Belize or Slovenia. In 2006 the Bay Area received three Nobel prizes out of nine: one each to Stanford, Berkeley and UC San Francisco. In the last 20 years the winners included: Richard Taylor of Stanford University (1990), Martin Perl of Stanford University (1995), Douglas Osheroff of Stanford University (1996), Steven Chu of Stanford University (1997), Robert Laughlin of Stanford University (1998), and George Smoot of the Lawrence Berkeley Labs (2006) for Physics; William Sharpe of Stanford University (1990), Gary Becker of Stanford University (1992), John Harsanyi of UC Berkeley (1994), Myron Scholes of Stanford University (1997), Daniel McFadden of UC Berkeley (2000), Joseph Stiglitz of Stanford University (2001), George Akerlof of UC Berkeley (2001), and Oliver Williamson of UC Berkeley (2009) for Economics; Sydney Brenner (2002), the founder of the Molecular Sciences Institute in Berkeley, Andrew Fire of Stanford University (2006), and Elizabeth Blackburn of UC San Francisco (2009) for Medicine; Roger Kornberg of Stanford University (2006) for Chemistry. Another Nobel prize in Physics came in 2011: Saul Perlmutter of Lawrence Berkeley Lab. And then Brian Kobilka of Stanford in Chemistry in 2012. And then in 2013 Michael Levitt for Chemistry (Stanford), Randy Schekman for Medicine (Berkeley) and Thomas Sudhof for Medicine (Stanford).

The Bay Area was more than a region: it was an economic empire. The number of jobs that had been outsourced to countries like India probably exceeded the number of jobs created in Silicon Valley itself. And the relationship with the rest of the world was as deep as it could be: a study by Duke University found that 52.4% of Silicon Valley's high-tech companies launched between 1995 and 2005 had been founded by at least one immigrant. This phenomenon was probably a first in history. At the same time investors flocked from all over the world. For example, in 2008 Taiwanese conglomerate Quanta invested in two Silicon Valley "fab-less" start-ups: Tilera (founded in october 2004 by Anant Agarwal) and Canesta (founded in april 1999 by Cyrus Bamji, Abbas Rafii, and Nazim Kareemi).

India's Silicon Valley

The history of India's involvement with computers started with the five Indian Institutes of Technology (IITs), established in Kharagpur (1950), Mumbai (1958), Chennai (1959), Kanpur (1959) and Delhi (1961), and supported by five different countries from both the democratic and communist worlds (Kanpur was the USA-affiliated one). Meanwhile, the Indian families that could afford it were sending their children to graduate in Britain and in the USA. The Indian-born historian Peter James Marshall at Cambridge encouraged Indian students to become entrepreneurs rather than academics, because he saw that as the most pressing need for post-independence India. The main destination in the USA was MIT, and the Indian graduates from MIT mostly remained in the USA, but they created a network that provided the first link between the high-tech industries of the USA and India. In 1972 MIT graduate Narendra Patni founded the software enterprise Data Conversion (later renamed Patni Computer Systems) with offices both in Boston (USA) and Pune (India). In 1981 some of Patni's engineers (mostly from MIT too) founded Infosys in Pune. Meanwhile, the government had decided to install high-speed satellite communications in Bangalore, the site of the Indian Institute of Science (founded in 1909 thanks to steel-industry magnate Jamsetji Tata). Because of that decision, in 1985 Texas Instruments opened a software laboratory in Bangalore. Within a few years Infosys and Wipro too moved their operations to Bangalore. In 1986 Patni obtained the first contract from a USA company (Data General) for outsourcing software development to India. Meanwhile, in 1991 India liberalized its protectionist economy, which had long been influenced by anti-capitalist precepts. During the 1990s Infosys experienced double-digit growth rates. By 2010 it would become one of the largest software companies in the world, with over 100,000 employees. However, Infosys well represented the problem, not the solution: by 2010 this industrial colossus had been granted only twelve patents in the USA. Microsoft had fewer employees, but had passed the 5,000-patent mark already in 2006.

In general, graduates from the IITs lacked the counterculture spirit that was pervasive in the Bay Area: the passion for nonconformist behavior (and ideas), the curiosity about nontechnical topics (literature, philosophy, visual arts, history, etc), the push to innovate rather than just execute orders. Many graduates of the IITs did extremely well around the world (and notably in Silicon Valley), but many more became simple hired hands in the software industry, and the vast majority held humble (if well-paid) jobs back in India, contributing little in terms of imagination. Most of them were trained to be not intellectuals but skilled craftsmen; not the successors to the great philosophers of India but the successors to their grandparents' world of religious sculptors and shoemakers.

The Asian Miracle

East Asia is a more complex story.

Asian economies managed to progress from starvation in the 1960s to the top tier of development and wealth. In 1981 East Asia still had the highest poverty rate in the world, higher than Africa's. In 2011 two of the top three economies in the world were East Asian, and East Asia might soon have three of the top four, including the number one.

There is no single cultural/political explanation for the Asian miracle because the countries of Asia spanned a broad spectrum of philosophies (Confucian, Shinto, Buddhist, Hindu, Muslim) and of political systems: fascism (South Korea, Singapore, Taiwan, Thailand), communism (mainland China), socialism (India), democracy (Japan, Hong Kong).

There were actually two Asian miracles. The first Asian miracle was low-tech and largely based on cheap labor and loose business regulations. It relied on a simple business plan: do what the West does but do it cheaper and possibly better. This became an art in itself. It also caused the rapid Westernization of all US allies in Asia (which until World War II had largely held on to their traditions). The second Asian miracle was high-tech and based on improving Western high technology. By this time Asia had become very Westernized and, in particular, democracy had become widespread.

The first Asian miracle was largely the consequence of the Cold War between the USA and the Soviet Union. The USA helped the Asian (and Western European) economies develop in order to create stable and strong defenses against the communist world. The USA encouraged trade with its overseas allies, and tolerated that they competed unfairly with US-based manufacturers by keeping their currencies very low.

Government intervention was crucial to the development of these economies: they all boomed under the supervision of a benevolent dictatorship that nurtured and guided the private sector.

Most of the political leaders of the first boom came not from the aristocracy but from the very poor class. These leaders were interested in copying Western industrial methods but ignored Western economic theory: they used common sense and traditional values. They nonetheless became enlightened leaders who somehow made mostly good choices, turning their starving countries into economic tigers and greatly improving the economies of their respective countries.

All the early boomers lacked natural resources. In order to pay for them, they had to focus on exporting. The big open market was the USA. Therefore it was natural that they focused on exporting to the USA. The USA also provided the military protection that helped these countries focus only on the economy. Security was guaranteed and paid for by the USA. What these early boomers had in common was not a cultural or political background but that they were all allies of the USA during the Cold War.

The first "Asian tigers" were Japan, South Korea, Taiwan, Singapore and Hong Kong. They embodied different philosophies (Shinto, Buddhist, Confucian) and came from different histories: Japan was already westernized, Taiwan partially (as a Japanese colony), Korea never was, Hong Kong and Singapore were British colonies The role of the state varied from ubiquitous (Japan) to totalitarian (Singapore, South Korea, Taiwan) to indifferent (Hong Kong). The governments were run by people who were mostly of humble origins and uneducated. A widespread figure was the "benevolent dictator", whose main goal was not his own wealth or a dynasty but the good of the country. East Asian countries lacked natural resources. To pay for imports, they exported cheap goods to the USA. They developed a knack for understanding Western consumer behavior and social trends. The USA de facto paid for their development . The geopolitics of the Cold War (need to contain the Soviet Union) de facto drove the early Asian boom. What they had in common is not cultural or political background but that they were allies of the USA during the Cold War. In these countries technological progress remained largely in the hands of large national conglomerates. Singapore, however, established a unique model: it favored foreign investment by multinationals over nurturing national conglomerates. They ignored Western economic theories and used common sense and traditional values. They also ignored social and economies ideologies of the Third World. Centralized economies enabled long-term planning and short-term decisions. The government rewarded with favors the companies that achieved world-class results. Asian people demanded economic growth before democratic growth. These were not "nice" places to live in: they were overcrowded and poor for a long time. Nor did they benefit from high education: relatively few people were able to obtain a computer-science degree (or a science degree in general) unless they were able to pay for an education abroad.

The second Asian miracle was largely the consequence of the end of the Cold War. The collapse of the Soviet Union and of its communist bloc turned the whole world into one colossal capitalist market. The Asian countries applied their learned skills on a larger scale to a larger market. Globalization favored Asian products over the more expensive products from Western Europe and the USA. However, by now Asia had learned sophisticated industrial and business methods, and was not restricted to cheap goods. Its factories had developed into high-tech factories by serving US manufacturers, but the know-how could now be recycled internally to create factories that served Asian manufacturers. Therefore Asia found itself capable of competing with the West in the much more lucrative high-tech field.

This second Asian miracle introduced two colossal players. Mainland China simply followed the path of Western imitation that had been tested by the old "tigers". India, however, could also sell a widespread knowledge of the English language and Western-style universities that graduated scores of mathematically-inclined students, two factors which created an opportunity for the service industry. Again, the natural customer was the USA, in this case its booming computer and Internet industry. The USA, again, provided the military protection that helped these economies grow in a world largely at peace. East of Pakistan, Asia was relatively at peace: the real trouble was in the Middle East and Central Asia, i.e. in the Islamic world. The USA and its Western allies of NATO were busy maintaining some kind of order there, which also provided for safe trade routes. Even mainland China, which quickly became the USA's main rival on the world stage, relied on trade routes from Africa and the Middle East that were guarded by the mighty military of the USA.

The second generation of "Asian tigers" (in the 1990s) included China, India, Thailand and Malaysia. Again, they came from different philosophies: Confucian, Hindu, Buddhist, Muslim. This time the role of the state changed, in that liberal reforms reduced its power (especially in China and India). Their governments were mostly run by well-educated technocrats.

Of the two Asian colossi, China probably had the better infrastructure. India had the disadvantage of poor transportation and electricity infrastructure, which, combined with restrictive labor laws, discouraged the kind of labor-intensive sectors that were booming in its eastern neighbors. Like Taiwan, Singapore, South Korea, etc, China capitalized on cheap labor to capture outsourced jobs from the USA, while India capitalized on English-speaking, college-educated and cheap engineers to capture outsourced jobs from the USA. The USA lost blue-collar (low-paying, low-skill) jobs to East Asia and ended up losing white-collar (high-paying, high-skill) jobs to India.

A popular joke was that in China the conglomerates succeeded because of the government, whereas in India the conglomerates succeeded despite the government. Indian conglomerates had to aim for self-sufficiency: Jindal produced steel and power through backward integration from its own captive coal and iron-ore mines, and Gautam Adani bought ports and coal mines, and built a power plant and a railway. Unlike China, with its labor-intensive exports, India became an exporter of capital-intensive goods that required skilled workers.

Asia's High-Tech Industry

In the second half of the 20th century the only country that competed with the USA in terms of mass-scale innovation that changed the daily lives of billions of people was Japan: the transistor radio (1954), the quartz wristwatch (1967), the pocket calculator (1970), the color photocopier (1973), the portable music player (1979), the compact disc (1982), the camcorder (1982), the digital synthesizer (1983), the third-generation videogame console (1983), the digital camera (1988), the plasma TV set (1992), the DVD player (1996), the hybrid car (1997), mobile access to the Internet (1999), the Blu-ray disc (2003), the laser television set (2008). However, Japanese innovation mostly came from conglomerates that were very old: Mitsubishi (1870), Seiko (1881), Yamaha (1887), Nintendo (1889), Fujitsu (1934), Canon (1937), Toyota (1937), Sony (1946), NTT (1952), etc.

With the exception of the media hub in Tokyo (mostly devoted to pop culture), there was no major industrial cluster in Japan that could compare with Silicon Valley. Japan's "high-tech" regions were the traditional industrial hubs that grew around big companies, like Aichi, the location of Toyota's main plant and of many Toyota suppliers, or Hiroshima, Sendai and Yonezawa. Later a sort of Silicon Valley emerged in Fukuoka, in the far south of Japan (Kyushu island), thanks to a cluster of universities (Kyushu University, Kyushu Institute of Technology, Kitakyushu University, the Institute of Systems & Information Technologies) and mostly serving the semiconductor industry.

South Korea followed a similar path to high-tech innovation. Like Japan, it relied mainly on well-established companies such as Samsung (1938). Like Japan, its environment was hostile to foreign companies. Like Japan, there was little interaction between universities and industry, with professors more similar to government bureaucrats than to incubators of ideas or coaches of entrepreneurs.

However, South Korea also had two regions that looked a bit like Silicon Valley. In 1973 South Korea established the Korea Advanced Institute of Science and Technology at Daedeok, south of Seoul. This area began to attract R&D labs and eventually to spin off startups. It therefore came to be known as "Daedeok Valley" and officially as Daedeok Science Town. The software industry, instead, assembled around Teheran Road in Seoul (between Gangnam Station and Samseong Station), nicknamed "Teheran Valley", a three-km stretch that probably got the majority of South Korean venture-capital investment during the dotcom boom.

In terms of process, however, Taiwan was the real story. In 1973 Taiwan established the Industrial Technological Research Institute (ITRI) to develop technologies that could be turned into goods for foreign markets, an institute that would spawn dozens of semiconductor firms (starting with UMC in 1980 and TSMC in 1987), and then the pioneer of Asian state-founded high-tech parks, the Hsinchu Science-based Industrial Park, established in 1980 about 88 km from Taipei near four universities. In 20 years more than 100 companies were formed by graduates of those universities. Taiwan was also the place where venture capital took off in Asia, and the first place to implement the feedback loop between university and corporate R&D typical of Boston and Silicon Valley. TSMC launched the independent silicon-chip foundry model, which turned Taiwan into Silicon Valley's main destination for outsourcing chip manufacturing, and which in turn helped create a vibrant chip-design industry with companies such as MediaTek (1997) and NovaTek (1997).

Neither Japan nor Korea nor Taiwan attracted highly educated immigrants from elsewhere like Silicon Valley did. The only Asian country to do so was Singapore. It is not a coincidence that Singapore was also the country that invested the most in attracting foreign businesses (and their know-how). Singapore too developed an advanced venture-capital industry and fostered interaction between its universities and the private industry.

None of these countries created a vibrant software industry like Silicon Valley's.

China was a latecomer and capitalized on its neighbors' many experiments, avoiding Japan's model because it simply did not have the kind of old established manufacturing and financial giants that Japan had. The idea of a "Chinese Silicon Valley" came to nuclear scientist Chen Chunxian of the Academy of Sciences after a 1979 trip to California. Back home, in 1980 he tried unsuccessfully to start a privately-funded Advanced Technology Service Association just outside Beijing. However, the Academy took up his idea and began to play the role of incubator for high-tech startups, one of which would become Lenovo, most of them based along the ten-km Zhongguancun Street. The idea appealed to both staff and students of the two leading universities of Beijing, Peking University and Tsinghua University, who needed to supplement their tiny government salaries and subsidies. Finally, in 1988 the government gave its blessing and the whole region came to be called Zhongguancun Science Park. In little more than a decade massive government incentives created thousands of new companies, the vast majority in software and telecommunications.

In 2008 the USA entered the Great Recession while Asia was still booming. Nonetheless, in 2010 Asians overtook Hispanics as the largest group of immigrants to the USA. That is Asia's single biggest failure: people were still leaving the continent by the hundreds of thousands even at a time when Asia looked like a land of opportunities and the USA was widely considered on the way out as a world power. One had to wonder what it would take to reverse the tide if the biggest economic crisis in 70 years didn't do it.

The Importance of Process and of Evolution

The most important event in the modern history of Asian industry, however, may predate all of this. In 1953 Taiichi Ohno invented "lean manufacturing" (or "just-in-time" manufacturing) at Japan's Toyota. That was one of the factors that turned Toyota into a giant success story (and eventually into the largest car manufacturer in the world). More importantly, it created the mindset that the process is often more important than the product. When (in 1987) ITRI's president Morris Chang launched Taiwan's Semiconductor Manufacturing Company (TSMC), the first independent silicon-chip foundry in the world, to serve the "fabless" companies of the USA, he simply applied that mindset to the computer industry and realized that design could be decoupled from manufacturing. That does not mean that only the design is creative: the design would not lead to a mass-market product (and possibly to no product at all) without a highly efficient manufacturing process that lowers the cost and improves quality.

From the point of view of the Asian suppliers, what was really revolutionary (and not just evolutionary) was the process, not the product. The fabless process and the offshore customer-service process were the real breakthroughs, not the new laptop model or the new operating system. Without significant progress in Asia in the industrial process, many great Silicon Valley product success stories might never have happened at all. The world viewed from Silicon Valley was a world centered on the place where a product was designed and marketed. The world viewed from Japan, Taiwan and Korea was a world centered on the industrial centers that could manufacture products based on whatever design, faster, cheaper and better. The world viewed from India was a world centered on the army of software engineers that could deliver software based on whatever needs, faster, cheaper and better. From the Asian viewpoint, it was the increasing evolutionary efficiency of the "builders" that allowed Silicon Valley to create "revolutionary" products and new markets. For example, despite being a relatively small economy and boasting virtually no invention that ordinary people can name, Taiwan rapidly became the fourth country in the world for patent filings (after the USA, Japan and Germany); and Japan rapidly became the country with the highest percentage of GDP spent on R&D.

That "evolution" stemmed from a mindset driven by incremental progress, the equivalent of new releases of an operating system, each one enabling a whole new range of features (and prices) for the next product to capture the headlines.

From the viewpoint of Asia, ideas for "revolutionary" products are actually easy. What is difficult is "making" those products, not imagining them.

Furthermore, it is debatable which region and system has yielded more "revolutionary" products. The argument that small companies and Silicon Valley rule is an opinion, not a fact. The perception is that in Japan innovation was driven only by big companies. That perception is true. What is not true is the perception that the source of innovation was significantly different in the USA. It is easy to forget that Silicon Valley has invented very little: most of the things that changed the high-tech world (from the transistor to the disk drive) were invented by big companies (like AT&T and IBM), and many of the really revolutionary ones (the computer, the Internet, the World Wide Web) were invented in government labs. Blaming Japan for relying too much on big companies and big government means not knowing what Silicon Valley actually does, which is not to invent the computer, nor the Internet, nor the transistor, nor the smartphone. In fact, the rigid bureaucratic big-company system of Japan has invented a lot more than Silicon Valley. And it has arguably created more wealth for ordinary people, turning a poor country into one of the richest countries in the world.

The Western Contribution to the Success of Asia's High-tech Industry

Part of the success of Asia was obviously made in Silicon Valley: the fiber-optics boom of the 1990s helped collapse the cost of a leased telecom line between California and Bangalore, and then the Voice over IP technology of the early 2000s made that cost irrelevant. Without this simple improvement in the cost of communications, India's software industry would not have picked up the way it did. The dotcom boom introduced dozens of online services that eliminated the distance factor and therefore helped remote firms compete with local firms. Even the passion for industry standards that has always been a peculiarity of the USA ended up favoring those abroad who wanted to create critical mass for their outsourcing services.

The government of the USA had indirectly been proactive in creating opportunities for Asian firms. When Bell Labs invented the transistor, it was under government pressure that its owner AT&T decided to license transistor technology at a low price to anybody who wanted it, including the company that later became Sony. It was under government pressure that IBM decided to "unbundle" the software applications from its mainframe computers, thereby spawning a software boom worldwide. It was the government that created and funded the Arpanet, which went on to become (as the Internet) the backbone of the global outsourcing movement.

The biggest gift of the West to Asia was perhaps the revolution in management that had started in earnest in the 1920s but got a phenomenal boost in the 1990s with the collapse of the communist world and the boom of free trade. The USA boasted the most advanced class of business administrators in the world in terms of maximizing profits (at least short-term), and their discipline almost inevitably led them to outsource both manufacturing and services. One of the tools that they used was also a Western invention: ERP software to manage complex, sophisticated, distributed supply chains.

And, of course, it was the USA and its allies that had taught Asia to speak English: India, Hong Kong and Singapore had been British colonies, and Taiwan, South Korea and Japan had been militarily occupied by the USA in one fashion or another. The language turned out to be an important factor in leapfrogging ahead of the Europeans.

Another, less obvious, reason why it was so easy for Asian companies to do business with Silicon Valley is that Silicon Valley (and the high-tech industry in general) was in reality less innovative than it appeared to consumers: both software and hardware (consumer) products tended to follow a predictable evolutionary path that suppliers could easily anticipate with the required building blocks and services.

Of course, we don't mean that Asia was just a lucky beneficiary of some Western policies. These various forms of indirect "help" from the West would not have worked without native factors that were already in place. In brief, those successful regions of Asia had traditionally valued higher education (now also and especially in engineering and science), enjoyed enlightened state planning (that, for example, placed emphasis on electronics while Europe was still focused on cars and appliances), boasted a world-class work ethic (that guaranteed quality and reliability) and were largely non-ideological (they didn't view capitalism as evil, like large segments of the European public, and did not harbor religious or ethnic hostilities towards Westerners).

The Missing Ingredients in Asia

However, nothing like Silicon Valley has emerged in East Asia. One simple reason is that it is not only startup founders who have to be visionary in order to achieve technological breakthroughs: venture capitalists have to be visionary too. In Singapore, for example, they are government bureaucrats. Asia has a tradition of keeping the money in the hands of trusted (and prudent) bureaucrats. The USA has a tradition of letting the (reckless) successful self-made man be the one who funds the next big thing.

However, that cannot be the whole story. For example, Japan never created a vibrant software industry. But it would be unfair to blame it only on the economic and political system: very few startups in Silicon Valley were founded by Japanese immigrants, despite the fact that the number of Japanese engineers working in Silicon Valley has always been significant. Compare with the incredible percentage of Indian and Chinese immigrants who started companies. And this despite the fact that Japanese education is routinely ranked higher than Indian and Chinese education.

It might look surprising that a country like Singapore, which appears to be a lot more "modern" than the San Francisco Bay Area, contributed less to technological innovation than Silicon Valley. Singapore is a model of urban design and management. Silicon Valley is decades (if not a century) behind Singapore, meaning that it would probably take a century for Silicon Valley to catch up with Singapore's infrastructure projects, not to mention to match Singapore's architectural wonders (compared with Silicon Valley's none). It's not only the subway and the quality of the roads that are superior in Singapore: Singaporean citizens are way more "high-tech" than their peers in Silicon Valley. Singaporeans had cell phones when they were still a rarity in Silicon Valley. Silicon Valley still has to match the Internet speed that Singaporeans have been enjoying for years. People in Silicon Valley got their first taste of mobile payment a decade after it became commonplace in Singapore. However, hard as it may have tried, Singapore produced no Apple, no Oracle, no Google and no Facebook. The reason might be the very concept of what high-tech is. In Silicon Valley people need just a cubicle to work (and a car to get there, given the third-world public transportation). They are perfectly happy living in a culturally poor place of ugly buildings and restaurant chains. In Singapore people expect a town that is comfortable, a nice place to live in. High-tech is a means to an end, just like concrete and plastic. It is part of the urban fabric. It is one of the elements that contribute to making Singapore a model of urban design and management. In Silicon Valley people are willing and even excited to be worked to death like slaves for the privilege of being part of the high-tech world that designs the tools of tomorrow (and for a tiny chance of becoming the next billionaire). In Singapore there is nothing particularly prestigious about working in high tech: the prestige is in using it to do something that is relevant to society. Silicon Valley assumes that one can change the world just by releasing a new device or a new website that will spread virally. Singapore assumes that one can change the world by adopting whichever are the best available tools at the moment, and it is largely irrelevant who invented them; just like it's irrelevant who makes the blue jeans or the shoes that became popular after independence.

A timeline of Asia's high-tech industry


1949: Japan's MITI is established as an architect of industrial policy
1954: Sony's transistor radio
1954: Fujitsu enters the computer market
1958: The USA forces Taiwan's regime to abandon plans to recover the mainland and focus on economic development
1959: Hitachi builds its first transistor computer
1961: Shigeru Sahashi becomes director of the Enterprises Bureau at Japan's MITI
1964: South Korea launches plan to improve exports
1965: Chinese physicist Li Kuo-ting becomes Minister of Economy in Taiwan ("the father of Taiwan's economic miracle")
Dec 1966: Taiwan's Ministry of Economy establishes an "Export Processing Zone" (EPZ) in Kaohsiung, whose tenant companies enjoy privileges but must export all their output
1968: Singapore's plan to woo foreign multinationals
1969: Texas Instruments, National and Fairchild open plants in Singapore
1969: Shih Ming and Andrew Chiu found Taiwan's first semiconductor company, Unitron
1969: Japan's Seiko introduces the world's first commercial quartz wristwatch
1969: India's Tata appoints Faqir Chand Kohli in charge of computer services
1970: Japan's Sharp and Canon introduce the first pocket calculators
1971: Unitron's engineer Stan Shih designs Taiwan's first desktop calculator
1971: Busicom introduces the LE-120A Handy, the first calculator small enough to fit in a pocket
1972: Indian-born MIT-graduate Narendra Patni founds Data Conversion (later Patni) in the USA with back-office operations in Pune
1973: Taiwan establishes the Industrial Technological Research Institute (ITRI) to develop technologies that can be turned into goods for foreign markets
1973: Japan's Canon introduces the first color photocopier
1973: Ichiro Kato's team at Waseda University in Japan unveils the first full-scale anthropomorphic robot in the world, Wabot-1
1973: South Korea establishes the Korea Advanced Institute of Science and Technology at Daedeok
1974: Tata obtains a software contract from Burroughs, the first major software project outsourced by the USA to India
1974: Tai-Ming "Terry" Gou founds the plastic factory Foxconn in Taiwan
1974: Japan's Hitachi produces its first IBM-compatible mainframe computer
1975: Azim Premji's Mumbai-based Wipro starts selling the first computer made in India
1976: Stan Shih and his wife Carolyn Yeh found the calculator maker Multitech (later Acer) in Taiwan
1977: 57 foreign firms, including IBM, close down their Indian plants rather than meet Indian demands for some degree of Indian ownership
1978: Deng launches economic reforms in mainland China
1978: Karnataka State's agency Keonics establishes Electronics City in Bangalore, India
1979: Mainland China sets up a Special Economic Zone (SEZ) in Shenzhen to experiment with foreign investment and export manufacturing
1979: Japan's Sony introduces the portable music player Walkman
1980: Japan's Sony introduces the double-sided, double-density 3.5" floppy disk
1980: Japan's Yamaha releases the first digital synthesizer
1980: Wipro, to fill a gap after IBM left India, hires Sridhar Mitta who sets up offices in Bangalore to make computers
1980: Taiwan's ITRI spawns its first startup, UMC
1980: The USA grants mainland China most-favored-nation status, i.e. access to US investors, technology and market
Dec 1980: Taiwan's minister Li Kuo-ting establishes the Hsinchu Science Park
1980: The largest semiconductor manufacturers in the world are: Texas Instruments, National, Motorola, Philips (Europe), Intel, NEC (Japan), Fairchild, Hitachi (Japan) and Toshiba (Japan).
1981: Taiwan's Multitech (later Acer) introduces its own computer, the Micro-Professor MPF-I
1981: Infosys is founded in Pune, India, by Patni employee Narayana Murthy
1982: Japan's Sony introduces the compact disc
1983: Japan's Sony releases the first consumer camcorder
1983: Japan's Nintendo launches the videogame console Famicom (later sold abroad as the Nintendo Entertainment System)
Dec 1983: Taiwan's Multitech (Acer) introduces one of the earliest IBM-compatible personal computers
1984: Stan Shih of Taiwan's Multitech (Acer) founds the research firm Suntek in Silicon Valley
1984: Fujio Masuoka at Japan's Toshiba invents flash memory
1984: Japanese firms introduce the 256K DRAM chips
1984: Liu Chuanzhi of the Chinese Academy of Sciences founds a privately-run but state-owned company, Legend (later Lenovo), to sell IBM's personal computers in China
1985: Taiwan hires US-based semiconductor-industry executive Morris Chang to run the ITRI
1985 Texas Instruments opens a software laboratory in Bangalore, India
1986: Taiwan's Acer, leveraging its supply-chain optimization strategy, releases the world's second computer based on Intel's 386 microprocessor (one month after Compaq)
1986: The Japanese government founds the Advanced Telecommunications Research Institute International (ATR)
1987: ITRI's president Morris Chang founds Taiwan's Semiconductor Manufacturing Company (TSMC), the first independent silicon-chip foundry in the world, to serve the "fabless" companies of the USA
1987: Ren Zhengfei founds the telecom-equipment maker Huawei in mainland China
1987: The largest semiconductor manufacturers in the world are Japan's NEC, Japan's Toshiba and Japan's Hitachi
1988: Foxconn opens a pioneering factory in China's experimental city Shenzhen
1988: China sets up the Zhongguancun Science Park outside Beijing
1988: Japan's Fujifilm introduces the world's first fully digital consumer camera
1988: Barry Lam founds Quanta Computer in Taiwan
1989: Singapore's Creative Technology introduces the Sound Blaster card for personal computers
1990: China's Lenovo introduces its first homemade computer when the market is dominated by IBM, HP and Compaq
1991: Wipro wins a software contract from a US customer that interacts via the Internet
1991: The Indian government sets up the Software Technology Parks of India (STPI) to promote software exports and opens the first park in the Electronics City of Bangalore
1991: Japan's Sony releases the first commercial lithium-ion battery
1992: Japan's Fujitsu introduces the world's first mass-produced plasma display
1992: South Korea's Samsung becomes the largest producer of memory chips in the world
1993: American Express outsources the management of its credit-card business to its Indian office led by Raman Roy, the first major project of business-process outsourcing to India
1994: Japan's Sony introduces the PlayStation
1994: Alpha and Frank Wu found the contract manufacturer AmTran Technology in Taiwan
1996: Japan's Toshiba introduces the first DVD player
1996: Japan's Sony introduces the chip FeliCa for RFID technology
1997: The Octopus card in Hong Kong pioneers contactless smart-card payments
1997: Japan's Toyota introduces the Prius, the first mass-produced hybrid vehicle
1997: Lenovo passes IBM to become China's main vendor of personal computers
1997: Japan's Panasonic introduces the first flat panel television set
1997: Cher Wang, the richest woman in Taiwan, founds HTC
1998: South Korea's SaeHan Information Systems introduces the first mass-produced mp3 player, the "MPMan"
1998: Tencent is founded in China by Ma Huateng and Zhang Zhidong to develop the instant messaging platform Open ICQ or QQ
1999: Singapore's Creative Technology introduces the Nomad line of digital audio players
1999: Japan's NTT DoCoMo ("Do Communications over the Mobile network") introduces the "i-mode" service that allows mobile phones to access a broad range of Internet services
2000: Japan's Sharp introduces the J-SH04, the first mobile phone with a built-in camera
2000: Japan's Honda introduces the humanoid robot ASIMO
2000: Almost 50% of Japanese who access the Internet do so via a cell phone
2000: The foundry Semiconductor Manufacturing International Corporation (SMIC) is founded in Shanghai's Zhangjiang High-Tech Park
2001: South Korea's Hyundai spins off its electronics division as Hynix
2002: South Korea's Samsung is the second-largest semiconductor manufacturer in the world after Intel, and Japan's Toshiba is third, passing Texas Instruments
2002: There are more than 2,000 startups in Seoul's Teheran Valley, and 69% of them are in IT
2002: Vizio is established in the USA to sell AmTran's television sets
2003: Between 1988 and 2003 high-tech ventures in Beijing's Zhongguancun grew from 527 to more than 12,000
2003: Sony introduces the first Blu-ray disc player
2003: Hitachi and Mitsubishi create the joint venture Renesas Technology specializing in microcontrollers (embedded chips)
2003: Asia produces about 40% of the world's IT goods and consumes about 20% of them
2003: Japan accounts for 21% of all patents awarded worldwide
2004: South Korea's Samsung is the world's largest OLED (Organic Light-Emitting Diode) manufacturer, producing 40% of the OLED displays made in the world
2005: Taiwan's companies produce 80% of all personal digital assistants, 70% of all notebooks and 60% of all flat-panel monitors
2005: Lenovo acquires IBM's personal computer business
2006: Between 1993 and 2006 the number of new science and engineering PhDs increased by 24% in the USA, by 189% in South Korea, and by more than 1,000% in mainland China
2007: The world's largest vendors of personal computers are HP, Dell, Taiwan's Acer, China's Lenovo and Japan's Toshiba
Aug 2007: Taiwan's Acer acquires its US rival Gateway
2008: Japan's Sony unveils the world's first OLED TV set, the XEL-1, the world's thinnest TV set at just 3 mm
2009: Asia employs 1.5 million workers in the computer industry while the USA employs only 166,000
2009: Vizio, whose products are made by Taiwanese company AmTran, becomes the main TV-set brand in the USA
2009: South Korea's Samsung announces mass production of 512 Mbit phase-change memory
2009: Taiwan's Foxconn (Hon Hai Precision Industry) becomes the world's largest manufacturer of electronics with revenues that dwarf Apple's and Intel's, employing 800,000 people
2009: India's Infosys sets up the largest corporate university in the world at Mysore
2010: Renesas acquires NEC's semiconductor division, becoming the world's largest manufacturer of microcontrollers (embedded chips) and the world's fourth largest semiconductor company
2010: Taiwan's Quanta Computer is the largest manufacturer of notebook computers in the world
2010: South Korea's Samsung introduces the smartphone Galaxy S
2010: Taiwan's HTC introduces the world's first 4G smartphone, the EVO
2011: Tencent releases the social networking platform Weixin/Wechat
2012: Taiwan's Acer introduces the world's thinnest notebook, the Aspire S5
2012: China's Huawei overtakes Sweden's Ericsson to become the world's biggest telecom-equipment maker
2012: South Korea's Samsung sells twice as many smartphones as Apple and five times more than Nokia
2012: South Korea's Hynix merges with SK Group to form the world's second-largest memory chipmaker after Samsung
2012: The Tokyo Institute of Technology creates a robot that learns functions it was not programmed to do (based on Osamu Hasegawa's technology)

Neurotech

In neurotech the new approach was based on computational power. In practice, the cumbersome, slow and expensive computers of the 1960s had forced computer scientists to focus on models, whereas now the small, fast and cheap processors of the 2010s were encouraging computer scientists to use brute force. The availability of colossal amounts of information on the Web made this change of strategy even more appealing. For example, Stanford professor Andrew Ng led a team at Google that wired 16,000 processors to create a neural net capable of learning from YouTube videos. The new approach contained little that was conceptually new, just a lot more computing power. The "deep belief networks" at the core of the "deep learning" of these systems were an evolution of the old "neural networks" of the 1980s, and much of that evolution had been the work of Geoffrey Hinton at the University of Toronto.

The sensation in robotics was coming from Boston. In 2012 Rethink Robotics, founded in 2008 by one of the most celebrated robot scientists in the world, Rodney Brooks, introduced Baxter, a low-cost programmable industrial robot that promised to make robots affordable for small companies.

In 2012 Kurt Konolige and Gary Bradski of Willow Garage founded Industrial Perception (IPI) in Palo Alto to build 3D-vision guided robots ("robots that can see").

In 2007 the Stanford Artificial Intelligence Laboratory unveiled "Switchyard", better known as the Robot Operating System (ROS), a toolkit for developers of robot applications, later expanded at nearby Willow Garage.

Progress in Artificial Intelligence was also coming from an idea originally advanced by Carver Mead at the California Institute of Technology (Caltech) in 1990: build processors that look like the brain. In 2008 Dharmendra Modha at IBM's Almaden Labs launched a project to build such a "neuromorphic" processor, i.e. one made of chips that operate like neurons.

Google's broadly publicized "driverless car" was a project begun by Sebastian Thrun at the Stanford Artificial Intelligence Laboratory, the descendant of Thrun's robotic vehicle Stanley (2005).

More prosaic applications were needed in the age of smart phones, cloud computing and social networking. SRI International's Artificial Intelligence Center, which had already spawned Nuance in the 1990s, spawned Siri in the new century. Adam Cheyer was the leader of a software project (code-named CALO/PAL) to develop a personal assistant capable of learning and self-improving. Founded in 2007 by Cheyer with Dag Kittlaus (a telecom executive from Norway) and Tom Gruber (an alumnus of Stanford's Artificial Intelligence Lab who had worked on collaborative knowledge management), Siri launched a virtual personal assistant for mobile devices and was acquired by Apple in 2010.

New Zealand-born Oxford-educated physicist Sean Gourley and Bob Goodson (Yelp's first employee and a YouNoodle co-founder) started Quid in San Francisco in september 2010. Quid developed a global intelligence platform for tackling large unstructured data sets (initially about high-tech innovation, i.e. about Silicon Valley startups).

The groundbreaking technologies feeding the social-networking frenzy kept coming from other regions, though. For example, face-recognition came to the masses via Face.com, launched in may 2010 by an Israeli company founded by Moti Shniberg, who had previously founded the pattern-recognition firm ImageID in 1998, and by Gil Hirsch. They turned their technology into a smartphone application (KLiK) and a Facebook application (Photo Finder), both real-time facial recognition programs capable of automatically identifying friends in photos (Face.com was acquired by Facebook in 2012).

Virtual Reality

The Bay Area had pioneered virtual reality but then pretty much lost interest in it. It was now sitting on the sidelines while major developments were bringing the technology to a general audience: mass-market immersive-reality devices such as Sony's head-mounted display HMZ-T1, introduced in november 2011; mass-market tracking devices such as the Microsoft Kinect, launched in november 2010 and originally conceived to allow gamers to interact with the Xbox 360 game console via gestures; and mass-market lenticular printing (which creates the illusion of three dimensions or the effect of a changing image as it is viewed from different angles) by companies such as Futuredisplay, founded in 2003 in South Korea.

Biotech

New biotech start-ups included iZumi Bio (2007), founded to develop products based on stem-cell research, and iPierian (2007), founded to develop products based on cellular reprogramming. In 2009 Swiss pharmaceutical giant Roche purchased Genentech for $46.8 billion.

Biotech, just like infotech, was moving towards the individual, which in its case meant personal genomics for predictive medicine. Navigenics was founded in Foster City in november 2007 by cancer specialist David Agus and Dietrich Stephan of the Translational Genomics Research Institute to provide genetic testing for predisposition to a variety of diseases. It was still an expensive service, but the rapid cost reduction in DNA chips and the knowledge derived from the Human Genome Project helped bring it closer and closer to the masses. The cost of sequencing a personal genome was $3 billion in 2003, and only one had been completed (by the Human Genome Project). By 2009 the cost had decreased to $48,000 (offered by San Diego-based start-up Illumina, formed in 1998 by venture-capital firm CW Group to commercialize a system developed at Tufts University). In 2009 Complete Genomics (founded in march 2006 in Mountain View) announced a genome-sequencing service for under $5,000. The genesis of this company spoke volumes about the maturing state of the industry. One of the founders was serial entrepreneur Clifford Reid, co-founder of Sunnyvale-based information-retrieval start-up Verity in 1988 and of San Mateo-based digital video communications company Eloquent in 1996. The other founder was Serbian-born biologist Radoje Drmanac, who had participated in the Human Genome Project since 1991 and had later (1994) co-founded Sunnyvale-based biotech start-up Hyseq. By the end of 2009 only about 100 human genomes had ever been sequenced. Complete Genomics planned to sequence 5,000 human genomes in 2010, 50,000 genomes in 2011 and 1 million genomes by 2014.

23andme, founded in april 2006 in Mountain View by former Affymetrix executive Linda Avey and by Sergey Brin's wife Anne Wojcicki, analyzed parts of the human genome to derive useful medical information, and its kits were priced under $500 by 2010. Halcyon Molecular, founded by Michael and William Andregg in Arizona in 2003 and relocated to Mountain View in 2008, hired PayPal's Luke Nosek in september 2009 and set a goal to sequence individual genomes in a few minutes and for less than $100. Human-genome sequencing firms were proliferating, each using a different technique and each focusing on different data.

In 2007 the Bay Area boasted about 700 biomedical companies.

However, the promises of the Human Genome Project were still largely unrealized. In particular, the thesis that genes cause diseases had sent biologists hunting for the common variants in the genomes of individuals affected by the same health problems. Ten years later the "common variant" strategy was being attacked by an increasing number of scientists, throwing into disarray the whole industry of "personal genomics".

Meanwhile, Venter's new venture heralded the birth of synthetic biology as a business. In may 2010 Hamilton Smith's team at the Craig Venter Institute in Maryland achieved another milestone in synthetic biology by building a bacterium's DNA from scratch in the lab and then transplanting it into the cell of a host bacterium of a different species, where the artificial DNA took control of the host cell and started replicating. The resulting living being behaved like the species of the synthetic DNA. It was the first time that a living cell was regulated entirely by artificially manufactured DNA. They had just managed to reprogram a living being. That living being's parent was a computer. This event opened the doors to an industry that would design custom bacteria on a computer and then build them in the lab. Eventually, one could envision a day when individuals would be able to program a living organism on a handheld device connected to the lab and order the living organism on the fly. This vision was becoming possible because all the economic factors were converging: it was becoming ever easier to sequence ("map") the DNA of an organism, a fact that resulted in ever larger databases of genomes of existing organisms; it was becoming ever cheaper to synthesize ("build") DNA molecules; and both processes were a lot faster than they used to be, thanks to their rapid computerization. The only tool that was missing for a broader availability of life synthesis was a tool to edit DNA sequences. The other tool that a wary humankind would have liked to see was the equivalent of the "undo" feature of computers. The media frenzy around this event resembled the media frenzy of the 1950s, when computers were labeled "electronic brains" that would eventually take over the world. Now it was bacteria that were bound to take over the world. Venter's next target was algae. Bacteria are single-cell organisms. So are algae. Algae can be used to make biofuels, because they can turn carbon dioxide into fuel through photosynthesis.

Drew Endy at Stanford University was working on creating a catalog of "biobricks" that synthetic biologists could use to create living organisms. His model clearly mirrored the way the personal-computer industry got started, with hobbyists ordering kits from catalogs advertised in magazines and then assembling the computer in their garage.

In 2012 a Stanford bioengineering team led by Markus Covert produced the first complete computer model of a free-living organism, the bacterium Mycoplasma genitalium.

Greentech

On the greentech front, in 2008 solar technology accounted for almost 40% of worldwide private investment in greentech, followed by biofuels at 11%. In the USA venture capitalists invested a grand total of $4 billion in "green-tech" start-ups in 2008, almost 40% of all high-tech investments in the USA.

Solar-energy company Solyndra was started in may 2005 in Fremont (East Bay) by Chris Gronet, formerly an executive at Applied Materials, the manufacturer of equipment for the semiconductor industry; by 2009 it had raised $820 million in venture funding and held more than a billion dollars in product orders. In march 2009 the US government helped Solyndra build a 500-megawatt factory for cylindrical solar cells at a cost of $733 million, but Solyndra went bankrupt in september 2011, leaving behind mostly doubts about the whole industry.

In 2007 Google's founders established Google.org, the non-profit arm of Google, to fund greentech startups. In 2008 they invested in eSolar, a Pasadena-based manufacturer of solar thermal plants, and in AltaRock Energy, a Sausalito-based firm tapping geothermal energy.

Siluria was founded in 2008 in Menlo Park by MIT professor Angela Belcher, who had developed a method to reprogram the DNA of viruses so that they create inorganic materials. The company applied the method to produce gasoline from natural gas.

Solar startups multiplied after President Obama's blessing: Twin Creeks Technologies, founded in 2008 in San Jose by veterans of the semiconductor industry such as Indian-born Siva Sivaram and Venkatesan Murali; and Cogenra Solar (originally called SkyWatch Energy), a spin-off of semiconductor manufacturer Applied Materials led by Gilad Almogy, started in 2009 in Mountain View and funded by Vinod Khosla.

Innovation, as in the past, was not really coming from Silicon Valley, though. The big news in greentech came from Boston-based 1366 Technologies (founded by MIT professor Ely Sachs in 2007), which developed a more accurate way to cast the silicon wafers for solar cells so that they could be produced at lower cost. This company was leapfrogging Silicon Valley startups using a typical Silicon Valley model: partnership with DARPA to bring the technology to maturity, and then application to a commodity product.

A different kind of "greentech" was represented by the companies that aimed at serving the "green" market. Simbol Materials, founded in 2008 in Pleasanton by Luka Erceg and Scott Conley, hired geochemists from the Lawrence Livermore National Laboratory and bought technology from that lab to produce lithium from geothermal brines (the best source of renewable energy).

More interesting were the many attempts at fundamentally changing the way people behave. For example, in 2008 Simon Saba founded his own company in San Jose to design and manufacture a mass-market sport electric car. He envisioned the Saba as a Tesla for ordinary families.

The potential for linking infotech and greentech did not go unnoticed. German giant SAP had pioneered the field with software to perform carbon accounting, energy auditing, safety management and resource planning, soon followed by arch-rival Oracle. Anticipating legislation about climate change (which would penalize the emission of greenhouse gases), a few Bay Area startups entered the space of software for energy and emission management to help companies control their "carbon footprint". Hara was founded in 2008 by SAP executive Amit Chatterjee and run by Oracle alumni. In january 2009 Tom Siebel, the founder of Siebel Systems who had become a passionate advocate of net-zero energy homes, launched C3 with a board that included former secretary of state Condoleezza Rice and Shankar Sastry, the dean of engineering at U.C. Berkeley. U.C. Berkeley had nurtured the science of energy conservation: one of its scientists, Charlie Huizenga, had already founded Adura Technologies in 2005, whose software aimed at monitoring and optimizing the lighting of buildings.

Cold Fusion

In 1985 the USA, the Soviet Union, Japan and the European Union had launched a joint project to build the International Thermonuclear Experimental Reactor (ITER). The hope was that it would lead to a power plant fueled by fusion. ITER had been designed around the tokamak reactor invented by Soviet physicists Igor Tamm and Andrei Sakharov. The holy grail of nuclear fusion, however, was "cold fusion", i.e. fusion that does not require the high temperatures generated by such an expensive reactor. In march 1989 Stanley Pons, a chemist at the University of Utah, and Martin Fleischmann from the University of Southampton in Britain had announced that they had achieved "cold fusion", i.e. nuclear fusion at room temperature (about 20 degrees Celsius). Within a few months the scientific community had come to consider it a bluff, which discredited the entire field. Meanwhile, ignored by the media, in 1989 New Zealand electrochemist Michael McKubre had just begun to study cold fusion at SRI. For about two decades the field had virtually been silenced by mainstream science. By the end of the 2000s interest had returned, since cold fusion would solve the problem of energy forever. Construction of ITER, mostly funded by the European Union, finally began in 2008 in southern France after India, mainland China and South Korea had joined the original quartet. At the same time mainstream science began to accept the results on "cold fusion" that Michael McKubre's lab had achieved. Meanwhile, hot fusion remained the scientifically proven way to go. The Livermore Labs had been entrusted in 1997 with the National Ignition Facility (NIF). The project required high-power lasers to trigger nuclear fusion in hydrogen fuel. The term "ignition" refers to the point when more energy is generated than is consumed by the plant. The Lawrence Livermore National Laboratory basically planned to simulate the nuclear fusion of a star (more than 100 million degrees Celsius, hotter than the center of the sun) with the world's most powerful laser.

Lasers

One of the great scientific endeavors of the 2000s was a joint project among Caltech, MIT and Stanford to detect gravitational waves: the Laser Interferometer Gravitational-Wave Observatory (LIGO). The most expensive project ever funded by the National Science Foundation (NSF), LIGO became operational in august 2002 in two observatories located 3,000 kilometers apart (in Louisiana and Washington state). The experiment required the most precise measurement ever made. Capitalizing on laser amplification studied by Robert Byer's team at Stanford, Ueda Kenichi in Japan developed transparent ceramic lasers for high-energy applications which, in turn, led to Northrop Grumman's announcement in 2009 that it had created a 100-kilowatt laser, an impressive achievement for a discipline that in 1984 could only produce two-milliwatt lasers. This combination of high precision and high energy was unprecedented in history. Livermore Labs' researchers realized that the technology could also serve the purpose of the National Ignition Facility (NIF).

Lasers were also employed at the Lawrence Berkeley Labs to create a new generation of "miniature" particle accelerators. In 2006 Wim Leemans' team accelerated electrons to a billion electronvolts (1 GeV) over a distance of centimeters rather than hundreds of meters. The next project was the Berkeley Lab Laser Accelerator (BELLA), in which a laser would produce one quadrillion watts (a petawatt, i.e. one billion million watts) for an instant, enough to accelerate electrons to an energy of 10 GeV over a distance of just one meter.

Meanwhile, SLAC inaugurated the Linac Coherent Light Source (LCLS), which in april 2009 produced the world's brightest X-ray laser (X-rays being radiation of much higher frequency than microwaves). It was now technically possible to pinpoint a biological cell and, generally speaking, to explore matter at the molecular scale.

Culture and Society

The future was more mysterious than ever, though, as these technologies diverged in so many directions. The debate about the future permeated many Bay Area circles, mostly centered around oracles of the future ("futurists"). Taking inspiration from the private International Space University (ISU), founded by MIT professors in 1987, in 2009 OCR inventor Ray Kurzweil and ISU's founder Peter Diamandis started the Singularity University, located at Moffett Field, which basically bestowed academic credentials onto futurists. Sometimes the futurists' rhetoric was oddly reminiscent of the post-hippie new-age spirituality of the 1970s, except that now it was focused on achieving immortality (scientists at the S.U. speculated that in the future immortality could be achieved by downloading one's consciousness onto a computer).

The Seasteading Institute, founded in 2008 in Sunnyvale by Patri Friedman, envisioned cities floating in the middle of the ocean as utopian libertarian communities to experiment with alternative social systems.

The arts seemingly reacted by discarding the whole notion of a future, embracing a playful tone even when they confronted tragic themes. Al Farrow used guns and bullets to sculpt his "Cathedral" (2007). In 2010 Scott Sona Snibbe, one of the many Bay Area practitioners of immersive interactive art, turned one of his interactive software artworks, Gravilux, into an application for the iPhone and iPad, downloadable for free from the iTunes store: within weeks it became a worldwide success.

At the intersection of art and science, in 2001 neuroscientist Semir Zeki had founded the Institute of Neuroesthetics in Berkeley, and in 2008 Piero Scaruffi started the Leonardo Art Science Evenings (LASERs) in San Francisco.

Berkeley's passion for "including" disabled people culminated in 2010 with the inauguration of the Ed Roberts campus, a place not only devoted to people with disabilities but designed (by architect William Leddy) to make sure that anyone (anyone) could use it, even blind people without arms or legs.

Skyrocketing real-estate costs in San Francisco were sending more and more artists across the bay to Oakland. Oakland had long been famous mainly for crime. In 2007 it still had the third-highest murder rate in California with 120 murders (one every three days). In 2012 it was still the 7th deadliest city in the USA. This, of course, simply reflected the widespread poverty and neglect: between 2007 and 2012 more than 10,000 homes were foreclosed. Oakland had also become the state's capital of "weed" (marijuana) or, at least, was at the forefront of the campaign to legalize cannabis. An entire district was nicknamed "Oaksterdam", a reference to Amsterdam, the European city where marijuana was virtually legal. In 2004 Oakland voters approved "Measure Z", which officially made marijuana-related "investigation, citation and arrest" a low priority for the city's cops. In 2007 Richard Lee even founded Oaksterdam University, a non-profit organization teaching how to grow (medical) marijuana. There were shops where anyone, paying a small fee, could get a prescription to buy marijuana. It didn't look like the ideal place for engineers, and, in fact, Oakland's only major success story in the high-tech community was Pandora. In 2013 venture capitalists poured $7 billion into San Francisco's two thousand high-tech companies, which provided 50,000 jobs. In the same year venture capitalists poured slightly over half a billion into Oakland's 350 high-tech companies, which provided fewer than 6,000 jobs (10% of them at Pandora). However, office space in Oakland cost about half what it did in San Francisco. The arts thrived, and Oakland was increasingly on the map for major art shows. In 2006 eight art galleries located in the Northgate and Temescal neighborhoods of north Oakland (21 Grand, 33 Grand, Auto 3321, Boontling Gallery, Buzz Gallery, Ego Park, Front Gallery, and Rock Paper Scissors Collective) joined together to launch "The Art Murmur", a free art walk open to everybody held on the first friday of every month. The demographic shift underway was easy to detect: between 1990 and 2011 the black community went from 43% of the population to 26%. Landlords were evicting black families (typically lower-income families) to rent to higher-income families. In 1994 California voters passed Proposition 184 (better known as the "three-strikes" law), mandating life sentences for those convicted of three violent felonies; and in 1996 Bill Clinton signed a law authorizing landlords to evict tenants suspected of any crime. The combined effect of these two laws was to decimate the black community. A side-effect was to open neighborhoods to the artists who were fleeing the high rents of San Francisco.

An important ideological change was taking place inside Bay Area universities about using the new digital media to export knowledge to the rest of the world. For example, copying the idea of the Khan Academy launched by MIT alumnus Salman Khan in 2006, in 2011 Stanford professors Peter Norvig and Sebastian Thrun created free courseware on the Web that could be accessed by students worldwide. In 2011 Sebastian Thrun quit Stanford to start the online university Udacity, which would educate tens of thousands of students worldwide for free. Also in 2011 two Stanford researchers, Andrew Ng and Daphne Koller, whose free Web-based courseware had already been used by more than 100,000 students, launched their own startup, Coursera, aiming to provide interactive courses from Stanford, UC Berkeley, the University of Michigan, the University of Pennsylvania and Princeton University in all sorts of disciplines.

Meanwhile, the ever-shifting demographics of the Bay Area were undergoing another major change. This time the newcomers were Muslims. Most of them came from the Indian subcontinent and chose Fremont as their new hometown. In october 2006 an Afghan woman wearing the Muslim head scarf was killed in a brazen daylight shooting. That tragic incident put Muslims on the social map of the Bay Area. The Zaytuna Institute, the first Muslim liberal-arts institution in the USA, had been founded by Shaykh Hamza in 1996 in Berkeley. The first Bay Area Muslim Film Festival had been held in march 2004 in Berkeley. The Muslim population of the Bay Area was estimated at 250,000 in 2008.

The success of Paypal, Facebook and the like had transformed Palo Alto from a student town into a startup town. Mountain View, the first beneficiary of the dotcom boom since the days of Netscape and Yahoo and now the heartland of Google, was on its way to becoming another exclusive community. Silicon Valley had expanded north (to Oracle's Redwood City and to Genentech's South San Francisco) but not yet to the intellectual hubs of Berkeley and San Francisco, which, despite nurturing several startups, never quite experienced the same ebullient business creativity, as if to prove that a startup needs irresponsible thinking more than it needs worldly erudition.

However, Silicon Valley was paying an increasingly high psychological price for its lifestyle. Nowhere else in the world could one see so many people driving alone in their cars to work and back home (or just to the mall/gym/whatever). Nowhere else in the world could one find such a widespread use of cubicles in office buildings, so that each person has her/his personal space. And probably in no other urban center was the single-family detached home so popular as in California. An increasing number of people lived by themselves, drove alone to the office, worked alone in a cubicle, ate their lunch alone... All of these used to be not only social but even crowded activities. From the beginning the suburban environment within which Silicon Valley prospered seemed to be designed to provide some kind of "isolation". Not quite "loneliness", but certainly a degree of separation that reinforced the North American tendency towards individualism and bottom-up organization. Socialization happened through the institutions created by that bottom-up approach, such as the ancient Gold Rush practice of meeting in saloons (nowadays bars and restaurants) or at the rodeo (now the gym). As a Silicon Valley sociologist said in a private conversation while we were discussing this issue, "There must be a reason if eating is so important here". People had never been so far from family and never had such superficial friendships. No surprise, then, that social media became so popular in Silicon Valley: their function was, fundamentally, to keep people isolated while connecting them.

Anthropology of the Age of Self-awareness

The 2000s were the decade when Silicon Valley was obsessed with... itself. The Tech Museum of Innovation opened in 1998 in San Jose, and the Computer History Museum opened in 2003 in Mountain View. Silicon Valley had always been a bit surprised at being a world-famous phenomenon. As many of its original founders reached the age of the memoir, Silicon Valley also became a self-celebratory phenomenon.

This society saturated with high-tech was nonetheless still humane. A few events were emblematic of the anti-technological reaction that was more pervasive than ever despite the appearances. First was the growing popularity of the outdoors. A store like REI, founded in 1938 in Seattle by a group of mountaineers, became a conglomerate thanks to its success in the Bay Area, where it maintained eight locations: it specialized in sporting goods, notably for hiking, climbing and biking. The Bay Area had always been obsessed with fitness (and the gym had become ubiquitous) but this went beyond fitness. Silicon Valley engineers used their spare time to train for the epic hikes of Yosemite and the High Sierra, as well as for marathons, bike races and triathlons, no matter how unprepared they were.

The "Burning Man" festival had moved to an isolated corner of the Black Rock desert in Nevada where it had completely lost its "counterculture" status (since it was now one of the most advertised and expensive events of the year) but had become a futuristic urbanistic experiment. During the Labor Day weekend of september tens of thousands of people, mostly from the Bay Area, set up a tent city, lived in it, were stimulated to be creative and spontaneous, and then simply "undid" it all, leaving no traces behind. Burning Man had become famous for the fantastic and participatory art installations (that were meant to be burned at the end of the festival) and for the picturesque costumes and body art of the crowd, but it was becoming more interesting as a self-organized community. Burning Man had originally been studied as a semi-religious ritual of purification and communion by the high-tech generation, but now it was also studied as a city powered by solar energy and motorized by biodiesel, decorated with arts instead of billboards, and cleaned by the citizens themselves. By comparison, the European equivalent was the "Love Parade", which was really just a big party of music, alcohol and drugs.

And then there was the end of the world. The cycles of the Maya calendar end with the year 2012. This and other coincidences led to the formation of theories about some kind of impending apocalypse that became a favorite topic of discussion among engineers who had never cared about Mayan history (and, alas, probably didn't really know who the Mayas were).

Growing awareness of the role of technology in shaping society was mirrored by growing moral anxiety. In october 2011 a new class of corporation was created in California, the "benefit corporation", whose charter specifically mandates the pursuit of ethical and environmental goals instead of giving priority to maximizing the financial return to shareholders.

While these phenomena could be viewed as reactions to the materialism of high-tech business, they were also emblematic of a shift towards more superficial hobbies. Knowledge-driven hobbies had been replaced by skill-driven hobbies: salsa dancing, mountain biking, snowboarding, marathon running, etc. This shift translated into a rapid collapse of the independent bookstores, several of which were vestiges of the counterculture of the 1960s. It wasn't just the online bookstore that killed them: it was also a rapidly declining audience for serious books. In 2001 Printers Inc in Palo Alto closed. Kepler's in Menlo Park closed in 2005 but was saved by a grass-roots campaign. In 2006 both Cody's in Berkeley and A Clean Well-Lighted Place for Books in Cupertino had to shut down.

In reality, there were increasingly big differences between the various poles of the Bay Area. Long work hours and maddeningly slow traffic were progressively isolating Silicon Valley from the rest of the Bay. Silicon Valley was a region of low-rise residential buildings and town-home parks organized in geometric neighborhoods that were largely self-sufficient. Therefore the need to visit other parts of the Bay was minimal. The distances between Silicon Valley and the artistic world of San Francisco, the political world of Berkeley, and the scientific world of Stanford had increased dramatically. No wonder that Silicon Valley was (in)famous for its lack of cultural variety: everybody read the same book, watched the same movie and sang the same song. Its inhabitants were too isolated from culture. Silicon Valley was giving a new meaning to the word "provincial".

For a long time the engineering world had been fundamentally male. The 21st century began with an increasing number of women joining the ranks of engineers. There was also significant progress in promoting women to executive jobs. Young female executives included Sheryl Sandberg (born in 1969), chief operating officer of Facebook, and Marissa Mayer (born in 1975), a vice-president at Google. However, there still was no female Steve Jobs and no female Mark Zuckerberg.

This society was still very transient. In 1992 unemployment in Silicon Valley had reached 7.4%. In 2000 it had declined to 1.7%. One year later it had increased to 5.9%. By 2007 it was down again. But the people who arrived were not the same people who had left; those who left probably never came back. It was terribly difficult to resettle in the Bay Area after leaving it, both in terms of real estate (home prices tended to skyrocket during a recovery due to a chronic shortage of housing) and in terms of employment (it is easy for a recent immigrant or fresh graduate to take a risky job, difficult for someone who already has a career somewhere else). Silicon Valley was the one region of the world whose identity was not defined by the people who lived in it, because they changed all the time.

However, perhaps the most unique aspect of culture and society in the San Francisco Bay Area was that there was absolutely no building reflecting its booming economic, technological and political power. Rarely had a human civilization been so invisible. There was no breathtaking skyscraper, no imposing monument, no avant-garde city landmark: Silicon Valley and the rest of the Bay did not invest a single penny in advertising their own success. In march 2012 the New York Times estimated that all the fine arts museums of San Francisco combined ranked 13th in the nation for investments in new acquisitions, even behind the Princeton Art Museum, with a total that was less than 10% of what the Metropolitan Museum of New York alone spent.

The Gift Economy

At the end of the Great Recession of 2008-09 economists were missing one important factor among the causes of chronic unemployment in the USA: free community content. In 2006 YouTube only had 60 employees, but they were managing 100 million videos. The YouTube employees were not making the videos: millions of people around the world were donating them to YouTube. If a Hollywood studio had decided to create 100 million videos, it would have had to hire hundreds of thousands of people to act in them, direct them, produce them, edit them and upload them. In 2004 Craigslist only had ten employees moderating more than a million advertisements posted every month. A newspaper handling the same amount of advertisements would have had to hire hundreds of editors. In 2005 Flickr only had nine employees, but they were managing a repository of millions of pictures. Those pictures were taken around the world, edited, uploaded, documented and even organized by millions of users. A magazine that decided to create a similar repository of pictures would have had to hire thousands of photographers, tour guides and editors. In 2005 Skype only had 200 employees, but they were providing telephone service to more than 50 million registered users. Any telecom in the world that provided a comparable service employed tens of thousands of technicians, operators, accountants, etc. One of the fundamental discoveries of the "net economy" was that users of the Web around the world were happy to contribute content for free to websites willing to accept it.

This phenomenon obviously displaced workers who used to be paid to create that very content. It was not creating joblessness but merely unemployment: it was creating unpaid jobs. Millions of people worked (some of them for many hours a day) to create and upload content to other people's businesses (such as Wikipedia, Facebook, Twitter, Flickr and YouTube). They were working, but they were not employed: they worked for free, of their own will. Not even slaves did that. Indirectly, the Web had created a broad new class of knowledge workers: volunteer amateur editors whose net effect was to displace the existing knowledge workers (photographers, journalists, actors, directors, researchers, writers, librarians, musicians, as well as all the engineers and clerical staff who provided services for them). When thousands of knowledge workers lose their jobs, i.e. when their purchasing power collapses, this inevitably has repercussions on the entire economy and creates further ripples of unemployment. Every time someone adds a line to Wikipedia, a professional knowledge worker becomes less indispensable and more disposable. Every time someone adds a picture to Flickr, a video to YouTube, news on Twitter, a notice on Facebook or an ad on Craigslist, the job of a professional becomes more vulnerable. In the past each wave of technological innovation had come with a wave of new jobs that replaced the old ones. Society had to train millions of users of word processors to take the place of millions of typists; and companies had to churn out computers instead of typewriters, a process that involved hiring more (not fewer) people. In fact, each wave of technological progress typically created new opportunities for knowledge workers, and this class therefore expanded rapidly, creating more (not less) employment (and higher incomes). The expansion was still happening: there were now millions of people making videos instead of just a few thousand, and there were now millions of people taking pictures and millions posting news. The difference was that this time they didn't ask to be paid: they were doing it for free. Therefore businesses could operate with a minimal staff: the old knowledge workers were replaced by free labor. Therefore the number of knowledge workers was still increasing (even exponentially), but the number of those who were paid for their work was shrinking dramatically. It was an illusion that YouTube was run by only a handful of employees: YouTube "employed" millions of "employees". It just so happened that 99% of them were happy to work (provide content) for free. Therefore there was no need anymore to actually hire people to create content. Protectionists were complaining that developing countries were "dumping" cheap products on the USA market, causing USA companies to go out of business. Protectionists were inveighing against "unfair trade". But the real enemy of employment was free labor. Nothing kills jobs faster and more permanently than free labor. This was a form of competition that came from inside USA society, an accidental by-product of technological progress.

This accidental by-product was actually the dream of socialist utopians: the net economy had created production tools that were available for free to everybody. That was precisely Marx's definition of socialism: the collective ownership of the means of production. This accidental by-product was also the dream of the hippy utopians of the San Francisco Bay Area. In the 1970s Stewart Brand of the WELL had imagined precisely such a virtual community of people engaged in the free production and exchange of knowledge goods: a community of people investing their time and sharing their content for free. The utopian society of Marx and Brand had materialized as a "gift economy" (a term coined in 1985 by Lewis Hyde and applied to the net economy in 1998 by Richard Barbrook) in which a few businesses provided the means of production to a mass of millions of volunteer amateur editors. The free labor of these many worker ants allowed a very small number of queens to get extremely rich while causing millions of middle-class families to lose their income. The irony is that they were often the same people. The very person who uploads a picture, a text or a video for free is the person who will (directly or indirectly) need to look for another (often less remunerative) job as a (direct or indirect) consequence of that act of free labor. The Internet had indeed democratized society: everybody could now start their own business. At the same time the Internet had increased the value of knowledge, another step in the progress of human civilization from survival-based goods to knowledge-based goods. The problem was that the Internet had also democratized knowledge production: everybody could now provide content, and they were willing to do it for free.

The net economy was, in fact, rapidly evolving towards an economy of one-man operations. There were now Web-based tools available to build, run and monetize a business that required only limited technical skills and no more than a few days of work. One person alone could de facto create an assembly line entirely on the Web to produce a mass-market product/service (in the same category as YouTube, Flickr and Craigslist). That assembly line did not employ any worker other than the founders who assembled it. Once the one-person business was up and running, its success mainly depended on how many people were willing to contribute content for free, i.e. how much free labor it could harvest on the Internet. One could foresee a future in which a startup would require even fewer founders. If successful (i.e., if it ever managed to attract millions of amateur content providers), it would employ even fewer people, and only those very few would benefit financially from its success.

In the age of smartphones and email, physical proximity was no longer a necessity for a startup. There was no need to be based, say, in Palo Alto, where the cost of living was so high. The time had come for geographically distributed startups. For example, StackOverflow was founded in 2008 by Jeff Atwood in Berkeley and Joel Spolsky in New York, and hired people in different states.

These startups of the "gift economy" were annoyed by the pressures of investors. Therefore it was no surprise that in april 2010 Jared Cosulich and Adam Abrons founded Irrational Design with the explicit goal of not seeking venture capital. The "business plan" of the startup was to be creative, with no specific product in mind. Venture capitalists were perceived as hijacking the minds of the founders to focus on a lucrative product. Irrational Design was emblematic of a generation that aimed for precisely the opposite: let the minds of the founders invent at will, with no regard for market response or potential acquisitions.

In july 2002 eBay paid $1.5 billion for PayPal, a tiny unprofitable online payment service. In october 2006 Google paid $1.65 billion for YouTube, a video uploading site that had no revenue.

The traditional economy had tended to concentrate wealth in the hands of a few large companies that had run giant empires of hundreds of thousands of employees around the world. The gift economy was rapidly concentrating wealth in the hands of a few individuals who ran giant empires of tens of millions of unpaid amateurs.

The New Labor

Websites like Etsy, founded in 2005 in New York, and eLance, founded in 1999 in Boston by Beerud Sheth, were allowing anybody to become a self-employed businessman. As jobs disappeared due to automation, the population of "entrepreneurs" was destined to skyrocket.

As Venkatesh Rao argued in his article "Entrepreneurs Are The New Labor" (2012), the number of startup founders was destined to increase exponentially, but they could no longer be viewed as old-fashioned entrepreneurs: in reality they were becoming the modern equivalent of specialized labor. Artisans, craftsmen. The startup had become the modern form of apprenticeship: startup founders learn how the modern, lean company works. When a bigger company buys them out, they are de facto "hired" and transition from apprenticeship to full-fledged work. Large companies will increasingly be aggregations of startups, and they will increasingly rely on a workforce of entrepreneurs.

The Assembly Line Model of Startup Funding

The first boom of Silicon Valley's venture capital world took place in the mid-1980s, when large semiconductor companies needed money to build "fabs" (short for "fabrication plants"). A fab is an extremely expensive project that requires hundreds of millions of dollars. The age of one-man startups, though, did not require such massive investments. The returns (as a percentage) could be even higher with a much smaller investment. Unlike fabs, which need an accurate implementation plan, Web-based startups could adjust their features in real time based on users' feedback. So all that was needed to start a company was really just a good idea and a rudimentary website.

At the end of the 2000s it was getting so easy and cheap to start a company that the business of funding companies was resembling an assembly line. When car manufacturing became cheap enough, Ford built an assembly line and launched the first mass-market car. Something similar was happening in Silicon Valley in 2010 in the business of manufacturing startups. Sometimes angels would invest without even having met the founders in person.

This logic was being pushed to the extreme limit by a new generation of angel investors who were sometimes barely in their 30s. The Google, Facebook and PayPal success stories, plus all the startups that they acquired (Google acquired more than 40 companies in 2010 alone), had created plenty of very young millionaires and even some billionaires. They could comfortably retire, but it was "cool" to become angel investors and help other young founders get started and get rich. These new angels were "kids" compared with the veterans of 3000 Sand Hill Rd, but many of these kids felt that they could more accurately guess the future of the world and jumped into the game passionately. Needless to say, these kids tended to fund startups that were started by kids even younger than them.

The mindset of both investors and founders was very different from the mindset of investors and founders of the 1980s. The credentials of a founder were often based on popularity, not on a professional business plan. In fact, these new investors "hated" the Wall Street type of founder. They loved, on the other hand, the "cool" kid. The successful founder was someone who could create "buzz" on the Internet about his or her idea, even if he or she had no clue how to monetize that idea. It was, in a sense, the psychology of the high school transplanted into the world of finance. The intellectual idealism of the early Internet was being replaced by the subculture of high-school gangs (albeit one that abhorred drugs and violence).

The new angels, in turn, were not professionals educated by Stanford or Harvard in the subtleties of economics: they were former "cool" kids. They invested based on their instinct, trying to guess who would become the next cool kid. To some extent the venture capital business had always been a statistical game: you invest in ten startups hoping that just one makes it big; but it had always been backed by some (economic) science. It was now becoming pure gambling. The psychology required of an angel investor was more and more similar to the psychology of the gambler who spends the day in front of a slot machine in a Las Vegas casino.
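
To make the statistical logic concrete with a purely hypothetical portfolio (the numbers are illustrative, not drawn from any actual fund): if an angel backs ten startups expecting nine to return nothing and one to return thirty times its investment, the expected multiple on the whole portfolio is

$\mathbb{E}[\text{multiple}] \;=\; \frac{9 \times 0 \,+\, 1 \times 30}{10} \;=\; 3$

so the entire return hinges on finding the single outlier, which is why guessing who would become "the next cool kid" mattered more to these investors than conventional due diligence.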

The birthdate of this new kind of venture capitalism was commonly taken to be 2005, when Paul Graham started Y Combinator in Mountain View and incubated eight seed-stage startups.

Another significant change in the way startups are funded took place when the "crowd-funding" website Kickstarter was launched (in 2008 in San Francisco by day trader Perry Chen, online music store editor Yancey Strickler and web designer Charles Adler). The concept had been pioneered by Brian Camelio's New York-based music company ArtistShare: raise money among ordinary people to fund a (music) event. Kickstarter transferred the concept into the world of Bay Area startups with amazing success: by july 2012 Kickstarter had funded more than 62,000 projects that had raised more than 200 million dollars. Kickstarter got further help from the government when the JOBS Act was enacted in april 2012: it made it legal for smaller investors to fund a company, and with fewer restrictions.

Last but not least, the early adopters too were of a different kind than those of previous generations of products. The early adopter of an Intel product or of an Oracle product was a multinational corporation. The early adopter of an Apple gadget was a professional with a good salary, and typically a technology-savvy one. Now the early adopter of a website (say, a social-networking platform) was, instead, just a kid himself. The early adopter was not paying anything to use the "product" (the social-networking platform). Therefore the profile of the early adopters was, for the first time ever, disconnected from their financial status and instead related to how much spare time they had and were willing to spend on the Internet. The early adopters were typically very young. These "kids" created the "buzz". They established the credentials of a new platform. The adult audience followed the trend set by a very young audience.

The Big-Brother Web

Google, Facebook and the like justified their new features on the basis of providing "better services", but in reality they were providing better services mainly to marketers. The multinational marketers (such as WPP Group) were now in control of the Web. They drove the new features of the most popular websites. For that reason Google and Facebook kept improving the "services" that tracked what people do on the Internet. In theory this vast database of individual behavioral patterns was meant to provide a personalized experience on the Internet. In practice its motivation was economic: marketers were willing to pay more for advertising on platforms that tracked individual behavior. For the marketers it meant the difference between a shot in the dark and a targeted advertisement. Targeted adverts were obviously much more valuable. Google's and Facebook's founders kept swearing that there was no "Big Brother" in the future of their worlds, but a sort of Big Brother was precisely the ultimate outcome of their business plans.

An event created even more anxiety. In August 2010 Google signed a deal with Verizon that basically terminated the "net neutrality" principle. The Internet had been born as a noncommercial platform. Therefore it had been easy to enforce the principle that every piece of data, no matter how powerful its "owner", should move at the same speed through the vast network of nodes that physically implement the Internet. That principle was never codified in a law, but it had been respected even during the dotcom boom, allowing, for example, Amazon to grow at the expense of the traditional bookstore chains and Netflix to hurt the business of the traditional movie theater chains. The Google-Verizon deal represented a major departure because it specifically stated that the age of wireless access to the Internet was different in nature. Since the future of the Internet was obviously wireless, this was not a hypothetical scenario: it was a promise.

Changing Nature of Innovation

Significant changes were occurring in the very focus of innovation.

Technological innovation was no longer what it used to be, especially for the software industry. Silicon Valley had become the testbed for extremely complex system engineering (rather than the testbed for new ideas). The real value was to be found not in the products themselves (their features and their looks) but in the complexity that surrounded and enabled their success. Oracle (the distributed ERP system), Apple (especially iTunes), Google (billions of simultaneous searches and videos), HP and VMware (the cloud), and Facebook (billions of simultaneous posts) were ultimately engaged in solving very complex problems of scale. The customer paid for a product (and sometimes didn't even pay for it), but in reality the customer was buying an infrastructure. For example, Google's great innovation in its early years was the search engine, but the rest of Google's history was largely a history of acquiring or copying other people's ideas: Google's real contribution consisted in turning rudimentary platforms (such as Android and Maps) into formidably robust distributed platforms.

The big Silicon Valley companies (notably Google and Oracle) were now growing through acquisition. They specialized in turning around not businesses but platforms. They had become extremely efficient reengineering factories.

The hardware industry was prospering, but knowing that physical limits would soon be reached. In particular, it was getting more and more difficult to achieve faster speeds without generating unmanageable heat. Increasing the clock speed of a chip also increases its power consumption, which in turn increases the heat it generates. The stopgap solution had been to squeeze multiple processors onto the same chip, but that was becoming impractical too. There was a general feeling that the age of the CMOS transistor was about to come to an end. For some the solution was to move away from transistors. Stan Williams at Hewlett-Packard was working on "memristors" based on titanium dioxide. Transistors are "volatile", i.e. they must be continually powered in order to preserve information, whereas memristors would be a nonvolatile technology that only needs to be powered when "reading" or "writing" information (checking or changing its state). At Stanford University the Robust Systems Group led by Subhasish Mitra was researching nanocircuits: the smaller the circuit, the smaller its demand for power and the less heat it generates. Mitra was experimenting with carbon nanotubes.
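
The heat problem can be made concrete with the textbook first-order approximation for the dynamic power of CMOS logic (a standard formula, not one taken from the companies mentioned above):

$P_{\text{dynamic}} \;\approx\; \alpha \, C \, V^{2} \, f$

where $\alpha$ is the fraction of gates switching per cycle, $C$ the switched capacitance, $V$ the supply voltage and $f$ the clock frequency. Once $V$ could no longer be lowered much further, every increase in $f$ translated almost directly into more watts to dissipate, which is why the industry first retreated to multiple slower cores and then began looking beyond the CMOS transistor altogether.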

The change here reflected the new dynamics of the computer industry. In the past change had been driven by the demands of bureaucratic, military or space applications, i.e. by government demand, and to some extent by the needs of corporate America. Now change was driven by the insatiable appetite of consumer electronics.

Another concept that had not changed in decades was the one underlying magnetic data storage. In 2011 Andreas Heinrich's team at IBM's Almaden Research Center reduced from about one million to 12 the number of atoms required to store a bit of data. In practice, this meant the feasibility of magnetic memories 100 times denser than the most popular hard disks and memory chips.

Quantum computing was, instead, still largely ignored in Silicon Valley. The idea of using the odd features of Quantum Physics to create supercomputers dated from 1982, when one of the greatest physicists, Richard Feynman, speculated that a computer could store information by exploiting precisely the principle of quantum superposition (that a particle can be in two states at the same time), using "qubits" (quantum bits) instead of binary bits. In 1994 Peter Shor at Bell Labs devised an algorithm showing that a quantum computer could outperform a classical computer on a category of difficult mathematical problems (notably the factoring of large numbers). In 1997 British physicist Colin Williams and Xerox PARC's Scott Clearwater published "Explorations in Quantum Computing", a book that actually described how a quantum computer could be built. Vancouver's D-Wave, founded in 1999 by quantum physicists Geordie Rose and (Russian-born) Alexandre Zagoskin of the University of British Columbia, aimed at doing precisely that. In february 2007 D-Wave finally demonstrated its Orion prototype at the Computer History Museum in Mountain View. In 2009 Yale professor Steven Girvin unveiled a macroscopic quantum processor (quantum processors were supposed to be microscopic). In may 2011 D-Wave announced the sale of its first commercial quantum computer (purchased by Lockheed for the Quantum Computing Center that it established with the University of Southern California in Los Angeles), although purists debated whether it qualified as a real "quantum" computer.
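
In the standard notation of the field (independent of any particular machine, including D-Wave's), a qubit is a superposition of the two classical states

$|\psi\rangle \;=\; \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^{2} + |\beta|^{2} = 1,$

and a register of $n$ qubits is described by $2^{n}$ such amplitudes at once; this exponentially large state space is the property that Feynman and Shor hoped to exploit to outrun classical computers on selected problems.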

The Electronic Empire

The world economy was hit hard in september 2008 by a financial crisis that started in the USA, due to reckless banking speculation. The crisis was mostly a Wall Street problem, but it dragged down the entire economy. However, high-tech companies in the Bay Area did relatively well, most of them reemerging after two years unscathed: the dotcom crash had been useful to teach them how to trim costs quickly in the face of an economic downturn.

Silicon Valley had traditionally been indifferent to national and especially international events. Even the various wars that the USA fought (World War I, World War II, Korea, Vietnam, the first and second Iraqi wars, Afghanistan) were perceived as distant echoes and, quite frankly, as business opportunities (despite the strong anti-war sentiment of San Francisco and Berkeley, yet another contradiction in the ideological dynamics of the Bay Area). This was true because of the relative isolation of California, but also because for a long time Silicon Valley's core business did not need anybody else. Silicon is the second most abundant chemical element of the Earth's crust after oxygen. The entire semiconductor industry had a colossal advantage over other industries: its primary element was cheap and widely available. There was no need for wars like the Gulf War and no wild price spikes like those of gold. Software had an even smaller resource footprint: a software engineer just needed two fingers to type on a keyboard. Until the 1990s Silicon Valley saw the rest of the world either as a market or as an outsourcing facility. The age of the cell phone, of the videogame, of the digital camera and of the digital music player, instead, relied on another material, tantalum, which was far less abundant. Tantalum was obtained from coltan (columbite-tantalite), and one country held the largest reserves of coltan: the Democratic Republic of Congo. That nation also held another record: the bloodiest civil war since 1998. Reports started circulating that the civil war was funded by the smuggling of coltan towards (eventually) the Western companies that needed it for their consumer electronics products. At the same time stories also appeared of the terrible working conditions in the Chinese factories of Shenzhen, where hundreds of thousands of low-paid workers assembled electronic devices for USA manufacturers (notably Apple). Meanwhile, lithium batteries, widely used in mobile electronic devices, obviously relied on lithium being available and cheap, but about 50% of the world's reserves of lithium were located in Bolivia, which had just elected a socialist president with an anti-USA agenda. China had become the biggest producer of "rare earths" (vital to cell phones, laptops and greentech), and pressure was mounting to reopen the California mines that used to dominate the world markets. The 50 million tons of electronic waste generated worldwide each year were mostly sent for disposal to developing countries, where valuable metals were converted into scrap metal for resale at the cost of human lives and environmental disasters. At the same time California was going through very turbulent financial times for the first time in its history, with one of the highest unemployment rates in the country, slow growth and a chronic budget deficit. Silicon Valley had risen (from the 1940s to the 1990s) in a state that was booming. Now it was embedded in a state that was failing. Putting it all together, Silicon Valley was no longer insulated from global politics as it had been for decades.

It was not a coincidence that in 2010 two of its most popular executives, Meg Whitman and Carly Fiorina, entered the national political scene. Even more relevant was the way that technology produced in Silicon Valley influenced world events: in 2009 the Iranian youth took to the streets using Twitter to coordinate and assemble protests against the rigged elections; in 2010 students armed with cell phones and social-networking software overthrew the Tunisian dictator; and in 2011 it was a webpage created by Google executive Wael Ghonim that indirectly started the mass demonstrations against the regime in Egypt. In response to the events, the Egyptian government took down the Internet. A few days later Google launched a special service to allow the protesters in Egypt to send Twitter messages by dialing a phone number and leaving a voice mail: the purpose was not to help Twitter (a competitor) make money but to help the protesters win the revolution. Silicon Valley was becoming a political player largely independent of any government.

Google vs Apple vs Facebook vs...

During the 2000s Google had largely represented the epic battle between the Web-based world and the desktop-based world of Microsoft. Google had won the ideological battle, as even Microsoft was beginning to move towards cloud-based applications; but it made little money out of it. Mostly it was the big virtualization platforms that were benefiting from this epochal switch in computing platforms.

Google and Facebook were growing thanks to business plans that relied almost exclusively on selling advertising. They both offered a free service (in Google's case, many free services) based on content (text, images, videos, posts) provided for free by millions of "volunteers" (the public of the Web). And then they made money by selling advertising space to companies eager to publicize their products to the viewers of all that content. Neither Google nor Facebook was creating any content: they were simply living parasitically off other people's content. Neither had found a way to make money other than through advertising techniques. From a strategic point of view, though, there was a difference between the two.

Google's search engine had been invincible for a few years, but by the end of the decade its business looked increasingly fragile compared with other kinds of businesses. It was becoming apparent to Google's own management that the cost for a user of switching to one of the newer (and perhaps better) search engines was virtually zero. The network effect of a search engine (how much the value of the product depends on the number of people using it) is low by definition. Facebook, on the contrary, enjoyed both high switching costs that kept users from leaving and a very strong network effect (the more people used it, the more valuable it became). No surprise then that in 2011 Google announced Google+, its second attempt at establishing a viable social-networking platform (after the embarrassing Buzz). By then Facebook had passed 750 million users.
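
The network-effect argument is often summarized by Metcalfe's old rule of thumb (a heuristic, not a claim made by either company): the value of a network with $n$ users grows roughly with the number of possible connections between them,

$V(n) \;\propto\; \frac{n(n-1)}{2} \;\approx\; \frac{n^{2}}{2},$

whereas the value of a search engine to any individual user is largely independent of how many other people use it; Facebook's hundreds of millions of interconnected users thus formed a moat that no volume of search traffic could replicate.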

At the same time Google invested in the smartphone market. In July 2008 Apple had launched the App Store for iOS applications (an iOS device was an iPhone, iPod Touch or iPad). By july 2011 the App Store had 425,000 apps (uploaded by thousands of third party developers), downloaded 15 billion times, on 200 million iOS devices. By then Google's equivalent, the Android Market, had 250,000 applications, downloaded 6 billion times, on 135 million Android devices; and it was growing faster: Google was activating new Android devices at the rate of 550,000 per day. By the end of 2011 Android smartphones owned 46.3% of the market and Apple iPhones owned 30%, leaving RIM (14.9%) and Microsoft (4.6%) way behind: there was now one area in which Silicon Valley alone beat the rest of the world.

Google's management, probably aware that Google's success stories tended to be the ones acquired from others as opposed to the ones developed internally, launched Google Ventures, the venture-capital arm of the company. Basically, it was beginning to make sense for Google to invest its huge cash reserves into other companies rather than into Google's own R&D. Google was trying yet another revolutionary business model: to become an incubator of startups (infotech, biotech and cleantech), an incubator that offered a much more powerful infrastructure than a venture capitalist could, starting with office space at its Googleplex and computing power in its gargantuan server farm. Google even offered a huge bonus ($10,000 in 2011) to any employee who suggested a startup resulting in an actual investment.

Three major development platforms were competing for world domination: the Facebook Platform (launched in 2007), the iPhone App Store (in 2008) and the Android platform (2007).

The company that was rapidly losing its sheen was Yahoo! Its advertising business remained strong but Yahoo! had done the opposite of Google: invested in creating content, even hiring journalists and bloggers. Yahoo! had basically moved towards becoming a media company at a time when Google and Facebook had been moving towards the more cynical model of purely exploiting the content created by their users. In a sense, Yahoo! still believed in quality at a time when Google and Facebook were proving that quality did not matter anymore and that advertising revenues depended almost exclusively on quantity. To make matters worse, Yahoo! had not contrived a way to have its users disclose personal information the way Google and Facebook had, which meant that Google and Facebook (happy to violate as much privacy as tolerated by the law) could offer targeted advertising to their advertising customers.

Yahoo! alumni seemed to be more creative than the company itself. In 2009 two former Yahoo! employees, Brian Acton and Jan Koum, founded WhatsApp in Santa Clara to provide instant messaging over the Internet Protocol for smartphones (Android, BlackBerry, iPhone, etc). While Facebook and Google were not paying attention, WhatsApp did to SMS what Skype had done to the old telephone service: it turned it into a computer application. By the end of 2009 WhatsApp already had one million users; but that was just the beginning of its meteoric rise: one year later there were ten million, and by the end of 2012 WhatsApp had passed the 200-million-user mark. In 2014 Facebook acquired WhatsApp for about $19 billion. By then WhatsApp had 450 million users and was growing faster than Twitter, but it had always refused the advertisement-based model of the major dotcoms and was therefore making very little money.

Another front in the multiple wars that Google was fighting was the intellectual-property front. In November 2010 a consortium of Microsoft, Apple, Oracle and EMC paid $450 million for 882 networking-software patents owned by Novell, or about $510,000 per patent. They were not interested in Novell's technology, but simply in protecting themselves from potential lawsuits. In June 2011 a consortium that included Microsoft, Apple, Ericsson, Sony, EMC and Research In Motion purchased the 6,000 patents for wireless communication owned by the bankrupt telecom Nortel. They paid $4.5 billion, or $750,000 per patent. Microsoft now co-owned the Nortel patents for voice services and the Novell patents for Linux, while Apple co-owned the Nortel patents for manufacturing and semiconductors, and Oracle owned the patents for Java (acquired from SUN). Google's Android was left naked. In August 2011 Google paid $12.5 billion for Motorola's smartphone business, which owned 24,500 patents, or about $510,000 per patent. The purpose became clear a few days later: Apple had filed multiple lawsuits against Android-based smartphone manufacturers like HTC and Samsung, accusing them of "copying" the iPhone, and now HTC was in a position to sue Apple using some of the patents purchased by Google.

During the second decade of the century Silicon Valley was devastated by such patent wars. Google, Apple, Oracle and others filed patents by the thousands (and bought even more) and then filed lawsuits against each other. A patent was becoming a weapon of mass destruction: the richest corporations could file patents on the most trivial ideas and use them to prevent competitors from implementing entire families of products. The main losers were the small companies and the startups: they did not have an army of attorneys to file patents, nor an army of attorneys to fight the big corporations in court. We will never know how many inventors were on the brink of creating revolutionary products and either gave up, were forced to give up or were acquired by bigger companies. The process of filing a patent was considered ridiculous by the very corporations that indulged in it, but there was no chance of changing it: those same corporations hired an army of lobbyists in Washington to counter any attempt to rectify a clearly flawed process. A patent is a license to sue. It rarely represents a true innovation. In most cases it represents the exact opposite: an impediment to innovation, and even a blunt threat to anyone who dares to innovate. Companies with limited cash were forced to spend much of it fighting lawsuits instead of funding research and development. In other words: more and more money was being spent on lawyers rather than on research.

The Saga of Apple

In July 2011 Apple had more cash and securities ($76 billion) than the cash reserves of the government of the USA. Google was on a buying spree (basically an admission of inferior technology) while Apple rarely bought anything from others (an impressive demonstration of technological superiority). Apple had always been a strange company, a distant second to Microsoft in operating systems for personal computers, but possibly better respected than Microsoft (certainly so in Silicon Valley), as if Microsoft's rise was mere luck while Apple's survival was pure genius. Meanwhile, it continued to refine its MacOS to the point that Microsoft's Windows looked positively troglodytic to the Apple base. The iPod and the iPhone increased Apple's reputation for designing wildly appealing products, even though neither fully dominated its market (there were lots of digital music players, and Google's Android was growing a lot faster than Apple's iOS). MacOS and iOS had, however, an incredible following worldwide, unmatched by any other desktop or mobile software platform. And, last but not least, Apple ruled the tablet market with the iPad (almost 70% of the market in mid-2011). Apple had never embraced social computing (just like it had been slow to embrace the Internet to start with), and it was late in cloud computing (iCloud was announced in June 2011), but there was a general feeling that Apple did things only when it was capable of stunning the world. No other company could afford to be so late to the market and still be expected to make a splash. The philosophical difference between Google and Apple was even wider than between Google and Microsoft: Apple still conceived the Web as a side-effect of computing, not as the world inside which computing happens, whereas Google (whose slogan was "nothing but the Web") was pushing the vision of the Web as "the" computing platform.
The difference in business models was even more profound: Google was making (mostly buying) relatively trivial technology and letting people use it for free (with the advertisers footing the bill), de facto turning the Web into a giant advertising billboard; while Apple was expecting people to pay a premium for its devices and services, just like a traditional brand of goods. In fact, one of Apple's great and unlikely success stories was its retail stores: the "Apple store" was popular worldwide, making Apple one of the most valuable brands in the world. In July 2011 revenues from its retail stores were $3.5 billion, up from $2.6 billion the previous year (2011 was a terrible year for Western economies as a whole), and Apple was planning to open 40 new stores in the following six months, mostly abroad. Google was trying to make computing more or less free, while Apple was trying to make computing as fashionable as cosmetics and apparel. Apple had tried this before, when its closed platform Macintosh had competed with Microsoft's open platform Windows. Now it was the relatively closed iPod, iPhone and iPad versus Google's Android platform.
In many ways the iPod, the iPhone and the iPad marked a regression in computing. They were computers, but limited to a few functions (which they performed very well). For the sake of making those functions as mobile as the transistor radio (the first great mass-market portable electronic device), Apple reduced the power of the computer. In theory application developers could add their own applications to the iPhone, but in practice Apple had veto rights over which applications were allowed (so much for freedom of choice and the free market).

When Steve Jobs, the ultimate icon and mythological figure of Silicon Valley, died in October 2011, his legacy was the legacy of the whole of Silicon Valley: a new discipline that, borrowing other people's inventions, was not solely about the functionalities of a product and was not solely about the "look and feel" of a product, but was very much about the way the human mind and the human body should interact with technology. It was a discipline born at the confluence of the utopian counterculture of the Sixties, the tech hobbyist culture of the 1970s and the corporate culture of Wall Street; and it was about creating a new species, a machine-augmented Homo Sapiens (Homo Sapiens Plus?). Just like Silicon Valley as a whole, Jobs had mostly copied other people's ideas, but turned them into existential issues. Jobs elevated to a sophisticated art the idea that producer and consumer should engage in one and the same game: exhibitionism; and then congratulate each other, and even worship each other like fraternal deities.

It was telling that very few people noticed the death, also in Palo Alto and just six months later, of Jack Tramiel, the founder of Commodore, the first company to sell a million units of a personal computer and whose Commodore 64 outsold the Apple II four to one.

Silicon Valleys in Time

If a historian specializing in technological evolution had examined the world a century ago, s/he would never have bet on a primitive, underpopulated and underdeveloped region like the San Francisco Bay Area. S/he might have picked a place in the USA (most likely Boston, New York or Philadelphia), but more likely a place in Western Europe, probably somewhere between Oxford and Cambridge. With 20/20 hindsight everybody has a theory about why it all happened in "Silicon Valley" (which for our purposes really means the broader Bay Area), but most of those theories are easily disproven if one studies other regions of the world: the same conditions existed somewhere else, and to an even greater degree.

One needs to spend more time analyzing previous cases of economic, technological and cultural boom. Three obvious candidates are Athens in the 5th century BC, the Firenze (Florence) of the Renaissance and the electrical Berlin of a century ago. There was little that made Athens truly special. Other cities might have succeeded better than Athens, particularly the ones on the coast of what is today Turkey that were the cradle of Greek civilization. However, Athens was probably the nicest place to live: the attitude of its citizens was different, somewhat eccentric for those times. Eventually it was that attitude that led to the invention of democracy and capitalism. I would argue that the real protagonist of Athens' renaissance was that very attitude (harder to pinpoint than the wars and political upheavals that historians describe in detail).

It may have been easier to predict Firenze's (Florence's) ascent, given that the city had been getting richer since at least the 12th century. However, who would have bet on a city-state within a peninsula (Italy) that was mostly famous for endemic warfare? And if you had to pick an Italian city-state, why not Venezia (Venice), which was creating a little empire (not surrounded by dozens of city-states like Florence was in Tuscany) and which constituted a crucial link between the superpower (Constantinople) and Western Europe? Again, what stands out is the attitude of the Florentines: if you liked adventure and innovation, it was nicer to live in Florence than in greedy, narrow-minded Venice, and eventually Florence did produce more liberal regimes and enlightened dictators instead of Venice's faceless dogi.

The "electrical Berlin" of the early 20th century came out of a black hole: Germany did not even exist 30 years earlier. Germany was the last place in Europe that was still politically divided in tiny medieval-style states the late 19th century. When Germany got unified, the spirit of unification certainly fueled nationalistic pride but, again, there was hardly anything special about German science and technology up to that point (there was indeed something special about German philosophy and German poetry, but one can argue it actually went against progress). What was unique about Berlin at that time was the enthusiasm of its population: the attitude, again, was indeed unique. In all three places it was the attitude (the spirit) of the population that was markedly different from the norm. In all three places that attitude rewarded the independent in ways that were not the norm, and it instituted a stronger degree of meritocracy than elsewhere.

The same might be true also for the San Francisco Bay Area. The great universities, the mass immigration and the venture capital (knowledge, labor and money) came later. What was already there was the spirit of the Frontier, the spirit of the eccentric independent explorer that later would become the hobbyist and the hacker.

Silicon Valleys in Space

There have been many attempts to recreate Silicon Valley in other countries. It was worth examining the ones in Western Europe that at the time led the world in universities, immigrants and capital.

France created Sophia Antipolis, a technology park in southern France. First of all, it was created by the French government with a socialist-style centralized plan. The region has become a magnet for foreign IT companies that want a foothold in Europe, but hardly the generator of domestic startups that the Bay Area is in the USA. A few factors make a huge difference: there is social pressure to join big corporations, not to start small companies; if you do open your own company, failure is terminal; very few foreign talents have been (permanently) attracted to the area; and, on the contrary, many of the French talents trained there have emigrated to the USA, where they did start the kind of company that they would not start in France. Note that, by all accounts, the quality of life in southern France matches if not surpasses the quality of life in California.

The Munich metropolitan area in southern Germany has become another high-tech hub. In this case the German government did not quite plan a Silicon Valley per se: it was the defense industry that brought advanced manufacturing to the area, in ways not too different from how the defense industry bootstrapped Silicon Valley's high-tech industry. The advanced manufacturing that led to the success of companies like BMW transformed an essentially rural community in Bavaria into a high-tech hub. Here too there are excellent educational institutions, the Fraunhofer Institute and the Max Planck Institute, which provide world-class public education. The quality of life is quite high by European standards (and the weather is much better than in most of Germany). The socialist underpinning here is represented by the fact that the main "venture capitalist" has been Bayern Kapital (an arm of the state government). The region has indeed spawned a varied fauna of infotech, biotech and cleantech startups, just like Silicon Valley. The region has also avoided the brain drain that consumes most of Western Europe: relatively few German entrepreneurs and engineers have moved to the USA. However, this region too has failed to attract significant numbers of foreign talents.

Four factors made Germany completely different from the Bay Area. First of all, Germany's fundamental problem was the high cost of its labor (about ten times the cost of labor in China in 2010). Therefore the whole nation was engaged in a collective, distributed project to devise ever more efficient ways to manufacture goods, and both universities and corporations focused their research on innovating the manufacturing process rather than innovating the products made through those processes. The German system is biased towards perfecting existing technology rather than creating new technology. RWTH Aachen spent billions of euros to create a technology park that specializes in just manufacturing techniques; Stanford's technology park was never meant for just one specific application of technology. Secondly, the relationship between industry and academia has always been different in Germany than in the USA. German corporations fund academic research that is very specific to their needs, whereas universities in the USA receive money that is generically for research. This means that the transfer of know-how from academia to industry is much smoother and faster in Germany than in the USA, but at the same time the students are raised to become workers and then managers in the existing corporations rather than to start new creative businesses. Thirdly, Germany's big success story could also be a curse: Germany achieved an impressive degree of distribution of higher education, spreading world-class research institutes all over its territory (the Max Planck Institute had 80 locations in 2010, the Fraunhofer Society had 60), but this also means that most of the bright scientists, engineers and entrepreneurs don't need to move to another city, because they can find a top-notch technological center right where they live. Silicon Valley was mainly built by immigrants (from other states of the USA and from abroad). It was the one place where everybody converged to do high-tech, because most of the rest of the country did not have conditions favorable to the high-tech industry. Germany provides those conditions in dozens of regions, and therefore none of them can become the equivalent of Silicon Valley. Finally (and this is true for all of continental Europe), German industry has to deal with a strong anti-technological and anti-capitalist sentiment that was created over the decades by an alliance of socialists, environmentalists, hippies, philosophers, psychologists and so forth.

So far the main success stories of Europe have come from these regions: SAP (Heidelberg), ARM (Cambridge) and Nokia (Oulu). The European region that came closest to resembling Silicon Valley (although at a much smaller scale) was Oulu in northern Finland, where more than a thousand startups were born in the 2000s, most of them in wireless technology, but also in biotech, cleantech and nanotech.

Israel was, actually, the country with the highest venture capital per person in the entire world ($170 versus $75 in the USA in 2010). Its startups focused on the military, communications, agricultural and water technologies that were essential for the country's survival. Many of these startups were acquired by Silicon Valley companies, but none of them managed to grow to the point of becoming an international player.


(Copyright © 2014 Piero Scaruffi)
