"The original Silicon Valley meaning of a disruptive company was one that used its small size to shake up a bigger industry or bloated competitor. Increasingly, though, the [TechCrunch Disrupt] conference stage was filled with brash, Millennial entrepreneurs vowing to 'Disrupt' real-world laws and regulations in the same way that me stealing your dog is Disrupting the idea of pet ownership."
- Paul Carr1
Welcome to the post-disruption era. Yes, here, now, disruption has been disrupted. Broken after having moved too quickly, forgiveness always already asked in lieu of permission. Disruption has subverted itself, fortifying the very notions it purports to disrupt. A false dialectic, generating itself while obscuring its own vacuity. Post-disruption: an ahistorical narrative caught in its own circular reference, perpetuating the myth of itself in service of its antithesis.
OK, OK, pump the brakes for a second. There's no need to get bogged down in the morass of postmodern obscurity quite yet. We're talking "disruptive" here - the sexy, the exciting, the new. Technology. Innovation. Silicon Valley. The internet. Touchscreen computers that fit in your pocket and algorithms that know what you want before you even want it. Google. Amazon. Facebook. The leaders of the new economy, striving to organize the world's information and make everything more open and connected. The movers and the shakers. The disruptors.
Change. That is what our disruptors are fighting for. And not just change for the sake of change, but change for the better. Upending not only stodgy corporations but also slick politicians, sensationalist newsmen, and even the darker sides of ourselves. The underdogs, leveraging the power of the cloud and big data to solve the world's problems. Building a future where software and the internet will make anything possible. Creating a more social, more mobile world.
So where does the "post" come in? Well, as it so happens, this colloquial understanding of disruption has become false. Though our disruptive heroes may have, at one time, fought for change and the common good, they now find themselves firmly invested in the status quo. The disruptive narrative expounded by pundits of the technology industry is at best contradictory and at worst meaningless, nothing more than rhetoric used to advance the agenda of the new tech oligarchy.
"Disruption" is now an example of postmodern imperialism at its best, disrupting economics and sociology while inventing an artificial narrative of how the world works. In the spirit of social theorist Jean Baudrillard, "disruption" is its own pure simulacrum, a self-perpetuating simulation of itself bearing no semblance to actual history or fact.2
The notion of "disruption" has not always been this artificial. At its inception, "disruption" could be meaningfully used to describe empirical business trends. Appeals to "disruptive innovation" were reflective of real events, instances when new products displaced existing products by offering similar performance while also addressing the needs of new customers.
But that concept of "disruption" has degraded over time. First, "disruption" was perverted, narrowed to exclusively describe the technology industry while also broadened to imply revolutionary change. Though this perversion may have been an accurate, if hyperbolic, portrayal of startup tech companies at the turn of the century, it grew to be inaccurate as those companies displaced the established firms they set out to disrupt. "Disruption" came to mask its own absence; "disruptive" companies created the appearance of change while simultaneously fortifying their own market position. Eventually, "disruption" became untethered from reality entirely and receded into meaninglessness, existing only as a positive term deployed in service of technology's progressive narrative.
"Disruption" is now used to rationalize the astonishing wealth and influence Silicon Valley leaders have amassed while the rest of society has languished. It paints tech companies as both revolutionary and naturally benevolent to woo employees and regulators while executives and investors funnel ever larger fortunes into their own pockets. Meanwhile, average incomes have stagnated, policies to protect workers have eroded, and broad-based democratic engagement has declined. Worse yet, many recent "innovations" are simply old ideas in new attire, offering little more than minor conveniences to a narrow, privileged audience. "Disruption," it seems, has become little more than a marketing term used to paint those innovations in a positive light and justify the ascent of tech industry leaders.
This perversion of "disruption" is similar to the gradual evolution of postmodernity Baudrillard observed over the course of the 20th century.3 But, as technology pundit and Google fanboy Jeff Jarvis put it, "accepted wisdom has it that internet time moves quickly; that we are living through change at an unparalleled pace; that our modern minutes are but 10 or 20 seconds long."4 And so our story of disruption begins not a century ago but as recently as 1997, with Clayton Christensen's publication of The Innovator's Dilemma.
DISRUPTING HISTORY
Clayton Christensen couldn't sleep. Despite a degree from Harvard Business School, a successful stint at the Boston Consulting Group, and experience as the founder and chairman of a technology company, he found himself plagued by a persistent question: why is success in business so difficult to sustain? Though we had nearly a century of management theory under our belts, we seemed no closer to predicting the continued success of an enterprise.
So, nearing middle age and with a new baby on the way, Christensen left the business world to pursue an academic career. He dug deep into the history of 20th century industry, and after several years of intense work he came to a startling conclusion: companies that do everything "right" can fail precisely because they so successfully adhere to prevailing management theory. Market-leading companies that are managed effectively, listen closely to their customers, and invest heavily in research and development can still fall victim to disruptive innovation.5
Christensen theorized that market-leading firms are actually quite adept at developing "sustaining" innovations that "foster improved product performance." However, they are often caught flat-footed when faced with "disruptive" innovations that, at first, "result in worse product performance," but "bring to a market a very different value proposition than had been available previously."6 Over time, these disruptive innovations are able to surpass the capabilities of legacy technologies and destabilize market-leading firms.
Christensen uncovered this trend in the tumultuous history of the disk drive - those little spinning disks that used to provide long-term memory for most computing devices, from mainframes to laptops. The disk drive industry, it seems, was racked with disruption in the 1970s and 1980s; new companies rose to dominance every few years, only to be washed away shortly thereafter by a new wave of disruptive entrants.7 Even though the rapid cycle from boom to bust was well known in the industry, market leaders seemed completely incapable of holding off the advances of their successors.
The problem, it seems, was that disks kept getting smaller. Each drop in diameter - 14" to 8" to 5.25" to 3.5" - was accompanied by, at first, a decline in capacity; smaller drives offered less storage space (often at a higher price) than their larger counterparts. But while existing customers of disk drives had no interest in innovations that couldn't deliver more cost-effective capacity, "new customers," Christensen notes, "were willing to pay a premium for other attributes that were important to them - especially smaller size."8
Drive size compression is Christensen's prime example of disruptive innovation. Smaller drive manufacturers disrupted the disk drive market by not even addressing the traditional market at first; existing customers wanted higher capacity at lower cost, and new manufacturers delivered the opposite. But new markets - that is, new customers with distinct needs - emerged in parallel, fueling demand for smaller drives. While the mainframe computer market relied on cost-efficient 14" drives, minicomputers required the smaller size of 8" drives. Personal computers, in turn, ran on 5.25" drives, and portable devices on 3.5" drives.
Startup disk drive manufacturers addressed these small new markets with small new drives, and rode market growth and sustaining innovation to supplant their established, larger drive predecessors. By addressing a new market and continuing to improve the capacity of their drives, new entrants were eventually able to compete with and even outperform the established manufacturers of larger drives.
At first, Christensen thought this trend of disruption - new products opening up new markets then advancing to address existing customers - was exclusive to the computing industry. But, "one by one," Christensen says, "people read the research and said this is 'exactly what is happening in my industry.'"9 With a closer look, it seemed, every industry exhibited some traces of disruption, from steel manufacturing to backhoes.
The disruption of department stores by discount retailers in the 1960s provides a clear example: through operational efficiencies, new retailers were able to offer quality goods at prices lower than had been available before, putting retail goods within the grasp of a wholly new population of consumers. Discount retailers disrupted traditional retail by, at first, not even addressing traditional retail customers; instead, they "focused on the group of customers least attractive to mainstream retailers: 'young wives of blue collar workers with young children.'"10
But it was not enough for discount retailers to reach a new audience alone, for that would label them as "new" rather than "disruptive." They also had to steal customers from traditional retailers, which they were able to do as product quality improved while prices remained low. While not all customers switched from department stores to discount retailers - disruption does not necessarily require that new companies completely supplant their predecessors - more price-sensitive customers gradually shifted away from traditional retail.11
By addressing a new customer base, disruptive innovations typically provide some consumers - often cost-conscious consumers - with access to products and services that were previously unattainable. Smaller disk drives made personal computers available to the middle class, while discount retailing put quality products within the reach of lower-income families. At the same time, by addressing the needs of some existing customers, disruptive innovations inevitably steal some customers from established firms, and may displace those firms entirely.
In some ways, this paints a startlingly optimistic picture of "disruption." Though the word itself carries connotations of destruction and destabilization, it actually seems to best describe instances of empowerment and renewal. Perhaps this benevolent tone contributed to the term's perversion in the years following the publication of The Innovator's Dilemma, when disruption enthusiasts were quick to trumpet the almost divine capacities of the disruptor.
But the true denaturing of disruption has its roots in a much uglier term. You see, Christensen's "disruption" is actually strikingly similar to another bit of business jargon: "cannibalization." While new products may address new markets, they can be cannibalistic if, at the same time, they steal customers away from existing products.
Disruption, like cannibalization, is when one product devours another. Perhaps that is what Silicon Valley luminaries are getting at when they proclaim that "software is eating the world."12
DISRUPTING TECHNOLOGY
"We are in the middle of a dramatic and broad technological and economic shift," claims venture capitalist Marc Andreessen. "Over the next 10 years, I expect many more industries to be disrupted by software, with new world-beating Silicon Valley companies doing the disruption in more cases than not."13 Here is the creator of one of the first web browsers, a celebrity in the West Coast tech enclave, co-opting Christensen's disruption in service of Silicon Valley entrepreneurs.
The perversion of disruption did not happen overnight. Following the dotcom bubble in the late 1990s, the tech industry receded from the limelight, only to reemerge a decade later alongside the purported "revolution" of the social, the mobile, and the cloud. Without a strong pedigree, this newly rejuvenated industry found itself in search of a narrative, some justification for its renewed success and an answer for why, this time, things might turn out differently. And disruption proved the perfect story to tell.
"We are only in the opening stages of this next revolution," says TechCrunch contributor Danny Crichton, "but Silicon Valley and startups are firmly in the lead."14 Over the past several years, pundits like Danny - a venture capital associate and former Google summer intern15 - have made it conventional wisdom that technology companies are the sole source of disruption. Though it may be hard to believe, tech bloggers are able to able to cultivate this narrative of disruption across a wide audience - TechCrunch alone reaches over eight million people every month.16
It's not just tech blogs perpetuating the story of tech firms as disruptors, though. New York Times columnist Nick Bilton has been singing the tune of disruption since he started writing for the paper in 2009.17 In fact, his column on the technology industry is titled "Disruptions" - a weekly exploration of "how technology is shaping our lives."18
A new wave of management gurus has also latched onto the technology-as-disruptor narrative. "The disruption is everywhere," says writer Jeff Jarvis. "What makes technology a model is that it is in a state of constant disruption; it disrupts and deflates and rethinks and rebuilds itself constantly."19 Whatever that means. In fact, Jeff has made a name for himself blending pseudo-theory with unabashed tech fandom in the form of books like What Would Google Do? - "an indispensable manual for survival and success in today's internet-driven marketplace."20 Read, lest ye be disrupted.
But the old guard, too, has bought into the notion that technology companies are the locus of disruption. In 2013 and 2014, the prestigious management consulting firm McKinsey & Co. published a series on "Disruptive Technologies" featuring interviews with - you guessed it - the tech elite. None of the other industries Christensen makes note of - discount retail, or excavating equipment, or motor vehicles, or steel - make an appearance. Instead, we have Google Chairman Eric Schmidt and two other tech executives expounding on the disruptive power of technology alongside tech-centric academics Clay Shirky and Erik Brynjolfsson.21
Somehow, disruption had become an exclusive possession of tech entrepreneurs.
To be fair, overwhelming disruption is rare outside of the technology industry, as it is often difficult for new entrants to address the needs of every consumer. When buying clothing, style is a key consideration, but so are price, durability, and comfort. When booking a flight, it is essential that an airline fly where you want to go - but price, legroom, and customer service are important too. Kmart may have won over cost-conscious consumers, but fashion-focused shoppers likely stuck with Macy's despite higher prices. Virgin America may have "disrupted" the airline industry with better service, but there are still destinations where you have no choice but to fly United.
The technology industry is different. Historically, one criterion has trumped all others: cost per unit of computing capacity (e.g., processing speed or memory). Furthermore, computing capacity is typically improved upon at a regular and relatively predictable rate.22 So when a new disk drive manufacturer enters the market with a smaller, weaker device, it's not too hard to envision it matching the computing capacity of larger drives after not too long. Thus, technology is prone to "complete disruption," as new products can advance along a straightforward path of performance improvement to completely obviate the need for existing products.
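The crossover logic is easy to make concrete. Here is a minimal sketch in Python, with starting capacities and growth rates invented purely for illustration (they are not Christensen's figures):

```python
# Illustrative crossover model: a disruptive entrant starts with far less
# capacity than the incumbent, but improves along a steeper sustaining
# curve and eventually catches up. All numbers here are hypothetical.

incumbent_mb = 300.0    # incumbent drive capacity (MB)
entrant_mb = 20.0       # smaller entrant's starting capacity (MB)
incumbent_growth = 1.2  # incumbent improves 20% per year
entrant_growth = 1.5    # entrant improves 50% per year

year = 0
while entrant_mb < incumbent_mb:
    incumbent_mb *= incumbent_growth
    entrant_mb *= entrant_growth
    year += 1

print(f"With these assumed rates, the entrant matches the incumbent in year {year}.")
```

Given any steady advantage in improvement rate, the entrant's capacity inevitably crosses the incumbent's; the assumptions only change the timing.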
The fact that it is so disruption-prone made it easy for the technology industry to adopt disruption as its seminal narrative, and do so to the exclusion of other industries. Why mention modest disruption in retail or airlines, where new entrants may nibble a few points of share away from market leaders, when stalwarts of the technology industry are so often entirely devoured by renegade upstarts run by brash 20-somethings? It's simply a more exciting story to tell.
But disruption, taken as is, paints a relatively uninspiring portrait of rampant volatility - not the best image for a newly reinvigorated industry to project when positioning itself for long-term growth. Luckily, it wasn't a long leap to extend the characteristics of empowerment and renewal to position tech disruptors as the harbingers of broad, sustained quality of life improvement.
Take Google's mission statement, for example: "To organize the world's information and make it universally accessible and useful."23 Or Facebook's: "To give people the power to share and make the world more open and connected."24 The old guards of the new economy consistently profess altruistic motives for their capitalist enterprises.
Recent entrants take this pretension even further, with slogans that steer clear of tangible objectives in favor of vague commitments to make the world a better place. Mobile taxi app Uber, for example, is "changing the logistical fabric of cities around the world," using "technology to give people what they want, when they want it," and making "transportation as reliable as running water."25 Digital cloud storage provider Dropbox's mission is, apparently, "to simplify life for people around the world."26 And Tinder, the app that makes it even easier for teens and 20-somethings to hook up, claims to be "like real life, but better."27
Of course, you can't radically change the world by destabilizing established businesses alone. Accordingly, pundits have taken Silicon Valley's altruistic objectives to their logical conclusion: technology companies are improving our everyday lives by disrupting any and all existing institutions, social structures, and prevailing modes of thought. In this way, the tech industry has further distorted the notion of "disruption," ascribing revolutionary attributes to a formerly narrow and academic piece of business jargon.
The first target of this broad-based disruption is not an individual company or group of companies, but our modern capitalist economy as a whole. "Instead of being dominated by a few, giant tree-structured organizations, it's now looking like the economy of the future will be a fluid network of smaller, independent units,"28 claims Paul Graham, founder of the startup incubator Y Combinator. Of course, these smaller, independent units are startups, and "startups usually involve technology, so much so that the phrase 'high-tech startup' is almost redundant."29 Frankly, "startups may represent a new economic phase, on the scale of the Industrial Revolution," because "people are dramatically more productive as founders or early employees of startups... and that scale of improvement can change social customs."30
"In a poetic sense, the prime goal of the new economy is to undo - company by company, industry by industry - the industrial economy,"31 echoes Kevin Kelly, founding editor of Wired magazine. Venture capitalist Peter Thiel - co-founder of PayPal and early investor in Facebook - agrees: "Silicon Valley is going to be the center of the US economy for the next 10-20 years."32
Employment, too, must be disrupted. According to Danny, "with the rise of the On-Demand Economy, Silicon Valley is taking the lead on building a better environment for work... The algorithm today could do for workers what unions did in the 19th century: provide a vastly improved market for work, one that is simultaneously more convenient, safe, and lucrative... We have every opportunity to build a far more creative and dynamic economy, one that quickens the pace of human advancement and bestows a better life on all of mankind."33 It truly is God's work the millionaires of Menlo Park are undertaking.
So why not make everyone a tech worker? "Coding is the blue-collar job of the 21st century," according to venture capitalist and early Facebook employee Chamath Palihapitiya. "Learn to code; everything else is secondary. College doesn't matter that much. It is the most important job of the next hundred years."34
But not only work and commerce need to be disrupted - politics and society must be, too. Luckily, Silicon Valley is leading the way in giving every citizen a voice, facilitating peaceful discourse, and enabling collective action. "There are 5 billion people in the world who don't have Internet connections," Astro Teller, chief visionary of the Google X experimentation lab, reminds us, "and there's very little that would cause the world to be more at peace, more prosperous than getting the other 5 billion people on the planet connected."35 Facebook CEO Mark Zuckerberg affirms: "The internet as a whole and social media will bring reconciliation and peace."36 After all, the internet is "the key driver of social and economic progress in our time."37
With the democratizing power of the internet, the tech industry could thoroughly disrupt the government. "If the geeks take over the world - and they will," Jeff proclaims, "we could enter an era of scientific rationality in Washington. Other nonpoliticians have improved government. Michael Bloomberg runs New York City as a business. Arnold Schwarzenegger rules California with the power of personality. A Google guy might just run government as a service to solve problems." How might such a government work, you ask? "I'd like to see citizens use the Web to establish personal political pages," he asserts. A "Facebook of democracy," or something like "Google as the polling place that never closes... a platform for organizing citizens."38
After all, in the eyes of Silicon Valley luminaries, the way we are governed today is broken. "The gears are grinding together in government, and it's slow and complicated and no one understands it." That's Bill Maris, managing partner of Google Ventures, the in-house venture capital firm at Google. "Great things are usually not accomplished in Silicon Valley through government policy, they are accomplished by individuals who set out to change the world, invent something, create a better live [sic] for themselves and their children."39
Troublingly, many of these claims about the broad and benevolent disruptive power of technology are either misleading or unfounded. First and foremost, the federal government played a critical role in the foundation of Silicon Valley and the creation of the internet. As historian Leslie Berlin notes, "the government, in effect, served as the Valley's first venture capitalist" by bankrolling early microchip and computer manufacturers for defense purposes in the mid-twentieth century.40 The government also developed the fundamental infrastructure and networking technology that preceded the modern internet, including ARPANET in the 1970s and NSFNET in the 1980s and 1990s.41
Despite this substantial public investment, and contrary to the claims of Kevin Kelly and Peter Thiel, the technology industry remains a small fraction of the US economy, accounting for merely 5% of GDP. It also isn't the principal driver of economic expansion, delivering only 4% annual growth in 2014, about in line with GDP.42
It also is not evident that the internet is naturally a "force for peace," as Mark Zuckerberg has claimed.43 While the internet can be used to deliver essential public goods like education and healthcare, it can also enable terrorists to more easily recruit new devotees.44
Technology does appear capable of driving meaningful economic and social progress, but it all depends on how it is implemented. We cannot simply create new services and expect that they will naturally be used for the common good. To deliver on their benevolent promise, Silicon Valley innovators must collaborate with other businesses and government to incubate new technology and guide its use for the benefit of society.
Unfortunately, technology leaders tend to see cautious collaboration with community leaders as antithetical to the breakneck pace of disruptive innovation. After all, the visionaries in the Valley are naturally going to need to "break things."45 "Innovation is disruption; constant innovation is perpetual disruption," Kevin Kelly reminds us. "This seems to be the goal of a well-made network: to sustain a perpetual disequilibrium."46 To innovate, we must be prepared for disequilibrium, for volatility.
"Our information age will be marked by unintended consequences, so the sooner we recognize, embrace, and adapt to them, the better," asserts Jeff. "The wise course then is not to try to forestall change (to slow or stop it through regulation), but to accelerate it through openness and investment."47 "Now is the time for experiments, lots and lots of experiments," echoes Clay Shirky, because "nothing will work, but everything might."48
Behind this expectation of chaotic innovation is an underlying sense of inevitability. "We know, as The Luddites learned, that technological advancement stops for no one, that it will happen regardless of what we say or do," says long-time tech journalist Rob Miller. "Surely, it will sometimes get ahead of our ability to understand it and regulate it in an appropriate way."49
What this means in practice is that life becomes more volatile. Danny readily admits that, while "stability is critical for being able to settle down and raise a family... this new talent market is going to be much less secure and stable than the old one. Flexibility and control does [sic] come at a price."50
Suddenly this is beginning to sound much less benevolent. Though the technology industry may, in theory, disrupt the economy and the government for the better, it also seems poised to degrade social protections and stability in our everyday lives.
Instability is the loose thread at the narrative's conclusion that, once pulled, begins to unravel the entire rhetorical façade. "Disruption," as used by the tech elite, begins to take a more sinister turn, resembling not a hyperbolic version of an empirical business trend but rather a way of masking the absence of that very trend. All of this grandiose, benevolent ballyhoo is not just inaccurate - it's purposefully misleading.
At the end of the day, tech companies are commercial enterprises that are obligated to create value for their shareholders. While Bill Maris may claim "we're not 'in it for the money', we're in it to do something really important," he'll clarify, in the same breath, that "companies that are financially successful tend to be those that make the biggest impact."51
However earnestly executives and pundits want to believe the opposite, the interests of business owners and the general public are often fundamentally at odds. And the business owners are winning.
DISRUPTING RHETORIC
Since the first tech climax in 2000, average household income in the United States has declined 4%. Even since the economic recovery in 2010, when the social, mobile, and cloud revolution was just gaining steam, income has grown just 3%. Despite what pundits might claim, technology does not appear to be rapidly enriching our society.52
While incomes in aggregate have stagnated, inequality has reached historic highs. Households in the top 1% earn, on average, more than one million dollars per year after taxes. That's fifteen times more than an average household in the bottom 99%.53
Meanwhile, democracy appears to be languishing. Voter participation is atrocious, with turnout falling to its lowest point in 72 years in the most recent US midterm election.54 And despite the power of the internet, censorship and government secrecy actually appear to be worsening, with the United States plummeting 29 spots on an international ranking of press freedom since 2009.55 At the same time, satisfaction with the US government has dropped to record lows; only about one in four people report being satisfied with the way the nation is governed today.56
And, of course, volatility is increasing. Over six million part-time workers are unable to attain full-time positions today, up 100% from 2000.57 Nearly one-third of US households now experience income volatility month-to-month, with irregular working hours cited as the most common reason for unpredictable earnings.58
Despite over two decades of innovation on the internet and over half a century of innovation in computing, the idyllic, disruptive future the tech elite prophesied has not come to fruition. And while technology companies are certainly not entirely at fault for these disheartening trends, they do appear to be benefitting while the rest of us are struggling.
Google was founded in 1998, and is now worth nearly $500 billion. Facebook was founded in 2004, and is now worth more than $300 billion. Amazon was founded in 1994, and is now worth more than $300 billion as well.59 Over the past couple of decades, while average Americans have been languishing, these fixtures of the new economy have grown from nothing to achieve unfathomable valuations.
Most of the value generated by these tech titans has been passed directly on to their leaders and early investors. The founders of all three can be found among the fifteen wealthiest people in the world, each with a net worth greater than thirty billion dollars.60 Additionally, each has helped to mint several other billionaires, including four from Google and seven from Facebook.61 Technology, it seems, is best equipped for funneling large sums of money into the pockets of a select few.
Technology has also been incredibly successful in consolidating the distribution of information and communication. Half of Americans rely on the internet as a primary news source.62 Nearly two-thirds of internet searches in the US use Google.63 Nearly three-quarters of Americans are on Facebook.64 With this level of influence and limited regulatory oversight, internet companies can have a subtle, yet dramatic, influence on public perceptions and prejudices. They can reinforce gender inequality by advertising higher-paying jobs to men, or fortify socioeconomic divisions by targeting lower-income neighborhoods with higher-interest loans - both of which can be observed on Google today.65 They can also act anti-competitively by prioritizing their own services at the expense of customer experience, of which Google has also been accused.66
At the same time, technology companies have directly contributed to income volatility and general instability, as readily admitted by vocal pundits like Danny and Jeff. The new "sharing economy" or "gig economy" is fueled by new tech business models that connect consumers with massive networks of independent contractors to do their chores, deliver their food, and drive them from place to place. Though many of these supposed contractors rely exclusively on tech companies like TaskRabbit, Postmates, and Uber for their income, the companies often refuse to offer them basic employment rights like overtime pay, unemployment insurance, and workers' compensation.67
Clearly, tech companies are not delivering on the utopian notion of "disruption" put forward by industry leaders. They're not even disruptive in Christensen's sense of the word - at least, not anymore. Amazon legitimately disrupted the retail industry for many years, playing an instrumental role in the bankruptcy of established brick-and-mortar bookseller Borders in 2011. Google and Facebook, too, played disruptive roles in the advertising industry, leading the way in diverting ad spend from traditional media outlets to the internet. But we're talking about companies valued in the hundreds of billions now. It's hard to say that they're still creating new markets while cannibalizing established competitors - they are the established competitors.
New entrants would also be hard pressed to qualify for Christensen's definition of disruption, as they often do little more than undercut competitors by ignoring laws and regulations. What makes Uber "new" is not that it makes car services more accessible to a broader base of consumers, as most Uber users already have access to taxis and public transportation. Instead, Uber is "new" in that it flouts the transportation and employment laws that burden its competitors. Whether or not such regulations are outdated or inappropriate is beside the point; Uber derives a distinct cost advantage by circumventing existing rules. Lower operating costs, in combination with more than ten billion dollars in funding,68 enable the company to engage in predatory pricing relative to its traditional competitors.
This isn't technological disruption - it's regulatory arbitrage. It's no surprise that Uber has 250 lobbyists around the country and has spent hundreds of thousands of dollars lobbying state and local governments in the past few years. The company even hired Barack Obama's presidential campaign manager to help execute on their political agenda.69 Airbnb, a company that facilitates vacation rentals, has taken a similar approach: after years of skirting short-term rental laws and hotel taxes, the company hired a former aide to Bill Clinton to head up global policy and public affairs and began actively campaigning for favorable regulations.70 In one particularly egregious example, the company funneled over eight million dollars into a campaign to defeat a San Francisco ballot measure averse to its business model.71 Merely 200,000 residents voted on the proposition, with 56% siding with Airbnb - meaning the company spent more than $75 for each favorable vote.72 This degree of corporate investment in policymaking poses a very real threat to the democratic process.
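For what it's worth, the per-vote figure follows from simple division. Assuming a spend of roughly $8.4 million (the text says only "over eight million dollars") and a 56% share of 200,000 votes:

$$\frac{\$8{,}400{,}000}{200{,}000 \times 0.56} = \frac{\$8{,}400{,}000}{112{,}000 \text{ favorable votes}} = \$75 \text{ per favorable vote}$$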
Conversely, startups that are unsuccessful in evading legal requirements inevitably fail. Aereo aimed to "disrupt" the television industry by offering free streaming over the internet. However, when faced with the legal obligation to pay retransmission fees, it collapsed.73 Udacity aimed to "disrupt" the education industry by offering courses with marquee professors to massive audiences over the internet. However, when faced with the requirement to actually improve comprehension and learning at basic levels, the service proved a failure.74
This demand for favorable regulation is not isolated to a few recent startups. Google alone spent over sixteen million dollars lobbying the federal government in 2015, ranking it among the top 15 clients of lobbyists in the United States. Annual tech industry expenditures on federal lobbying have grown over 600% since 2000, to over fifty million dollars in 2015.75 And while they may no longer be truly disruptive, these companies continue to invoke the benevolent capacity of "disruption" as a justification for preferential regulatory treatment.
Take, for example, Google Chairman Eric Schmidt's personal lobbying efforts in Europe. Google has increasingly come under regulatory scrutiny on the continent; the search engine giant must comply with "right to be forgotten" laws,76 is currently subject to a long-running antitrust investigation into its shopping services and was recently hit with a second complaint regarding its Android licensing practices,77 and was even the target of a non-binding resolution by the EU Parliament calling for its breakup.78 While these regulatory activities are being enacted by democratically elected officials, presumably in the interest of European citizens, Schmidt believes he knows better than they do what's best for Europe.
In a 2014 article on the European Commission's website, Schmidt lays out an argument for Europe to "embrace disruption." He starts by blandly patting the entire continent on the back, suggesting Europe's "citizens have the education, skills, and ambition needed to create great technology companies that will drive economic growth and employment." One day, Europe, you too could create the next Google! Already, "Silicon Valley-style high-density hubs of talented thinkers are emerging." The more you can act like us, Europe, the better. But getting there "requires strong leadership," commands Schmidt, as well as pro-tech regulations. "Europe needs tax incentives and other proactive measures that make it easier for startups to get funding" - that is, give us tax breaks. "Labor markets are another key area for reform" - that is, make it easier for us to lay people off. "New businesses promoting new ideas should not be held back by bureaucratic or regulatory hurdles" - that is, don't regulate us.79
All of this is thinly veiled propaganda for Google's business model in Europe. And it climaxes with Schmidt's appeal to disruption:
"Most of all, Europe needs to accept and embrace disruption. The old ways of doing things need to face competition that forces them to innovate. Uber, for example, is shaking up the taxi market -- for the good. It offers riders convenience and cheaper fares. Understandably, the incumbent taxi industry is unhappy."80
Not only does this gloss over the economic, regulatory, and cultural context of taxi services - factors that notably vary across members of the European Union - but it also unabashedly promotes a company in which Google has a vested interest - its venture arm has invested over two hundred million dollars in Uber.81
Rhetoric like this is not exclusive to Google. Netflix Chief Product Officer Neil Hunt, for example, takes the entire country of France to task for regulating the amount of foreign content media companies broadcast. "It's a bad idea for culture in general, and for France in particular," according to Hunt. "If we do the right job with our recommender systems, we can truly enable a global culture."82 The idea that Netflix knows what is best for French culture, or that a US video streaming company could somehow facilitate a global culture, is simply absurd. These comments are little more than marketing intended to break down regulatory barriers for the benefit of American technology entrepreneurs.
But technology executives don't typically appear disingenuous in their appeals to "disruption." They're not lying; many seem to think "disruption" really can deliver positive global change while also benefitting their bottom lines. Underlying their rhetorical appeals is a fundamental belief that the world can be modeled after the technology industry, that value can be understood through singular criteria, that moral and humanistic concerns can be simplified into straightforward I/O functions. That there are no shades of grey. "In technical matters, you have to get the right answers," Paul Graham reminds us. "If your software miscalculates the path of a space probe, you can't finesse your way out of trouble by saying that your code is patriotic, or avant-garde, or any of the other dodges people use in nontechnical fields."83 National pride and artistic experimentation - along with a wealth of other intangible values - are deprecated against inviolable scientific Truth. As an early Google employee put it, "for many engineers, and particularly for Larry and Sergey, truth was often self-evident and unassailable. The inability of others to recognize truth did not make it any less absolute... Truth is, after all, a binary function."84 It has a certain intuitive, utilitarian elegance to it, which makes it all the more persuasive. Technologists are disrupting dogmatism and mysticism to deliver on the promise of a truly rational society, ensuring that wealth is allocated to those that deserve it most: themselves.
"Wealth is what people want," reminds Paul Graham. And people want software. So "a programmer can sit down in front of a computer and create wealth," Graham tells us, as "they literally think the product, one line at a time." So it makes sense, to a technologist, that people should be compensated relative to the raw output of their work. For entrepreneurs, that means taxes and regulations should not get in the way of the fortunes they're entitled to when their startups go public. "This is why so many of the best programmers are libertarians," says Graham, as he lays out the economic proposition for becoming fabulously wealthy by writing code:85
"If you're a good hacker in your mid twenties, you can get a job paying about $80,000 per year. So on average such a hacker must be able to do at least $80,000 worth of work per year for the company just to break even. You could probably work twice as many hours as a corporate employee, and if you focus you can probably get three times as much done in an hour. You should get another multiple of two, at least, by eliminating the drag of the pointy-haired middle manager who would be your boss in a big company. Then there is one more multiple: how much smarter are you than your job description expects you to be? Suppose another multiple of three. Combine all these multipliers, and I'm claiming you could be 36 times more productive than you're expected to be in a random corporate job. If a fairly good hacker is worth $80,000 a year at a big company, then a smart hacker working very hard without any corporate bullshit to slow him down should be able to do work worth about $3 million a year...
Remember what a startup is, economically: a way of saying, I want to work faster. Instead of accumulating money slowly by being paid a regular wage for fifty years, I want to get it over with as soon as possible. So governments that forbid you to accumulate wealth are in effect decreeing that you work slowly."86
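Graham's compounding multipliers, made explicit: twice the hours, three times the output per hour, twice again for shedding the middle manager, and three times for being smarter than the job description expects.

$$2 \times 3 \times 2 \times 3 = 36, \qquad 36 \times \$80{,}000 = \$2{,}880{,}000 \approx \$3 \text{ million per year}$$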
The problem is that no other industry, occupation, or human endeavor works this way. Software development is the outlier, a digital abstraction of labor unencumbered by the constraints of physical resources and the temporal cost of production. As an intangible, infinitely replicable product, software resembles an "ideal" good found nowhere else in the real world. This theoretical ideal simply cannot be taken as a model for how everything else should work.
Only about one and a half million Americans are employed as "Software Developers" today - Graham's "hackers," building (digital) products one line of code at a time. That's relative to the nine million "Production Occupations" in the US - the folks responsible for building physical products.87 And unlike software, hard goods are limited by natural resources and the productive capacity of manufacturing equipment. Machinists, assemblers, fabricators, and other production workers cannot simply decide to work longer, smarter, or more efficiently on their own. They are constrained by high input costs - operating costs for materials and capital expenses for property, plant, and equipment - that make it impossible for them to scale by themselves.
But even "production" workers represent a small fraction of the US workforce. Nearly 22 million people are employed in "Office and Administrative Support" occupations - traditional office jobs, including customer service representatives, stockroom managers, bookkeepers, administrative assistants, and other office clerks. Add to these other traditionally "office" occupations like business operations and management, and you cover over one-quarter of the US workforce. Another 20% is accounted for by sales and service occupations: retail salespeople, cashiers, food service workers, and other members of the service economy. Add the 16 million health and personal care professionals, and the 8.5 million education workers, and you capture a substantial majority of the jobs in America - none of which are principally focused on production.88
These are time-bound jobs: jobs where someone must be somewhere for some amount of time. Some have purely a clock-in, clock-out nature; you need a cashier to process transactions when they happen, a call center representative to take calls when they come in, and an administrative assistant to respond to the day-to-day needs of an office. They are jobs defined by the time required, and working harder, faster, or more efficiently would do little to change their temporal nature.
For others, the time is inexorably linked to the quality of the output. It takes time for a doctor to treat a patient, just as it takes time for a teacher to educate a student. Certainly, treatment and education can be provided more quickly and efficiently, but only to a point; patients need time to recover, and students need time to develop understanding.
In this way, many jobs fall victim to "cost disease" - something even some tech leaders recognize. Clay Shirky, for example, examines exactly this issue in education:
"Higher education has a bad case of cost disease (sometimes called Baumol's cost disease, after one of its theorizers.) The classic example is the string quartet; performing a 15-minute quartet took a cumulative hour of musician time in 1850, and takes that same hour today... Unfortunately, the obvious ways to make production more efficient - fewer musicians playing faster - wouldn't work as well for the production of music as for the production of cars.
An organization with cost disease can use lower paid workers, increase the number of consumers per worker, subsidize production, or increase price. For live music, this means hiring less-talented musicians, selling more tickets per performance, writing grant applications, or, of course, raising ticket prices. For colleges, this means more graduate and adjunct instructors, increased enrollments and class size, fundraising, or, of course, raising tuition."89
Ironically, Shirky is identifying the exact levers the tech industry uses to "disrupt" other industries today. In time-bound fields, technology is not radically increasing productivity or quality; instead, technology is either reducing the cost of labor (in the case of Uber) or increasing the number of consumers per worker (in the case of Udacity).
More fundamentally, though, all jobs are bound by demand. There is only so much a worker can do to accelerate productivity if there are no customers for the fruits of their labor. Cashiers can't process more transactions by working longer or smarter - they are constrained by the demand for the products being sold. There is only so much a customer service representative can do if no one is calling in for support. The same is true for doctors, who are bound by the number of patients, and teachers, who are bound by the number of students. And the same is true for tech workers, too.
"You have to know what people want," Graham cautions. He sneaks this caveat into the tail end of his techno-libertarian manifesto: "Wealth is what people want, and if people aren't using your software... maybe it's because you haven't made what they want." Tech workers, too, are bound by the demand for their products. So remember that massive income multiplier you can expect from working at a startup? In practice, "it is that you're 30 times as productive, and get paid between zero and a thousand times as much. If the mean is 30x, the median is probably zero." After all, "most startups tank." Period.90
Not only are non-tech workers not privy to the benefits of an ideal, virtual economy - most tech workers aren't either. Even those that work harder, faster, longer, and smarter may not reap the benefits of their labor. In fact, most won't.
So what's really going on here? "Increasingly startups are evolving into a vehicle for developing technology on spec," Paul Graham confesses. For large, established companies, it is often more efficient to buy a startup and repurpose their technology than it is to build something new in-house. "It makes the guys developing the technology more accountable," Graham suggests, "because they only get paid if they build the winner."91 Or, more callously, startups have become a way of "buying a cheap call option on a guy who doesn't know that's what you're doing," according to another startup investor - that is, paying an entrepreneur a relatively small amount now in order to cash in on their hard work later on.92
Industry leaders roundly glorify startup workers as the lifeblood of the new economy. They declaim the virtue of the entrepreneur, who, through his own hard work and gumption, can lead the rest of us to a better world. But those leaders also stand to profit from the work of founders and early employees. The Paul Grahams and Peter Thiels of the world, they're making high-risk bets across a broad portfolio. The chances are low: three out of every four startups do not generate a return on investors' capital.93 But the upside can be huge: every so often, you'll catch a multi-billion dollar payout like Facebook or Google. These massive winners more than offset the loss venture capitalists take on the majority of startups, and, with enough investments, good venture capitalists can virtually ensure they'll score a few winners.
So they pat these young founders on the back and tell them they're doing God's work, all the while taking advantage of the substantial risk the founders are taking. "In the first dotcom boom, the risk was largely carried by the investors," Wired magazine contributor Gideon Lewis-Kraus tells us, "but now the risk has been returned to the youth."94 Venture capitalists have essentially found a way to cost-effectively fund the production of software. The lure of an equity jackpot, in addition to the techno-libertarian rhetoric, can motivate early employees to work longer and harder without commensurate compensation.
Programming for a software startup has become little more than glorified manual production. After all, "coding is the blue-collar job of the 21st century," as early Facebook employee Chamath Palihapitiya told us earlier.95 "Web development is more like plumbing than any of us, perched in front of two slick monitors, would care to admit," says writer and programmer James Somers. The infrastructure the industry has built to simplify software development has made programming ever more accessible to the average person. "I do most of [my] work with a tool called Ruby on Rails," Somers offers as an example. "Ruby on Rails does for web developers what a toilet-installing robot would do for plumbers." So what does it take to be successful in such a role? "I just read the instruction manual," says Somers. "In fact, I'm especially coveted in the job market because I read the instruction manual particularly carefully. Because I'm assiduous and patient with instruction manuals in general. But that's all there is to it."96
Industry pundits are quick to decry the paucity of STEM education - Science, Technology, Engineering, and Mathematics - in America today. But most tech companies aren't looking for more PhDs in mathematics or computer science. They're looking for coders - legions of coders - to physically type out the millions of lines necessary to deliver the latest and greatest in personalized, mobile, and social applications. They're looking for people to do little more than translate instructions into code. By way of comparison, while you only need a handful of brilliant engineers to design an automobile, you need scores of assembly line workers and even more local mechanics to build and maintain it at scale. The more kids we can teach basic front-end programming or simple database operations, the lower the operating costs will be for tech companies down the line.
Tech workers are portrayed as "entrepreneurs," not employees. They are "problem solvers," not plumbers. That which has been "disrupted" is contrasted with that which has yet to be "disrupted," when in fact the two are one and the same. It is a fake dichotomy, an artificial dialectic, used to cultivate favorable regulatory treatment and extract ever more value out of workers for the benefit of industry leaders and investors. The once empirical notion of disruption has been co-opted and perverted, then obscured, and finally wholly abandoned, replaced with an entirely fictional concept devoid of any true meaning at all.
DISRUPTING NOTHING
Were you to visit the Facebook headquarters in Menlo Park, California, you'd be forgiven for thinking you had set foot in Disneyland. The entire campus is modeled around an idyllic pedestrian mall, a veritable Main Street, U.S.A. complete with all of the trappings of childhood fantasy mixed with a dash of Palo Alto flair. As you walk down the tree-lined boulevard, you'll find nine distinct restaurants, serving everything from sushi to local coffee to hand-made ice cream and pastries - and nearly everything is free. Need a haircut? The local barber will take care of you. You can also get your dry cleaning done, and there's a bike shop right down the road in case you're looking for a tune-up. You might even catch yourself skipping across a miniature yellow brick road next to a cute little wooden cottage, complete with the legs of a Wicked Witch (adorned with the requisite glittering red slippers) popping out from underneath.97
"The 'Main Street, U.S.A.' feel is no accident," reports The New York Times. "Sheryl Sandberg, the chief operating officer of Facebook, also serves on the board of Disney, and she brought in consultants from Anaheim and Orlando to perfect Facebook's look." 98
Google's corporate architects, too, have a Disneyesque aim: "to create the happiest, most productive workplace in the world," according to a Google spokesperson.99 The company's New York office features "a labyrinth of play areas; cafes, coffee bars and open kitchens; sunny outdoor terraces with chaises; gourmet cafeterias that serve free breakfast, lunch and dinner; Broadway-theme conference rooms with velvet drapes; and conversation areas designed to look like vintage subway cars." "Next to the recently expanded Lego play station," a Times reporter noted, "employees can scurry up a ladder that connects the fourth and fifth floors, where a fiendishly challenging scavenger hunt was in progress."100
Amid this playful banter, there is "a sense that nothing is permanent" on tech industry campuses, "that any product can be dislodged from greatness by something newer. It's the aesthetic of disruption: We must all change, all the time." At the same time, though, "when companies feel that they are changing the world as much as these tech enterprises do, they don't need just offices. They need monuments."101
These monuments of disruption are real-world Neverlands, playgrounds for the eternally Lost Boy tech worker. "This is not my dad's company," exclaims the hoodie-and-sandals clad 20-something, perched upon his bean bag throne in his primary-colored kingdom as he furiously hammers out lines of code in service of a better world. But that's just the industry pandering to the whims of a new generation; while his father may have sought a sedan and a suburban home for his nuclear family, our tech worker seems content to play out his individualistic fantasy of perpetual adolescence. "Tech Work" versus "Real Work" is an artificial dialectic: a children's game of us-versus-them that obscures the fact that, ball pits and free snacks aside, a job is still a job. "Tech Workers" are little more than white-collar office workers playing dress up in service of large corporations captained by billionaires.
Silicon Valley has a habit of adopting cultural artifacts as a way of tangibly differentiating "Tech Work" from "Real Work," often without regard for social and historical context. Certainly the irony of building a workplace modeled on Disneyland, itself modeled on an idealization of American values, is lost on Silicon Valley. The reference is circular; the real world was sanitized and simplified to create the simulated world of Disneyland, which is now being simulated by tech companies in real world offices. Has Silicon Valley become so far removed from American values that the best it can do to approximate a true community is co-opt the saccharine façade of Main Street, U.S.A.?
See also: Mark Zuckerberg's "Little Red Book," a pamphlet issued to new Facebook employees to articulate the company's corporate vision, sporting a bright red cover and booster verbiage like "[Facebook] was built to accomplish a social mission" and "changing how people communicate will always change the world" in an apparent homage to Chinese revolutionary propaganda.102 Certainly the intent is to elevate Facebook employees above the ranks of your typical cubicle dwellers, which makes the reference to a renowned communist text all the more confusing.
There's also the Crunchies, an annual awards ceremony hosted by TechCrunch bloggers to commemorate the best and brightest in the startup world. With all of the pomp and circumstance of a red carpet event, Silicon Valley aims to appropriate Hollywood flair to distinguish itself from the typically drab ethos of capitalistic enterprise. But it misses the mark; though the quintessential air of self-importance is on display, the event fails at the one redeeming quality of most entertainment industry awards: the commemoration of good art. Instead, the Crunchies celebrate buzzworthy business, conferring upon many an unproven startup a statuette of a gorilla brandishing a bone atop a pile of old computers.103 Again, confusing allusions abound: do the editors of TechCrunch actually mean to suggest that 2015 winner Uber is foreshadowing the dawn of malevolent artificial intelligence, à la Stanley Kubrick's 2001: A Space Odyssey?104
These are inarticulate attempts at meaning, the result of millennial technologists pawing at historical artifacts of which they have only fleeting memories. They are nominal disruptions of a cultural landscape not well understood, driven by the myopic thinking of an industry that gives no credence to historical, cultural, and sociopolitical factors that cannot be reduced to singular, technocratic measures of value. Perhaps it is no surprise, then, that so many of the most "valuable" technology startups are so inherently irrelevant.
"We wanted flying cars, instead we got 140 characters."105 Even Peter Thiel's Founders Fund, a pillar of the new economy, recognizes this fundamental disconnect. The "140 characters" is a reference to Twitter, an online service that enables anyone to share their every passing thought with the rest of the world. The company has never turned a profit, and yet is valued at $11 billion. Yelp, a website that enables anyone to share their unsolicited opinions about local businesses, offers a similar story. After securing a single income-positive year in 2014, the company has returned to losing money, and yet is still worth more than $2 billion.106
Certainly, these companies offer services that consumers value - hundreds of millions of people use each of them every month. But the services themselves seem little more than digital augmentations of existing services, be they for delivering news, personal communications, or business listings. It is hard to understand why these companies warrant such high valuations when the conveniences they offer are seemingly so banal.
Outlandish valuations seem to be yet another spurious signifier of difference between the tech world and the real world, a way of showing that tech companies are not beholden to the same rules as other businesses. It's almost as if the more trivial a startup is, the higher its valuation must be. Snapchat, a mobile application that enables teenagers to send photos to one another, is valued at $16 billion on the private market. Pinterest, a website that enables people to save and share images they've found from around the web, is valued at $11 billion.107 Tumblr, a website that does basically the same thing, was purchased by Yahoo for more than $1 billion.108 Digital storage provider Dropbox is worth $10 billion. Corporate chat client Slack is worth nearly $4 billion. Music identifier app Shazam is worth $1 billion.109 The list goes on.
Take a moment to think about that. These companies that do little more than organize and display media on the web are, in theory, worth just as much as established businesses with substantial physical assets and cash flow. The retailer Best Buy, for example, operates 1,631 stores across the US, Canada, and Mexico, employs 125,000 people, and generates nearly $40 billion in revenue and just under $1 billion in net income - and is valued at less than $10 billion on the public market.110 Yes, Snapchat is worth more than Best Buy. The airline JetBlue operates over 200 aircraft, averages 900 flights a day, carries 35 million passengers a year, and generates more than $6 billion in revenue and over half a billion in net income - and is valued at about $5 billion.111 Yes, Pinterest is worth more than JetBlue.
Much of the overvaluation in the tech world is the result of an irrationally exuberant investment community fearful of missing out. Eager to get in on the next Google or Facebook, investors are emptying their pockets to participate in ever frothier fundraising rounds for so-called "unicorns" - private companies valued at more than a billion dollars. They justify their bets by comparing them against other investments in the tech industry, a self-perpetuating rationale without any basis in business fundamentals or fact. These valuations are meaningless, nothing more than circular references within the insular tech economy that seem to find their justification in the notion of "disruption": the more disruptive a company, the more valuable it must be.
We've seen this before. It wasn't so long ago that technology companies achieved impossibly high valuations with little or no basis. And that's the incredible thing about "disruption." It is not just empty - it's entirely ahistorical. That is the true sorcery of "disruption," or really "post-disruption": it is its own pure simulacrum, a self-perpetuating myth destined to repeat itself, over and over, ad infinitum.
The parallels between the 90s tech boom and today are striking. First, you have the insane valuations for companies doing essentially the same things. Remember Webvan? The startup promised to disrupt traditional grocers by offering online ordering and home delivery. It peaked at a valuation of $1.2 billion in 1999, only to fail two years later.112 Today, Instacart offers an identical service and is valued at $2 billion.113 Kozmo.com also offered home delivery, but focused on small consumer goods and guaranteed delivery in under an hour via courier. The company planned an initial public offering in 2000, but was forced to withdraw as the market began to slide, and was out of business the next year.114 Today, Postmates offers a similar courier service and is valued at close to half a billion dollars.115 Flooz.com, a failed virtual currency, bears a striking resemblance to Bitcoin. GeoCities has quite a bit in common with personal website service providers Weebly and Squarespace. Even the services offered by AOL at the turn of the century parallel those offered by Google today, including internet service, a portal to the web, and messaging.
The books are the same too. Books about how technology and entrepreneurship have fundamentally disrupted the economy to ensure unprecedented growth in the future: The Long Boom in 1999, and The Coming Prosperity in 2012. Books about titans of industry who led their companies through uncharted territory to overwhelming success: Gates: How Microsoft's Mogul Reinvented an Industry in 1994 and The Everything Store: Jeff Bezos and the Age of Amazon in 2013. Even books by titans of industry, offering advice to other business leaders on how to run their companies: High Output Management by Intel's Andrew Grove in 1995 and How Google Works by Eric Schmidt in 2014.
And, of course, the titans of industry themselves seem remarkably familiar. Bill Gates dropped out of Harvard in 1975 to build Microsoft, and became the youngest self-made billionaire yet at the age of 32. Mark Zuckerberg dropped out of Harvard in 2004 to build Facebook, and became the youngest self-made billionaire yet at the age of 23. Steve Jobs co-founded Apple in 1976, was ousted from the company, took control of a second company, Pixar, only to return to Apple as CEO and run both companies in parallel. Jack Dorsey co-founded Twitter in 2006, was ousted from his role as CEO, founded a second company, Square, only to return as Twitter's CEO and run both companies in parallel.
In this circular, ever repeating world of post-disruption, even this essay is redundant. Here's essayist Carmen Hermosillo, writing under the alias humdog:
"i suspect that cyberspace exists because it is the purest manifestation of the mass (masse) as Jean Beaudrilliard [sic] described it. it is a black hole; it absorbs energy and personality and then re-presents it as spectacle...
it is fashionable to suggest that cyberspace is some kind of _island of the blessed_ where people are free to indulge and express their Individuality. some people write about cyberspace as though it were a 60's utopia. in reality, this is not true...
what i am getting at here is that electronic community is a commercial enterprise that dovetails nicely with the increasing trend towards dehumanization in our society: it wants to commodify human interaction, enjoy the spectacle regardless of the human cost. if and when the spectacle proves inconvenient [sic] or alarming, it engages in creative history, like any good banana republic."116
While humdog's words could easily be mistaken for a commentary on modern social media, they were actually written in 1994. You see, even this essay is redundant, trapped within post-disruption's self-referential narrative.
"Disruption" was first introduced as a way of describing an empirical business trend: occasions when a new product addressed a new market of consumers while also cannibalizing some business from an existing product. Silicon Valley took this meaning and perverted it to promote its own interests, masking the absence of true disruption and obscuring the growing wealth and influence of technology industry leaders. Eventually, "disruption" devolved into nothing more than a marketing term used to curry favor with regulators and employees.
But the narrative of disruption, or post-disruption, is incomplete without this final step: the discovery that this is nothing new. That the "disruption" of today is just the "internet" of another day. That public appeals to disruption are the same appeals the technology industry made in the late 90s, and will probably be made again in the not-too-distant future.
These calls for disruption do serve to solidify the wealth and status of industry leaders. But they also betray the absurdity of the whole cyclical process - the myopic cultural appropriations, the trivial products and services, and the fact that investors and industry leaders, too, are drawn into irrational investments in disruption.
We find ourselves stuck in an ahistorical loop, a circular reference that perpetuates the myth of disruption in service of a select few who stand to profit from our own commodification. But even their profit rings hollow, reinvested in absurd advancements as the charade of cyclical progress continues.
"Or even more, that of an 'interactive' couple who continuously project the entirety of their relationship onto the Internet in real-time. Who watches them? They watch themselves, but who else does, since everyone can get off, virtually speaking, from the same domestically integrated circuit? There will soon be nothing more than self-communicating zombies, whose lone umbilical relay will be their own feedback image - electronic avatars of dead shadows who, beyond death and the river Styx, will wander, perpetually passing their time retelling their own story."
- Jean Baudrillard117
Disclosure: I am currently an employee of Apple Inc. The views articulated in this essay are my own, in no way reflect the views of my employer, and are, to the best of my ability, not informed by any knowledge or experience I have gained while working at Apple. As an employee, I do not feel comfortable making public statements about Apple, and I have, wherever possible, avoided discussion of my employer in this essay.
Special thanks to Corey Goerdt, Jeff Lian, Leo Moauro, and David Paesani for providing feedback and guidance on an early draft of this essay.