A brief history of tech skepticism
Writing off emerging technologies too soon is a centuries-old practice. Are we right to be so dismissive?
New technologies tend to inspire equal, and often counteracting, surges of enthusiasm and skepticism. The metaverse, which has both champions and naysayers out in force, is just the latest example. Of course, none of this is new: neither the skepticism that greets emerging technologies nor the concept of the metaverse itself.
The idea of virtual worlds dates back as far as the 1935 short story Pygmalion’s Spectacles, by Stanley Weinbaum. And metaverse-like real-world use cases have regularly been explored through fiction since at least the 1950s. If you define the metaverse as an immersive, highly interactive, always-on digital space, I first got excited about its potential in the early ’80s via a fuzzy bootleg VHS of Tron. Later, like a lot of other tech aficionados, I spent many a happy hour exploring the online game Second Life with my Snoopy-headed avatar in 2003–04.
To be sure, the killer app that will drive mass adoption of something like the metaverse has yet to be identified. Even the platform’s name has changed—“the metaverse” has also been called “the spatial web,” “cyber-reality,” “the virtual world,” “extended reality” (or “XR”), and more—and it will likely change again. But that’s no reason to write the concept off. It’s been around for decades because it’s an intriguing one, imbued with so much possibility.
There’s another reason to be somewhat sanguine about the future of technological innovations: their past.
History is positively littered with examples of world-changing innovations being dismissed by the sharpest, most credentialed observers as pointless, dangerous, funny, or all three, before those same innovations went on to prove their critics wrong. (George Gershwin wrote a great song about it.)
Rail travel (1825): “The gross exaggerations of the powers of the locomotive steam-engine…may delude for a time, but must end in the mortification of those concerned.”
—Quarterly Review
The telephone (1878): “The Americans have need of the telephone, but we do not. We have plenty of messenger boys.”
—William Henry Preece, Chief Engineer of the British Post Office
Light bulbs (1879): “Everyone acquainted with the subject will recognize [Thomas Edison’s experiments] as a conspicuous failure, trumpeted as a wonderful success.”
—Henry Morton, President of the Stevens Institute of Technology
AC electricity (1889): “Fooling around with alternating current is just a waste of time. Nobody will use it, ever.”
—Thomas Edison
The automobile (1899): “The ordinary horseless carriage is, at present, a luxury for the wealthy; and although its price will probably fall in the future, it will never, of course, come into as common use as the bicycle.”
—Literary Digest
Planes (1911): “Airplanes are interesting toys but of no military value.”
—Marshal Ferdinand Foch, Supreme Commander of the Allied Armies in World War I, 1918–20
Sound in films (1928): “I don’t think people will want talking pictures long…. Talking doesn’t belong in pictures.”
—Joseph M. Schenck, President of United Artists
Nuclear power (1932): “There is not the slightest indication that [nuclear energy] will ever be obtainable.”
—Albert Einstein
Television (1946): “Television won’t be able to hold on to any market it captures after the first six months. People will soon get tired of staring at a plywood box every night.”
—Darryl F. Zanuck, Head of 20th Century Fox
Satellite communications (1961): “There is practically no chance communications space satellites will be used to provide better telephone, telegraph, television, or radio service inside the United States.”
—T.A.M. Craven, Commissioner, US Federal Communications Commission
Home computers (1977): “There is no reason anyone would want a computer in their home.”
—Ken Olsen, Founder of Digital Equipment Corporation (DEC)
Laptop computers (1985): “For the most part, the portable computer is a dream machine for the few…the real future of the laptop computer will remain in the specialized niche markets.”
—New York Times
The internet (1998): “By 2005 or so, it will become clear that the internet’s impact on the economy has been no greater than the fax machine’s.”
—Paul Krugman, Winner of the 2008 Nobel Memorial Prize in Economic Sciences
The iPhone (2006): “Everyone’s always asking me when Apple will come out with a cell phone. My answer is, ‘Probably never.’”
—David Pogue, Technology Columnist for the New York Times
Twenty-twenty hindsight
Why have so many been so skeptical of developments whose success, in hindsight, seems obvious?
One reason is that some technologies take time to reach maturity and mass adoption—and rely on the development of infrastructure that doesn’t yet exist. The ancient Greeks invented the aeolipile steam engine some 1,700 years before Thomas Newcomen created one deemed useful for industrial work. It took another 65 years before James Watt’s adaptations ushered in the true age of steam, a further quarter-century before the first steam locomotives began to appear, and another 20-odd years before the first passenger services became available. On this timeline, the metaverse is in its infancy. Some shrewd observers—like author Matthew Ball, one of the world’s leading metaverse analysts—expect it will be years if not decades before the idea reaches its full potential.
First impressions count
As humans, we are afflicted with tendencies that can skew our ability to objectively assess the potential of unfamiliar things. Our cognitive biases condition us to be suspicious of that which is novel or different. Many in business will be familiar with “not invented here” syndrome and status quo bias, which favor the familiar and the stable. Which titan of industry, after all, wants to imagine their own demise?
Most dangerous for new technologies is the self-reinforcing trifecta of negativity bias (a focus on negative rather than positive information), congruence bias (an overreliance on an initial hypothesis), and confirmation bias (privileging information that justifies that hypothesis, leading to a spiral of self-satisfied assumptions that our opinions are facts). These biases are exacerbated when promised breakthroughs take much longer to arrive than we’d hoped.
Age can also have something to do with it. As the late, great Douglas Adams put it in his posthumous collection The Salmon of Doubt:
1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
3. Anything invented after you’re thirty-five is against the natural order of things.
The above assembly of prognosticating whiffs by often-celebrated minds should fill us with equal doses of schadenfreude and humility. The truth is, of course, no one can predict the future. All we can do when it comes to emerging tech is be inquisitive and actively seek out our blind spots; dig into the use cases, the business cases, and the constraints; and then come to our own conclusions about the likely utility of these unfamiliar things.
But, as we evaluate the prospects of new innovations, we should try to maintain some self-awareness. Confronted with the unfamiliar, people tend to fall back on heuristics or instinctive responses. Take Plato’s account, in the Phaedrus, of negative reactions to the invention of writing: “This invention will produce forgetfulness in the minds of those who learn to use it, because they will not practice their memory.” Nearly 2,400 years later, in 2008, the Atlantic unironically cast another new knowledge-sharing technology as similarly harmful, and for similar reasons. The headline: “Is Google making us stupid?”
It’s easy (and, let’s be honest, fun) to dismiss critics of new technologies as reactionary or unimaginative. But it’s also sometimes too easy to dismiss enthusiasts as gullible for buying into “fads” that, in the moment, seem like nonsense.
Author profile:
- James Clive-Matthews is a senior editor in global thought leadership at PwC.