Best Business Books 2021: The right time to yell “fire!”
A collaborative manifesto by academics urgently calling for change is the most provocative technology book of the year.
Your Computer Is On Fire
edited by Thomas S. Mullaney, Benjamin Peters, Mar Hicks, and Kavita Philip (MIT Press, 2021)
A simple ethos guides many executives today: build businesses that can harness the power of new technologies, scale up as quickly as possible, and show investors that these businesses can compete in an uncertain future. The result has been the rapid proliferation of digital (and digitally powered) companies that have become ever more relevant and essential to our lives. But these companies have also produced a wide range of unintended harmful consequences.
The impetus for business to address these consequences—including breaches of personal information, biases in algorithms, the propagation of misinformation, widening inequality, and damage to the environment—is growing by the day. Activist shareholders, among many other stakeholders, are advocating for “responsible technology” policies and for tighter linkages between tech ethics and executive compensation packages. Socially and environmentally conscious consumers are voting with their wallets, encouraging businesses to reappraise their products and purpose, including their role as employers of diverse, engaged workforces. The global pandemic has only added to the momentum for change.
But how can companies maximize the positive impacts of technology while minimizing all the bad ones? That’s the next great challenge facing corporate leaders, and our system as a whole. And that’s why Your Computer Is On Fire, which is critical reading for business leaders seeking to tackle this question head-on, is the best technology book of 2021.
In some ways, it’s an unusual choice. The book consists of 16 essays rather than a single narrative. And they are written by academics for an intended audience of STEM students, humanists, technologists, and social scientists. But executives who make their way through the 400-plus pages of Your Computer Is On Fire will finish it more than a little unsettled. The authors fearlessly dismantle the technology industry’s most sacred assumptions, forcing a rethinking of everything we’ve come to accept as true about our digital lives and the multibillion-dollar digital transformations going on inside our companies. Titles such as “Gender Is a Corporate Tool,” “A Network Is Not a Network,” and “Coding Is Not Empowerment” pull no punches.
In the first and most provocative essay of the collection, “The Cloud Is a Factory,” Indiana University associate professor Nathan Ensmenger challenges readers to think differently about one of the most transformative technologies for business in a generation: the cloud.
What exactly is the cloud? The quick answer is that it's a set of computing and software services—anything from email to inventory-tracking software—that users access via the internet rather than through desktops or internal servers. Cloud computing platforms have proven to be a powerful means of testing new approaches and experimenting with new technologies, including advanced analytics and 3D printing.
But in much simpler terms, the cloud is a bunch of computers located in a data center somewhere else—computers that need physical materials such as metal and plastic, as well as electricity, water, and people. Kind of like…an industrial factory. As Ensmenger observes, a typical data center draws between 350 and 500 megawatts of power and requires roughly 400,000 gallons of fresh water daily for cooling.
Yet because the term cloud has been used as a metaphorical device, and because the cloud tends to be thought of as a benign, virtual technological solution, the computer industry has managed to circumvent the long history of regulating physical infrastructure. In the past, when a traditional factory polluted water supplies or maimed workers, public policy responded, even if belatedly. But the cloud remains largely unregulated, with its negative factory-like effects underreported. “Let us bring back to earth this deliberately ambiguous and ethereal metaphor by grounding it in a larger history of technology, labor, and the built environment—before it is too late,” Ensmenger implores.
In another essay, “Your Robot Isn’t Neutral,” Safiya Umoja Noble, associate professor at the University of California, Los Angeles, calls for a deeper, commonsense understanding of the processes involved in the formation of data, which she posits is essentially a social construct. Just as race and gender are social constructs—things we decide on rather than things that are immutable or naturally existing—so, too, is the data that has come to dominate our lives. The problem is that the making of that data is treated as if it were disconnected from the historical social practices that shape it. When data is developed from a set of discriminatory social processes, such as the compilation of statistics on policing a city, it’s often impossible to recognize that the data also reflects practices such as over-policing and disproportionate arrest rates in African-American, Latinx, and low-income neighborhoods, Noble argues. “The concepts of the purity and neutrality of data are so deeply embedded in the training and discourses about what data is that there is great difficulty moving away from the reductionist argument that ‘math can’t discriminate because it’s math,’” she writes.
The essays also tackle gender inequality, another persistent problem in the high-tech world. In “Sexism Is a Feature, Not a Bug,” Mar Hicks, associate professor at the Illinois Institute of Technology, tells the story of sexist hiring and firing practices in Britain’s computing sector, and of how electronic computing technology became an “abstraction of political power in machine form.” “These failures are not simply accidents,” Hicks writes; “they are features of how the systems were designed to work and, without significant outside intervention, how they will continue to function.”
Though each writer examines a different issue through a distinct lens, the collection of essays in Your Computer Is On Fire achieves narrative cohesion. History—particularly the history of computers and industrial society—serves as a purposeful and cunning organizational device, because the tech industry isn’t wired to look backward, only forward. The industry built itself on the notion of constant reinvention, and the writers know that mining history for lessons is not in its DNA. But it should be. As the writers of Your Computer Is On Fire make clear, there’s a lot at stake when we ignore history and fail to think more humanistically about computing.
Without doubt, companies need to address the harms created by technology to keep the damages from outpacing the gains. The book does not offer any concrete recommendations for crafting such responsible technology policies, nor does it outline broad policy changes. But Your Computer Is On Fire succeeds by forcing us to adjust how we think and talk about the essential issues at the center of business and society. And that, the authors note, is a great starting point for change.
As University of Tulsa professor Benjamin Peters, author of “A Network Is Not a Network,” explains: “Tech will deliver on neither its promises nor its curses, and tech observers should avoid both utopian dreamers and dystopian catastrophists. The world truly is on fire, but that is no reason it will either be cleansed or ravaged in the precise day and hour that self-proclaimed prophets of profit and doom predict. The flow of history will continue to surprise.”
Honorable mentions:
Futureproof: 9 Rules for Humans in the Age of Automation
by Kevin Roose (Random House, 2021)
Artificial intelligence and advanced robotics are making it possible for machines to take on tasks that once required a person. By some accounts, almost half of all jobs in the US economy could be made obsolete. But what if our future reality is more nuanced than that? What if automation displaces millions from their jobs and at the same time improves healthcare diagnostics and slows climate change? And how do we thrive in this kind of hybrid environment? These are the questions at the heart of Futureproof, a compelling book by New York Times columnist Kevin Roose. With honesty and humor, Roose attempts to correct some faults in how we think about AI and suggests ways we can make the most of our advantages. Whether he’s advocating making “consequentialist thinking” part of a standard STEM curriculum or encouraging “digital discernment,” Roose makes an important contribution to the scholarship surrounding our AI future in this immensely readable and actionable book.
A World without Email: Reimagining Work in an Age of Communication Overload
by Cal Newport (Portfolio/Penguin, 2021)
Did you get my email? Email—and the ever-increasing volume of it—has become the bane of the 21st-century worker’s existence. But Cal Newport, associate professor of computer science at Georgetown University, believes we can live without it. In A World without Email, Newport tackles how workplaces create the “hyperactive hive mind”—always and rapidly communicating, responding, and sharing information—and the problems that result. This style of work, Newport argues, forces people to constantly check their inboxes or messaging platforms, which erodes their ability to concentrate, causes mental fatigue, and contributes to dissatisfaction with work. His highly accessible book lays out four simple principles for redesigning the world of work without email: the attention capital principle (treat attention as a valuable resource), the process principle (develop work processes that maximize the value generated from your attention), the protocol principle (structure work processes to optimize coordination between employees), and the specialization principle (permit employees to work on fewer things more deeply). Though changing our workplace email culture won’t be easy, Newport notes that it is “one of the most exciting and impactful challenges” we face today.
Author profile:
- Paul Barbagallo is a senior editor of strategy+business.