After reading Mr. Friedman, James Howard Kunstler’s The Long Emergency: Surviving the Converging Catastrophes of the Twenty-First Century (Atlantic Monthly Press, 2005) is a bucket of cold water in the face. If Mr. Kunstler is correct, we’re sliding down the ugly slope of the petroleum production peak and all hell is likely to break loose as a result, rendering Mr. Friedman’s picture of the world obsolete. Industrial civilization’s inability to come up with energy replacements for petroleum could render Globalization 3.0 as outdated as the dot-com bubble. Take a look at the trend in oil prices over the past few years and it’s hard not to pay attention to Mr. Kunstler’s argument. Even if the catastrophe Mr. Kunstler fears is averted, he asks questions that are difficult to ignore about the local impact of globalization and its perhaps unsustainable driver.
Mr. Kunstler’s concerns are twofold: First, he believes that the “Hubbert Peak” (the point at which total oil production begins to decline) has been reached, and that “the fossil fuel efflorescence was a one-shot deal for the human race.” Second, he fears that the economic system that grew in the era of cheap energy, and that undergirds the globalized world that Mr. Friedman describes, will be unable to function as it does now: “The so-called global economy was not a permanent institution, as some seem to believe it was, but a set of transient circumstances peculiar to a certain time: the Indian summer of the fossil fuel era.”
World-girdling, hyperefficient, delocalizing supply chains, Mr. Kunstler asserts, have sucked capital, civic association, and know-how out of communities. Now those communities are unprepared for a world in which local self-reliance will become important because the global support infrastructure will no longer be affordable: “Conditions over the past two decades made possible the consolidation of retail trade by a handful of predatory, opportunistic corporations, of which Wal-Mart is arguably the epitome.… In effect, Americans threw away their communities in order to save a few dollars on hair dryers and plastic food storage tubs, never stopping to reflect on what they were destroying.”
Mr. Kunstler’s phrase “the Long Emergency” is both the title for his book and the name of the scenario he thinks is most likely over the rest of the 21st century. If you accept the author’s argument about the impending end of cheap fossil fuels as plausible, his warning about the consequences is sobering: “Virtually all of the economic relationships among persons, nations, institutions, and things that we have taken for granted as permanent will be radically changed during the Long Emergency. Life will become intensely and increasingly local.”
Even if the world’s economies and societies dodge or avert that catastrophe, there are other huge impending changes to worry about. There’s the matter, for instance, of what will happen when scientists crack the secrets of evolution and intelligence. This is examined in Joel Garreau’s Radical Evolution: The Promise and Peril of Enhancing Our Minds, Our Bodies — and What It Means To Be Human (Doubleday, 2005). Whether you regard the author’s far-reaching vision of decades to come as a disaster warning or a utopian forecast depends on what you think “human” ought to mean. The possibilities that technological innovation may make practical in this century, from germ-cell-line engineering that gives your great-grandchildren extra brain cells, more muscle power, or, for that matter, gills, to superhuman computer intelligence, pose questions that used to be confined to philosophy, theology, and fiction.
Mr. Garreau is a zesty storyteller, a gonzo futurist who builds alternative universes from solid science, and he’s neither a technology booster nor a Luddite. The questions his moral quandaries raise are among the deepest questions we know how to ask: What kind of creatures are we — the apelike animals from which we evolved, or the angels we imagine we can become? If we accept the Darwinian explanation of our origins, where do we want to go next, now that we’re harnessing the very engines of evolution? Is there a “too far” for biotechnology, nanotechnology, and artificial intelligence? And what would anyone be able to do about it if there really is a line that technology shouldn’t cross — a line that could mean the end of Homo sapiens?