
Another Universe?

A review of Turing’s Cathedral, by George Dyson.

(originally published by Booz & Company)

Turing’s Cathedral: The Origins of the Digital Universe

by George Dyson
Pantheon, 2012

With Turing’s Cathedral: The Origins of the Digital Universe, George Dyson, a technology historian and builder of Aleut kayaks, extends his speculative theme that not one, but two universes exist: one inhabited by us and another inhabited by digital machines. As the preface of this important and challenging new book suggests, Dyson is offering us a “creation myth” for this other universe, one centered on MANIAC, one of the first computers, built at the Institute for Advanced Study (IAS) in Princeton in the years following World War II. It is a place and time with which the author is intimately connected: he grew up in Princeton, N.J., in the 1950s, while his father, physicist and mathematician Freeman Dyson, worked at IAS.

Some reviewers have taken issue with the younger Dyson’s version of history for barely mentioning British contributions to the development of the digital universe, such as the Williams tube, an early form of computer memory that was eventually used in MANIAC; the Manchester Baby, the world’s first stored-program computer; and even the title’s namesake, Alan Turing, a founding father of computer science. But by focusing on such details, these reviewers obscure the true significance of the story: the great achievement of the group at Princeton was software. It is software, not hardware, that can reproduce and modify itself, and code is what animates computers, making it the life force of the digital universe.

Whether or not we agree with Dyson’s dual-universe ideas (which he shares to varying degrees with other notable technologists, such as Kevin Kelly, cofounder and longtime editor of Wired magazine, and Ray Kurzweil, whose concept of the singularity captured the imagination of Google’s founders), we should take them seriously. Increasingly, business is being run by digital systems. Trading volume on the NYSE and other exchanges is now dominated by “algorithmic” trading: software talking to software. The work that Google and many other digital businesses do is also overwhelmingly based on algorithms, with few, if any, humans in the loop. Manufacturing productivity has skyrocketed because people no longer do the work themselves; instead, they service machines that are operated by complex software. Ten or 20 years from now, it will likely seem that software is running the world.

Dyson believes that elements such as computer viruses and the millions of computers in the Google cloud are “numerical organisms” and that they should be studied as a new form of biology. And he is dead serious when he uses the term digital universe, as he made clear in a recent interview on Edge.org: “We’re missing a tremendous opportunity. We’re asleep at the switch because it’s not a metaphor. In 1945 we actually did create a new universe. This is a universe of numbers with a life of their own that we only see in terms of what those numbers can do for us.” His impatience with human-centered thinking is also long-standing. It was reflected in the preface to his book Darwin among the Machines: The Evolution of Global Intelligence (Perseus, 1997): “In the game of life and evolution, there are three players at the table: human beings, nature, and machines. I am firmly on the side of nature. But nature, I suspect, is on the side of machines.”

The notion that our inventions take on a life of their own didn’t originate with Dyson. It was an important theme in the work of Marshall McLuhan, whose ideas experienced a revival when Kelly adopted him as Wired magazine’s original “patron saint.” For McLuhan, technology begins as an extension of ourselves, as the subtitle of his book Understanding Media: The Extensions of Man (McGraw-Hill, 1964) suggests, but it can then “self-amputate” and take on an independent existence. In the process, we can lose our apparent mastery over technology and become deeply shaped by our own inventions. McLuhan did not propose that new universes were created, however, even though he allowed that the effects of sweeping technological change could be “cosmic.” Nor did he see our loss of control over technology as inevitable; for him, the outcome depended on our willingness and ability to understand what was happening to us.

Another notable predecessor in the conception of a digital universe was MIT-based mathematician Norbert Wiener, among the few comparable in stature to Princeton’s John von Neumann. Both are key characters in Turing’s Cathedral. Wiener was deeply concerned about the effects of new technologies on humans. In 1964, when asked in a U.S. News & World Report interview whether there was any danger that machines would someday get the upper hand over humans, he replied, “There is, definitely, that danger if we don’t take a realistic attitude. The danger is essentially intellectual laziness. Some people have been so bamboozled by the word machine that they don’t realize what can be done and what cannot be done with machines — and what can be left, and what cannot be left, to the human beings.”

The same danger exists for executives today. You have to know how to manage the code that ultimately defines your company’s output and, in many ways, its success. What might have appeared to be a simple question of automation in the 1950s is now a vastly more complex task of figuring out how these apparently living systems will interact with one another. Dyson doesn’t offer prescriptive advice, but his message that much more effort is needed to study the biology, the physics, and even the anthropology of code is critically important, because we can’t manage what we don’t understand. Without this understanding, we don’t have a chance of remaining competitive in the digital economy.

Author profile:

  • Mark Stahlman is a technology strategist and founder of TMT Strategies LLC, a research company focused on the impact of digital technology on the economy and society.