Even as new technologies revolutionize everything from health care to media to warfare, it’s important to remember that our world runs primarily on products and technologies long in use — everything from aspirin to the internal combustion engine. In his new book, The Shock of the Old: Technology and Global History since 1900, David Edgerton, the Hans Rausing Professor of Science, Technology, and Medicine at Imperial College London, argues for a new view of the history of technology that focuses not only on “Big Bang” innovations — Sony’s Walkman or a future spaceship that could carry humans to Mars — but also on incremental change and how societies use the technologies they invent or, just as important, borrow. As Edgerton writes, we must shift our focus “from the new to the old, the big to the small, the spectacular to the mundane, the masculine to the feminine, the rich to the poor.”
Such a history would reveal, he says, that many of the fastest-growing countries of the 20th century, such as Japan and China, were not the most inventive. Indeed, most growth comes as a result of technology transfer, not of innovation. That understanding has significant consequences for trade policy in a global economy. For example, if a country such as the U.S. demands strict intellectual property protections from another nation before agreeing to trade with it, to what degree does that stymie technology transfer and the flowering of new products that often result? By reorienting the way we view technology, we can also better understand whether we’re making wise decisions in adopting new technologies, such as genetically modified crops — especially when there are often less expensive alternatives that, although perhaps not as innovative, may be less risky.
What’s wrong with our current understanding of technology?
EDGERTON: When we think about technology, we immediately think about invention and innovation and the future, and not about how things come into use. We’re always so enthusiastic about what’s going to happen in five or 10 years’ time. But we lack an explicit history of technology, by which I mean a history of the vast number of products that are in use at any particular time — as well as a history of innovation, outlining all of the inventions (large and small) from a particular period. Instead we have an unsatisfactory mixture of the two, leaving us with little more than excitable descriptions of the early life of some of the earth-shattering technologies that later became widely used. For example, we have a rich history of the development of the atom bomb during World War II, but there has been little or no discussion of the critical role of the horse during the war; Hitler’s army used more horses in its invasion of Russia than Napoleon did 130 years earlier.
That’s a major gap, because if we’re interested in the relationship between technology and society, we need to know what’s in use and what advances are being made throughout a culture at any particular time. And it is just as important to understand the inventions that failed as it is to study those that succeeded. In fact, the majority of all inventions fail, and they fail for a reason. The Concorde, for example, was an economic failure: a dreadful waste of money. We must recognize such failures in order to decide whether there is any basis for producing another Concorde.
Describe your sense of the history of technology. Is it a conservative view?
EDGERTON: I know I could be interpreted as arguing that the new doesn’t matter and is overrated, while the old stuff is still hugely important. But I don’t really want to argue that. I insist that there are very, very many new things under the sun. So I do want to emphasize and, indeed, celebrate rapid change, but what I’m very hostile to is the celebration of pseudo-change or the cult of future change.
Many people who talk about invention and innovation actually want to keep the world very much as it is. These people argue for the importance of technical change in the future so as to avoid making any changes now. An example of that would be global warming. People talk about the necessity of investing in technological solutions that will come into play in five, 10, or 20 years’ time as a way of avoiding the kinds of actions we could take now.
Your new book includes a table listing the top corporate spenders in R&D, the point being that they’re mostly very old companies. Why is that?
EDGERTON: The top five innovation spenders for 2003 were Ford Motor Company, Pfizer Inc., DaimlerChrysler AG, Siemens AG, and Toyota Motor Corporation. All but Toyota were in business before 1914. There’s extraordinary stability at the top of the list, and it’s certainly not the biotech and the IT companies that are up there, even though you’d expect that to be the case if you were listening to the hype of the futurists and innovation cheerleaders.
Why are several car companies at the top? Well, it costs a lot of money to develop a new car. Most people think that research expenditures are driven by the nature of the technologies themselves and their productivity. But that’s not the case. R&D levels are determined by what people want to pay for. And if people still want to pay for new cars — an invention running on an idea from 100 years ago — you’re going to have a lot of R&D in that area.
What does that tell us about innovation on a larger scale?
EDGERTON: I’ve argued that there is no positive correlation between levels of research spending and levels of economic growth. Most discussion of science policy by serious people is predicated on the belief that there is such a positive correlation, that if countries spend more on innovation, they get back everything they spend and more in return. In fact, if anything, there is a negative correlation, at least for fast-growing countries.
That’s because most technology in newly developing countries comes from abroad. So countries that are quick to adopt technology can benefit a great deal in terms of growth. I can think of only one important exception to that rule: the U.S. in the middle of the 20th century. But that was a very exceptional case, and the U.S. accounted for something like 50 percent of world production and the same or more of world inventiveness. That’s no longer the case, of course.
What can that tell us about the role of corporate R&D in the innovation process?
EDGERTON: There’s a lot of mythmaking about what R&D, even at universities, was like in the past. People exaggerate the extent to which blue-skies work dominated research. Now, that’s not to say that research with no particular object in mind has not been important. But thinking about actual uses of technology has always been much more important in research than people have made out.
That’s why it’s important to bear in mind that a lot of inventions emerge from use. The transistor didn’t come out of Bell Labs simply because they were doing blue-skies research. They had a long tradition of thinking about rectifiers and semiconductors. Yes, it involved a lot of creative thinking, but in a context in which people were immersed in a world of use of particular technologies.
We think we know what innovation means, but to study it, we rely on measures that don’t actually tell you much about innovation or invention. Consider patents. What are patents? They’re legal documents. They’re not in themselves a measure of inventiveness. Research and development spending is a measure of how much you spend on research and development. It doesn’t tell you anything about outputs.
A lot of people talk about the need for a new kind of business model: the spinout from universities and the entrepreneurial small firm. That’s been the talk of the last 20 or 30 years, has it not? And yet, here we have IBM, which is a terrifically old company and has gone through many generations of technologies, but it is obviously the same corporate entity, the same high-level innovator. There are lots of different kinds of successful innovating firms.
One way of putting it is that the danger is not just in believing that there has been one model in the past and there should be one model in the future. It’s also in believing that we actually understand the history of technology — of the processes of discovery, of what was in use, and of invention.
Edward Baker (email@example.com), former editor of CIO Insight magazine, is a contributing editor to strategy+business.