The divergence in economic growth rates is not a recent development, of course. It’s been more than 20 years since the term Eurosclerosis was coined to describe the furring of Europe’s economic arteries. Moreover, the average growth rate in European countries has slowed each decade since the 1970s, falling well behind U.S. growth. By the second half of the 1990s, Europe averaged annual growth of 2.6 percent, compared with 4.1 percent in the United States, which only encouraged more breathless millennial hype about America’s New Economy — the shorthand for technology-driven, turbocharged growth without inflation.
The bursting of the dot-com bubble and the U.S. recession in 2001 seemed to put an end to the New Economy mania, and some Europeans greeted these events with a sense of relief. For them, surely this also meant an end to harangues about “why can’t Europe be more like America and share in the productivity miracle being brought about by computers?” The respite was only temporary, it turns out. Americans may be haranguing their European counterparts less these days, but the competitiveness gap remains. Europe is still struggling economically, with the biggest continental economies — France, Germany, and Italy — continuing to stagnate. Meanwhile, the U.S. economy has bounced back and, what’s more, has posted consistently impressive productivity growth.
Several recent research papers and books from both sides of the Atlantic offer compelling statistics and explore the reasons for America’s continuing economic advantage, which have as much to do with trends in corporate organization and political choices as with investment in new technologies. They suggest the United States’ growth will continue to outpace Europe’s unless European businesspeople and political leaders accept the need for workplace change. But the research also examines why reform is so hard in Europe, where many people resist the darker side of the New Economy: rising income inequality.
A research report published in December 2004 by the Federal Reserve Bank of New York, which took the lead in documenting the effects of new information and communications technologies (ICTs) on productivity, confirms that America’s productivity growth is accelerating despite all the turmoil since early 2001. Between 1995 and 2003, growth in productivity, the key measure of the economy’s potential, ran at more than twice the average rate of earlier decades. In “Will the U.S. Productivity Resurgence Continue?” economists Dale W. Jorgenson, Mun S. Ho, and Kevin J. Stiroh even raised their earlier estimate of the rate at which private-sector output per worker can grow, from 2.2 to 2.6 percent a year.
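A 0.4-point revision to a trend growth rate may sound small, but compounding makes it consequential. A back-of-the-envelope sketch (my illustration, not a calculation from the paper) shows how the two trend rates diverge over two decades:

```python
# Illustrative only: compound the two trend estimates of output per
# worker (2.2% vs. 2.6% a year) over 20 years and compare the totals.

def compound_growth(rate: float, years: int) -> float:
    """Cumulative output multiple after `years` of steady annual growth."""
    return (1.0 + rate) ** years

old_trend = compound_growth(0.022, 20)  # roughly 1.55x after two decades
new_trend = compound_growth(0.026, 20)  # roughly 1.67x after two decades
print(f"2.2% trend: {old_trend:.2f}x; 2.6% trend: {new_trend:.2f}x")
```

At the revised rate, output per worker ends up about 8 percent higher after twenty years than it would at the old estimate, which is why small changes in trend estimates attract so much attention.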
About half of the improvement is attributable to business investment in new equipment, both high-tech computer gear and non-IT capital. Still, almost as much can be explained by improvements in the efficiency with which businesses use all their inputs, whether additional employees or other new investment. Economists call this “total factor productivity.” It includes the effect of technological progress, such as faster semiconductor speeds reflecting Moore’s Law (the doubling of computer power, and halving of its cost, roughly every 18 to 24 months). It also includes other sources of improvement in business efficiency, such as better organization and management, developed in response to greater competitive pressures.
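The 18-to-24-month doubling period compounds dramatically. As a quick illustrative sketch (my own arithmetic, not from the research cited), here is what that range implies over a decade:

```python
# Illustrative only: how Moore's Law doubling compounds over time.

def moores_law_factor(months: float, doubling_months: float) -> float:
    """Multiple of computing power per dollar after `months`,
    assuming power doubles every `doubling_months` months."""
    return 2.0 ** (months / doubling_months)

# Over ten years (120 months), the assumed doubling period matters a lot:
fast = moores_law_factor(120, 18)  # 18-month doubling: roughly 100x
slow = moores_law_factor(120, 24)  # 24-month doubling: exactly 32x
print(f"18-month doubling: {fast:.0f}x; 24-month doubling: {slow:.0f}x")
```

Either way, the cost of a unit of computing falls by well over an order of magnitude per decade, which helps explain why businesses kept investing even when the payoff in measured productivity was slow to arrive.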
This makes intuitive sense. Information technology can affect productivity only if companies use it effectively after it is installed, and that takes time. For many years, New Economy skeptics spoke of the “productivity paradox”: American businesses spent billions of dollars on computers during the 1980s, yet there was no sign of a pickup in the economy’s productivity until the mid-1990s. Eventually, competitive markets supplied the external pressure for businesses to make organizational changes, and those changes were the internal mechanism by which the adaptation finally occurred.