The main difference between computing and the industrial technologies Mr. Carr cites is that the stored-program computer is a “universal” tool, which can be programmed to do all manner of tasks. The general-purpose nature of computing — especially software, a medium without material constraints — makes it more like biology than like railroads or electricity. It can evolve and take new forms. Speech recognition, natural language processing, and self-healing systems are just three of the next evolutionary steps on the computing horizon.
Mr. Carr might dismiss such comments as romanticized nonsense — and he certainly could be right. Yet understanding the nature of the technology is crucial to determining whether computing is truly graying or, more likely, whether some parts of the industry are maturing while new forms emerge further up the computing food chain. Are we seeing old age — or merely the end of one stage in a continuing cycle of renewal?
Mr. Carr notes that the technology bubble of the 1990s resembled the booms and busts of railway and telegraph investment, which marked the passing of youthful exuberance in those industries. In the computer industry, however, there had already been two previous boom-and-bust cycles — in the late 1960s, when mainframe time-sharing services appeared to be the computing utilities of their day, and in the mid-1980s, when legions of personal computer companies were founded and soon perished. Again, the pattern seems to be cyclical and evolutionary, as innovations accumulate and eventually cross a threshold, opening doors to broader market opportunities.
Let’s take one potential example, Web services. The nerdy term refers to the software protocols that could allow a new stage of automation as data and applications become able to communicate with each other over the Internet. More broadly, Web services are seen as the building blocks of a new “services-based architecture” for computing. Mr. Carr briskly brushes Web services into his “vendorization” bucket. He writes, “Here, too, however, the technical innovations are coming from vendors, not users.” The vendors — IBM, Microsoft, Sun Microsystems, and others — are working jointly only on the alphabet soup of software protocols: XML, UDDI, WSDL, and so on.
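To make the idea concrete, here is a minimal sketch of the XML messaging those protocols standardize: one program composes a SOAP-style request envelope, and another parses it, with nothing shared between them but the agreed-upon format. The quote service, its operation name, and its fields are hypothetical, invented purely for illustration.

```python
# A sketch of applications communicating through structured XML messages,
# the mechanism underlying Web services protocols such as SOAP and WSDL.
# The "GetStockQuote" operation and its fields are hypothetical examples.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_request(symbol):
    """The calling program composes a SOAP-style envelope as plain XML."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, "GetStockQuote")
    ET.SubElement(call, "Symbol").text = symbol
    return ET.tostring(envelope, encoding="unicode")

def parse_request(xml_text):
    """The receiving program reads the message with no shared code,
    only the shared protocol — the point of the vendors' joint work."""
    root = ET.fromstring(xml_text)
    return root.find(f"{{{SOAP_NS}}}Body/GetStockQuote/Symbol").text

message = build_request("IBM")
print(parse_request(message))  # prints "IBM"
```

The two sides could be written by different companies in different languages; only the XML wire format is common to both, which is why the protocol layer is where the vendors cooperate rather than compete.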
Yet when technologists talk of a services-based architecture, they are speaking of a new computing platform that they see as the next big evolutionary step in decentralizing the means and tools of innovation — much as the minicomputer was a new platform that decentralized computing away from the mainframe, and then the personal computer put power in many more users’ hands. Computer scientists regard the Web as, in a sense, a “dumb” medium. It is, to be sure, a truly remarkable low-cost communications tool for search, discovery, and transactions, but the Web is mostly raw infrastructure because it is not very programmable. Web services hold the promise of making the Internet a programmable computing platform, which is where differentiation and potentially strategic advantage lie.
I cite this as only one example of where Mr. Carr’s desire to fit everything neatly into his thesis leads him astray. There are others. He mentions Linux, and its adoption by Internet pacesetters such as Google and Amazon, as proof that commodity technology is plenty good enough for almost any need. Linux, the open source operating system, does allow those companies to build vast computing power plants on low-cost hardware from the PC industry. But the other great appeal of Linux — and open source software in general — is that it also frees those companies from the vendors. The rocket scientists at Google and Amazon can tweak the software and change it without seeking permission from Microsoft or Sun Microsystems or anyone else. Today, Google is both a brand name and a verb. But technological differentiation has been the bedrock of its comparative advantage. It is the better mousetrap in Internet search. As an example, Google undermines, rather than supports, Mr. Carr’s point.