
Is Genius Enough?

Britain had it all — brains, ideas, and inventions like radar and penicillin — but the U.S. brought them to market. The lesson is sobering: Native brilliance needs a national backup drive.

(originally published by Booz & Company)

Take a moment from riding the Nasdaq roller coaster for a risk-free technology quiz....

Which company was the first commercial underwriter of electronic computers and where? What was the starting point of the first passenger-carrying hovercraft? When was the first successful trial of the jet engine and where? Which country pioneered radar? Where was the first antibiotic discovered? Where was the first theater to use incandescent lighting? And what was the nationality of the man who wrote the basic code for the World Wide Web?

Are those your final answers? If they are correct, you should experience a million-dollar glow, because you also have figured out the connecting thread.

All these innovations had their origins not in America, Japan, or Germany, but in Britain. To fill in the blanks: It was the British tea company J. Lyons & Company, of the corner-house cafes, that reaped the commercial benefit of first financing computer research in 1947. Maurice Wilkes and his engineers at Cambridge University developed Leo (Lyons Electronic Office), based on Wilkes's Electronic Delay Storage Automatic Calculator (EDSAC), the prototype stored-program computer. Leo began doing clerical work in 1951.

The first hovercraft, invented by Christopher Cockerell in 1955, carried passengers across the Channel from England to France in 1959. The father of the jet engine was Frank Whittle, a Royal Air Force officer who, in 1930, when he was only 22, designed and patented a gas turbine to produce jet propulsion; the first ear-blasting run was made in Rugby, England, on April 12, 1937. Robert A. Watson-Watt tested his radar system on the Suffolk coast in 1935; five years later, it helped to win the Battle of Britain.

Most people, I daresay, will have rightly identified the founder of antibiotic therapy as Alexander Fleming, who discovered penicillin at St. Mary's Hospital Medical School in London in 1928. Most will also identify Thomas Edison as the provider of the theater lighting; if so, most of you are wrong. It was Joseph Wilson Swan who illuminated the Savoy Theatre in London with 1,200 of his incandescent lamps in October 1881, three years after he developed the all-glass hermetically sealed bulb. Edison and Swan settled their patent disputes by a merger in 1883. Finally, Tim Berners-Lee, who created the first World Wide Web software in November 1990, is British (although he now works at MIT).

The point of this quiz is not to wave the Union Jack. Rather, the opposite. Except for the hovercraft, British industry, finance, and government failed to exploit the genius of their nation's scientists and inventors. America picked up the dropped balls and ran for touchdowns. To make penicillin available to the masses, Oxford's Howard Florey and Norman Heatley had to come to the U.S. to achieve large-scale production in time for the D-Day landings. The inertia of the British Air Ministry and the skepticism of the National Academy of Sciences delayed production of the Whittle Meteor jet fighter plane until 1943, six years after the engine's first test. The designs were given to the United States — where they were swiftly used for Bell Aircraft's experimental P-59 Airacomet — and the U.S. came to dominate jet engine manufacture. Whittle, impressed by America's openness and enthusiasm for his achievements, ended up as yet another enriching immigrant to the United States, finally working as a research professor at the U.S. Naval Academy.

Precise identifications of the national origins of invention are tricky and clouded by chauvinism (try searching "jet engine" on the Web). As Peter Hall and Paschal Preston observed in their study of diffusion, The Carrier Wave (Unwin Hyman, 1988), English-language histories tend to give undue weight to American and British innovations and to downplay German and other contributions. Similar self-congratulation appears in German and French histories. Still, it is fairly clear that as early as the last two decades of the 19th century, Britain, the workshop of the world, was losing its innovative momentum.

American enterprise was already a matter of concern. A British turn-of-the-century commentator, writing under the headline "The American Invaders," complained: "The average citizen wakes in the morning at the sound of an American alarm clock; rises from his New England sheets, and shaves with his New York soap, and Yankee safety razor. He pulls on a pair of Boston boots over his socks from West Carolina [sic], fastens his Connecticut braces, slips his Waterbury watch into his pocket and sits down to breakfast...rising from his breakfast table, the citizen rushes out, catches an electric tram, made in New York, to Shepherds Bush, where he gets into the Yankee elevator. At his office of course everything is American. He sits on a Nebraskan swivel chair, before a Michigan roll top desk, writes his letters on a Syracuse typewriter, signing them with a New York fountain pen, and drying them with a blotting sheet from New England. The letter copies are put away in files manufactured in Grand Rapids."

If that was already the perception at the turn of the century, why did the Brits allow American dominance not merely to continue but to accelerate? They couldn't help it, not in the short term, anyway. Britain was caught on the wrong side of a long-wave movement in which, over decades, America was able to exploit high-technology innovations within the big, multi-unit-managed corporation so well documented in Alfred Chandler's The Visible Hand (1977), to create whole new industries.

Innovative Waves

Economists and historians debate the nature, geographic incidence, and duration of the long waves of innovation. In the 1980s, it seemed that "Japan Inc." was surfing a long wave, all set to do to America what America had done to Britain. Even as recently as 1987, Business Week published a special supplement entitled "Can America Compete?" It made exactly the same point that critics had for years been making about Britain's apparent inability to exploit its inventions: "The Japanese created a manufacturing infrastructure that can respond with blazing speed to market demands and changing opportunities." The implication was that what happened to Britain could happen to America at a similar zenith of its power. Victorian Britain, too, had seemed invincible. Why did it crumble when it had given birth to the Industrial Revolution and laid out the principles for marrying technology and industry? Does America, too, have termites in the basement?

The Japan Inc. scare today seems quaint — America at the beginning of the 21st century appears immune to economic challenge, soaring to a new summit of innovation and productivity. Yet this is a dangerous moment. The apex of the curve is the point where vulnerability begins. There is financial, emotional, intellectual, and social investment in the infrastructure of success. It can lead to inertia. Confidence shades into complacency. People become less amenable to new ideas. If it ain't broke, don't fix it.

In short, technological success has within itself the impetus for its own destruction. Consider the rise and fall of the U.S. steel industry. Consider the resistance of RCA to Edwin Armstrong's invention of FM radio. The moment of ascendancy is precisely the moment to develop what does not now exist, and systematically to identify new technologies and encourage their diffusion.

Even with the few examples in the opening quiz, it is obvious that, at Britain's economic apex, there was a grave failure of commercial follow-through. But there was a deeper problem than lack of energy in marketing and innovation in industrial organization: British inventiveness itself had dried up. This competitive weakness began to show at the turn of the 20th century, and it persisted. It manifested itself strikingly in the IT industries. From the 1960s on, British inventors won a smaller and smaller share of U.S. patents. In the sample years 1966, 1970, 1974, 1978, and 1982, American inventors won an average of 45,500 patents a year. The British average was 2,725 a year; the German, 5,100; and the Japanese, 4,900. Too many Brits hanging out in their garages were not Wozniaks and Hewletts; they were washing their cars.

It is true the British establishment is especially gifted at putting people down. The National Academy of Sciences issued a lofty rebuke to Whittle, a mere engineer, when it concluded that "the present internal combustion engine used in airplanes weighs about 1.1 pound per horsepower, and to approach such a figure with a gas turbine seems beyond the realm of possibility with existing materials." As Whittle said later, "Good thing I was too stupid to know this." Yet there is no evidence that individual inventors wilted in the face of an increase in the gross national product of such skeptics and scoffers, or that British brains were punished by some celestial discrimination. No, the fall-off in recorded inventiveness was of such a scale and duration as to suggest more sweeping forces than pique were at work. The failure has, I think, four roots.

1. Scale. Despite the Sherman Antitrust Act, big American companies merged and became bigger. Their market was a continent, and it was long protected by high tariffs. Market size enabled America to lead in computers, once it woke up to the potential after 1950. Mass production made standardization imperative. By contrast, British industry — relatively small, fragmented, and vulnerable to foreign competition in its own market — lost economies of scale; American-style standardization was rare. The City of London financiers were not big on mergers. And layers of middlemen between producers and customers indulged the British talent for eccentricity. For instance, until 1928, when the government intervened, lamp companies were producing 1,000 different types of lamps just to meet variations in local voltage. Larger American companies with lower unit costs had more profit to plow into R&D, and thereby more money to pay technologists and, in turn, teachers of technology.

2. Education. Although the quality of teaching in pure science and technology was high in Britain, it was available to few. The U.S. went in for mass technical education from early on. In the U.K., generations of talented individuals were held back, at least until the 1960s, by the authorities' conviction that only some 4 percent of the population was fit for higher education, and that it was best granted to the better-off 4 percent, give or take the freak working-class genius. After World War II, the government paid the tuition for some ex-servicemen, but only those whose academic careers had been interrupted. There was nothing comparable to the open-ended GI Bill. U.S. elitists also ridiculed the notion of millions of ordinary men being fit for scholarship, but those veterans helped America soar in the 1950s and '60s. In Britain, in the '50s, I well recall the outrage when more universities were proposed. The cry went up in the organs of the establishment: "More means worse!" In government, in the City, and in education, there was a general disdain for science. In 1956, the government said it would spend £10 million over 10 years to expand technical education, but then it set its face against allowing any college of technology to give degrees.

3. Unions. Throughout the last century, trade unions in Britain hunkered down, protecting both jobs and ways of doing those jobs irrespective of scientific advances. The trade unions in America could be difficult, but they had a broader view. In the '70s, I was a member of the board of the leading British newspaper company, Times Newspapers, when it tried to introduce "new technology" — computerized typesetting. We failed. As the years of negotiation went on, "new" became a running joke. The machines stayed in their shipping boxes. Every effort at conciliation ended in wildcat stoppages. This was more than a local issue. There was a chance that very considerable British skills in editing and design could be married to software talent to produce systems with export potential. Yet nothing came of it. Even the journalists, in the name of solidarity, were reluctant to touch their "brother's" keyboard. In the end, an innovative new foreign owner, Rupert Murdoch, deployed the necessary cunning and courage to confront the print unions with a fait accompli. But by then, a decade of R&D had been wasted.

4. R&D. Britain relied too much on the brilliance of individuals, such as the three W's of the quiz — Whittle, Watson-Watt, and Wilkes. Although America has celebrated the legend of the solitary inventor, the U.S. has been more systematic in organizing collaborative effort, whether through the state or private enterprise, creating arenas for the swift exchange and testing of ideas. The lone patent-peddling genius cannot match the focused, concerted effort of a corporate research lab, with its teams of specialized physicists, engineers, and inventors, backed by an army of lawyers to fight the inevitable patent litigation.

The history of radar is telling. It is one of the few examples in which the British government did direct an organized effort in research and manufacturing. Early in 1940, two Birmingham University scientists built on Watson-Watt's work to develop the cavity magnetron, a thermionic tube that would allow pilots in combat to detect enemy planes. The U.K. had left all its competitors (and enemies) behind, but making the magnetron widely effective required a large team of researchers to develop new magnetic alloys, new high-voltage electronic switches, new types of conductors, better cathode-ray tubes for display, and more reliable transformers, resistors, and capacitors.

The British had to come across the Atlantic in 1940, initially carrying the secret magnetron in a small box for tests at Bell Labs. America was not yet in the war, but the U.S. government acted swiftly to set up the Radiation Laboratory at MIT. Thirty labs were commissioned to research semiconductors. By 1945, the U.S. had 4,000 researchers on radar, with a rich spin-off for the foundation of the electronics industry, and the British originators eventually fell behind.

In Britain, as a young science reporter in Manchester, I was at the other end of the telescope. In 1955, I visited the government-funded National Physical Laboratory (NPL). It was doing exciting work developing an electromagnetic robot typewriter. It had just built a couple of computer engines, Ace and Deuce. Its scientists were brilliant. Watson-Watt was a senior researcher at the NPL when he made his breakthrough on radar. But the NPL was thinly staffed. They were doing what they could on a budget of only £900,000 a year and were worrying that even that was under threat. The U.S. government spent billions of dollars on industrial R&D, but it cannily disguised it as defense research.

If state investment in R&D and education has been more wisely directed in the U.S. than in Britain, British industry can hardly complain. It has never given R&D anything like the priority its competitors have. In 1933, the top 160 American firms had in-house research labs; in Britain in 1936, only 20 of the largest 200 firms had such labs. The trend continued in the 1980s, when British R&D spending relative to GDP fell below that of every industrial country except France. The contrast with the focus and energy of American business is remarkable.

Perhaps the most brilliant innovation in the business of innovation was Bell Laboratories. Bell coordinated advances in theoretical physics, metallurgy, and design engineering to develop transistor technology, and then in 1952 made the remarkable decision to share its secrets with the international community. The propagation of Mother Bell's Cookbook, which set out the techniques and possibilities of the transistor, and the Bell Symposium of 1952 produced a new industry within the decade (in addition to a revolution in science teaching, with prolific feedback from academia).

The Bell story suggests a final question. Those of you who scored 100 percent on the opening hors d'oeuvre can take this for an entrée. Bell Labs' unique diffusion of transistor technology was carried out to deflect a threat from the U.S. Department of Justice over allegedly monopolistic control of patented inventions. Could there be any comparable fallout from Justice's threatened breakup of Microsoft?

Reprint No. 00303


Authors
Harold Evans, evans_harold@strategy-business.com
Harold Evans, author of The American Century (Alfred A. Knopf, 1998), was the editor of The Times (London) and The Sunday Times, and the president and publisher of Random House trade group from 1990 to 1997. Mr. Evans is a contributing editor of U.S. News & World Report, and is working on two illustrated American history books: The Innovators and We the People.