Management in the Second Machine Age
Future leaders will succeed by being entrepreneurial and by rethinking the balance between financial and social goals.
The developed world stands on the cusp of a major transformation unlike anything experienced since the Gilded Age. That period (the name was coined by Mark Twain and his neighbor Charles Dudley Warner in their 1873 satire of the times) brought immense economic expansion and wealth creation in the United States, but it also profoundly disrupted the occupational mix of the citizenry and triggered a wave of social upheavals. Entrepreneurs built colossal businesses while laborers shifted from farms to factories.
Erik Brynjolfsson and Andrew McAfee of MIT have coined a new term for the coming era in the title of their latest book, The Second Machine Age (W.W. Norton, 2014). They chronicle the advance of Moore’s law (the seemingly inexorable doubling of microprocessor power every 18 months to two years over the last half-century) as it yields technological advances such as autonomous cars that easily conquer the complex task of driving.
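To put that compounding in perspective (the arithmetic below is my own illustration, not the authors’ calculation), a doubling every two years sustained over 50 years multiplies capability roughly 33 million times, and a doubling every 18 months pushes the multiple past 10 billion:

```latex
2^{50/2} = 2^{25} \approx 3.4 \times 10^{7}
\qquad \text{versus} \qquad
2^{50/1.5} \approx 2^{33.3} \approx 1.1 \times 10^{10}
```

Exponential growth of that magnitude helps explain how a task deemed beyond computers only a decade earlier, such as driving, could be conquered so quickly.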
More importantly, the authors note that as recently as 2004, respected scholars had highlighted driving as an example of a task too complicated for computers, one inherently requiring human capability. Even a casual scan of the traditional media uncovers further examples of this ferocious progress, including running robots developed for the military and a computer program that inferred Newton’s second law of motion from the movement of a double pendulum, a device whose motion appears chaotic to the human eye.
Although no one can confidently predict how this new age will unfold, most economists take an optimistic view: They believe that the positive economic effects will offset the inevitable disruption of employment. Those in traditional blue-collar occupations face dislocation as computers take over jobs such as truck driving, factory work, call center support, and even burger flipping. But just as our rural, agrarian society eventually settled into an urban, manufacturing economy, the optimists argue, this disruption will ultimately yield a stronger economy and a better standard of living.
The obvious historical transition from farm to factory offers hope, but that account misses a less obvious transformation that occurred over the last century. A fresh look at the U.S. census data reveals that the big shift in the 20th century wasn’t all about labor. In fact, there was a huge shift into managerial occupations. And today it’s not just the working class that faces disruption, but the managerial class as well.
Businesses today are drawing upon smart machines using statistical models to cull valuable insights from the exabytes of new digital information created daily. Machines such as IBM’s Jeopardy-winning Watson are being trained to displace highly trained experts as disparate as medical diagnosticians, financial advisors, and professional chefs. Smarter machines will reduce the number of traditional management jobs in the second machine age and force a change in both the practice and philosophy of management for the millennials poised to become the next generation of managers.
An examination of the Gilded Age offers two lessons for the coming disruption. First, managers must become entrepreneurial again: Number-crunching computers will replace number-crunching managers. Second, the new generation of managers must address the social challenges of the emerging disruption. Unlike the entrepreneurs of the Gilded Age, they should incorporate a social mission into their definition of business success, rather than making philanthropic gestures following the achievement of success.
The Rise of the Manager
The 1920 U.S. census documented the occupational mix of the country’s 41 million working citizens within a hierarchy defined by industry and role. For example, 12.8 million people were employed in the “manufacturing and mechanical industries” and nearly 11 million were employed in “agriculture, forestry, and animal husbandry.” Those results already reflected a critical economic transition from the 1910 census. Merely a decade earlier, more U.S. workers—12.7 million, representing nearly a third of the workforce—had been employed in agriculture, forestry, and animal husbandry. But during the second decade of the 20th century, manufacturing employment grew 21 percent in the United States while the more traditional agriculture-related segment dropped 13 percent. Of course, we all know that over the following century U.S. employment in manufacturing continued to grow, then ultimately waned as the U.S. economy shifted to services in the 1980s.
That’s the conventional wisdom, but it misses an important reality. A look at the latest occupational census data shows a more fundamental shift. The modern U.S. economy isn’t based so much on service as it is on management. At 38 percent of total employment, “management, professional, and related occupations” was the largest occupational employment category for the 142.5 million people classified by the 2012 census. Accounting for more than 30 percent of employment a century ago, the “farming, fishing, and forestry occupations” category now accounts for less than 1 percent of employment and is subsumed under the broader category of “natural resources, construction, and maintenance occupations,” which in aggregate accounts for only 9 percent of employment.
In fairness, the large management category includes professionals and those working in financial operations. The subcategory of “managerial occupations” totals only 16 million people—of which 1.5 million are classified as “chief executives.” But let’s put that into perspective: Chief executives now outnumber the entire U.S. workforce of farmers, fishermen, and foresters by about 50 percent.
Lessons of the Gilded Age
In the Gilded Age, there really were no professional managers running businesses. Instead, entrepreneurs with no formal education in management used their intuition to build business empires based upon creative ideas. Consider the case of Andrew Carnegie. Born to a working-class Scottish family, Carnegie first worked in a Pittsburgh cotton factory—12 hours a day, six days a week, at age 13—before becoming a telegraph messenger boy at age 15 and eventually a telegraph operator at age 18. From there, he advanced through the railroad industry and began accumulating wealth through savvy investing. He turned his attention to the steel industry in 1864 and eventually built a business empire by adopting the vastly more efficient Bessemer steel-making process and vertical integration. By the end of the 19th century, the U.S. dominated global steel production and Carnegie ran the largest and most efficient steel company in the world.
Carnegie faced a challenge, however. His empire had scaled beyond his personal capacity to manage it. U.S. financier John Pierpont (J.P.) Morgan offered a solution: a transfer of power from owner–entrepreneurs to professionally managed, publicly traded companies. In 1901, Morgan merged Carnegie’s steel empire with other players to form the United States Steel Corporation. Now the world’s richest man, the 66-year-old Carnegie turned his attention full-time to philanthropy. He advocated for, and demonstrated by example, “the gospel of wealth,” arguing that the rich had a moral obligation to use their wealth for the good of society. Over the course of his lifetime, Carnegie ultimately gave away US$350 million, part of which went to fund more than 2,800 public libraries.
Noticing that a growing number of his graduates were entering the world of commerce, the president of Dartmouth College, William Jewett Tucker, approached Dartmouth alumnus Edward Tuck with the idea of creating the first graduate school in commerce. Tuck, a successful banker, donated several hundred thousand dollars’ worth of railroad stock in 1900 to found the Amos Tuck School in honor of his father. In 1908, Harvard invented the master of business administration degree and created the Harvard Graduate School of Business Administration with a faculty of 15, attracting 80 students.
Frederick W. Taylor, the father of scientific management, played a role at both Tuck and Harvard in their early years. Born to a wealthy Quaker family, Taylor decided to forgo a planned path from Phillips Exeter Academy to Harvard in 1874 and instead started his career as a manufacturing laborer. He advanced through a series of positions while pursuing a degree in mechanical engineering and ultimately became the chief engineer of Midvale Steel, a company with only a single plant, but one known for an analytic management style. Taylor applied his own analytic skills to define the “one right way” to do each and every task, which led to a doubling of worker productivity. In 1890, he began consulting to show other companies how to drive such worker productivity. And by 1911, he had codified his philosophy in The Principles of Scientific Management. During his time as a consultant, Taylor also conducted research at Dartmouth and served as a professor at the new Tuck School. In addition, the dean of the new Harvard Business School recruited Taylor to create a foundational course in manufacturing and industrial organization, further establishing the importance of the analytic approach to management.
This focus on quantification provided needed control to the new empires that had exceeded the managerial capacity of their entrepreneurial founders. While successful entrepreneurs like Carnegie turned their attention to philanthropy, legions of less talented but professionally trained managers amassed data and depended on analysis to make up for their lack of creative insight. Their numbers-based, yet simplistic, search for “best practices” (to use the modern jargon) squeezed out creativity and ignored the disruptive effects that this impersonal mind-set had on laborers. Perhaps the transition from owner–entrepreneurs to professional managers was inevitable in an era driven by physical labor and scale economies. Looking ahead from today, however, a focus on “the numbers” means that management will increasingly be subsumed by computers. Future managers will need to use their creativity to challenge the constraints on both commercial success and social welfare.
Creating the Future
Taylor misused the term scientific, and frankly so do many scientists today. They tend to equate science with math, employing a reductionist mind-set that seeks to quantify everything. However, a truly scientific method applies a hypothesis-driven approach designed to eliminate flawed theories. No theory can ever be proved through the scientific method—only tested and then disproved or corroborated. As the philosopher of science Karl Popper famously put it, “No matter how many instances of white swans we may have observed, this does not justify the conclusion that all swans are white.”
Scientists tend to develop theories that explain how things came to be. For example, they use Darwin’s theory of evolution to explain how people came to cooperate with one another, and the big bang theory to explain how the universe, and eventually its stars, formed. Science also depends on controlled experiments to test theories. For example, a number of experiments over the past 50 years have provided evidence corroborating Albert Einstein’s general theory of relativity, and recent experiments at the $9 billion Large Hadron Collider in Switzerland have uncovered evidence of the existence of the Higgs boson, a fundamental particle predicted by the Standard Model of particle physics.
Management, on the other hand, doesn’t spend a lot of time worrying about how things came to be, and it doesn’t have the luxury of performing controlled experiments. A business strategy offers the managerial equivalent of a scientific theory. Managers need to develop hypotheses about what will work in the future in order to set the company’s current strategic direction. But instead of simply testing hypotheses, management must create the future. The future can’t be created (or even uncovered) by simply examining the past, even with the massive computer power employed in “big data” analyses. The strategic answer can’t be found in the numbers, not even in that central tool of the MBA: the net present value calculation. At the same time, managers can’t run a company on a set of untested hypotheses: The right business strategy requires both creativity and analysis.
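For reference, the calculation in question discounts a stream of forecast cash flows CF_t at a required rate of return r over a planning horizon of T periods:

```latex
\mathrm{NPV} = \sum_{t=0}^{T} \frac{CF_t}{(1+r)^{t}}
```

The formula itself is mechanical; the point above is that every CF_t fed into it is a hypothesis about the future, not a fact retrieved from historical data.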
The best managers use their intuition to form hypotheses based on a belief about why something occurs, not just on data demonstrating correlation. We uncover novel patterns by hypothesizing root causes, in effect tapping strategic models to explain why a particular pattern might emerge. Those hypotheses need to be explicitly articulated and tested before the company acts on them. As social psychologist Kurt Lewin proclaimed, “There is nothing so practical as a good theory.” Computers can analyze massive quantities of data and discover patterns by drawing on inferential statistics. But even big data computers don’t form the hypotheses needed to develop new strategies designed to break existing constraints and create new business models. Accordingly, managers who seek to break constraints and embrace a hypothesis-driven approach will not face extinction; they will instead create the future.
Breaking Constraints
Consider the case of Taiichi Ohno, the father of the Toyota production system. He didn’t infer his new paradigm from a big data analysis of historical patterns. He drew upon an analogy—the U.S. supermarket—to inform his intuition that the current system of “pushing” automobiles through mass production imposed inefficiencies that could be eliminated through a “pull system” producing cars in lot sizes of one.
In fact, analysis would have suggested his paradigm was impossible. Optimal lot sizes were defined by the changeover time of equipment, making a lot size of one infeasible because the huge presses in automotive manufacturing required 12-hour changeovers.
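That analysis would have rested on the textbook economic-lot-size logic descended from Taylor’s quantitative tradition (a standard formulation, offered here as an illustration rather than the specific calculation Toyota’s rivals used): with demand D per year, a cost S incurred for each changeover, and a holding cost H per unit per year, the cost-minimizing batch is

```latex
Q^{*} = \sqrt{\frac{2DS}{H}}
```

A 12-hour changeover makes S enormous, so the formula prescribes huge batches; drive changeovers down to minutes and Q* collapses toward Ohno’s lot size of one. Ohno’s insight was to attack the constraint behind the formula rather than optimize within it.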
Undeterred, Ohno sent his top industrial engineer, Shigeo Shingo, to benchmark the best in the world. Shingo learned that Volkswagen set the benchmark at six hours, but by incorporating other observations he managed to reduce Toyota’s press changeover time by 67 percent to four hours. Unimpressed, Ohno pushed Shingo to make changeovers in less than 10 minutes, not because he had data to justify such a target but because the approach he envisioned, which would become lean manufacturing, demanded it.
That extreme target forced Shingo to fundamentally rethink the production process, and by doing so he broke the key constraint to achieving Ohno’s vision. Today, manufacturing managers routinely dismiss the simplistic notions of “mass production,” and the world benefits from higher quality at lower cost, thanks to the creativity of Ohno and Shingo.
More recently, Blake Mycoskie of Toms Shoes sought to break the traditional constraint of Carnegie’s gospel of wealth. Rather than turning to philanthropy only after achieving financial success, Mycoskie integrated it into his entrepreneurial business model. During a vacation in Argentina in 2006, he noticed a local style of canvas slip-ons called alpargatas, which he began to wear. During that same trip, he spent time with a nonprofit organization helping poor children on the outskirts of Buenos Aires, children who often went barefoot. Integrating the two ideas, he formed Toms Shoes as a for-profit social enterprise designed to both make money and do good. Marketing the Argentine-inspired shoes to U.S. consumers on a “one-for-one” basis, he sold 10,000 pairs in six months and then distributed 10,000 free pairs to Argentine children in need. In 2011, Toms expanded its one-for-one business model to eyeglasses. And it recently announced plans to launch a coffee business that will fund clean water for the poor in coffee-growing regions of South America and Africa. Though not yet 40 years old, Mycoskie has reportedly amassed a multimillion-dollar personal fortune while his company has given away more than 1 million pairs of shoes.
Lean Startups
I won’t attempt to predict the future, but I have little doubt that it will bring dramatic change. Change is the only constant. The millennials, who are now reaching adulthood, grew up in the digital age and are already suffering some of its effects. Despite being well educated, this group faces high unemployment, massive student debt, and a less rosy economic future than either of the two previous generations faced. Yet according to a recent Pew survey, although millennials are less trusting of other people in general, they have great confidence that the future will be better than the past.
I see positive signs that this new generation is embracing the hypothesis-driven approach in order to break constraints and build new business models. The “lean startup” movement, spawned from the entrepreneurial culture of Silicon Valley, argues that at the outset, all an entrepreneur has is a set of untested hypotheses. The entrepreneur’s goal should be to produce the minimum viable product needed to test those hypotheses with real customers. The business model then changes on the basis of customer feedback: The company pivots to a different strategy if the feedback proves the current one fundamentally flawed, or if a better opportunity presents itself. For example, PayPal started out as a way to process payments between Palm Pilot users. But cofounder Peter Thiel saw a bigger opportunity in partnering with eBay (which eventually acquired PayPal for $1.5 billion). While still supporting eBay transactions, PayPal positioned itself as a broader payment-processing business with major growth in mobile, a technology not imagined at the company’s founding.
The lean startup movement has also taken root among social enterprises seeking young management talent. Consider the annual competition for the Hult Prize, initiated in 2010 by the Hult International Business School, which has campuses in Boston, Dubai, London, San Francisco, and Shanghai. The first competition challenged more than 300 business school students to develop business models in support of the “One Laptop per Child” nonprofit. The 2014 Hult Prize sought business plans for social enterprises to reduce chronic illnesses among the urban poor worldwide. It attracted more than 10,000 applicants who competed in teams for six regional prizes of $50,000, and a $1 million grand prize of seed funding for the winning proposed social enterprise.
The millennials have grown up in the earliest days of the second machine age. Although they are aware of the massive quantity of information now available, they understand that new business models aren’t discovered in a historical pool of big data but are instead invented through a process of management that starts with hypotheses, which are then tested with data. Big data will allow them to test far more hypotheses, far more cheaply. But neither the data nor the machines that collect it will, on their own, create the innovative business models of the future, especially those that seek to balance commercial and social goals.
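As a minimal sketch of what “testing a hypothesis with data” can look like in practice, the snippet below runs a hypothetical A/B test comparing conversion rates for two versions of an offer; the function, the figures, and the scenario are illustrative assumptions, not drawn from the article or from any company mentioned in it.

```python
from math import sqrt, erf

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test of the null hypothesis that two offers convert equally well."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)   # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # normal-approximation p-value
    return z, p_value

# Hypothetical data: 120 of 2,400 visitors bought under the current offer,
# 168 of 2,400 under a proposed alternative.
z, p = two_proportion_z_test(120, 2400, 168, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value corroborates, but never proves, the hypothesis
```

Such a test can tell a team whether an observed difference is likely to be more than noise; it cannot generate the alternative offer in the first place, which is precisely the point about hypotheses.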
Most of my students possess a broad world view and exude both creativity and passion. Having taught these bright minds over the past decade, I have developed both hope for and faith in their future as managers. And I believe they will embrace this sentiment: “The best way to predict the future is to create it.”
Reprint No. 00252
Author profile:
- Tim Laseter is a senior executive advisor for Strategy& and a professor of practice at the University of Virginia’s Darden School. The author or coauthor of four books, including The Portable MBA (Wiley, 2010) and Strategic Product Creation (McGraw-Hill, 2007), he draws upon decades of experience in business strategy and academia to serve as a contributing editor of strategy+business.