Genius Is a Team Effort
Michael Schrage reviews two notable books by veteran Silicon Valley journalists that seek to explain collaborative charisma.
The Intel Trinity: How Robert Noyce, Gordon Moore, and Andy Grove Built the World’s Most Important Company
by Michael S. Malone, Harper Business, 2014
Haunted Empire: Apple after Steve Jobs
by Yukari Iwatani Kane, Harper Business, 2014
As business clichés go, the charismatic entrepreneurial genius can’t be beat. This trope asserts that biography is destiny. Talent does what it can, but genius does what it must. From Edison, Ford, and Rockefeller to Jobs, Bezos, and Zuckerberg—genius changes the world.
But whose genius? In reality, the most innovative enterprises are seldom the lengthened shadow of one person. More often, they’re wrought from a collaborative genius that depends on complementary talents and character traits. Could Steve Jobs have become Steve Jobs without the design brilliance of Jony Ive and the supply chain savvy of Tim Cook? Would Bob Noyce, Gordon Moore, and Andy Grove be as celebrated if they had been solo acts?
Explaining collaborative charisma is a tougher rhetorical task than explaining the singular kind. But two notable new books by veteran Silicon Valley journalists, Michael S. Malone’s The Intel Trinity and Yukari Iwatani Kane’s Haunted Empire, take on this narrative challenge, albeit with radically different sensibilities.
In both books, the substance and illusion of charismatic genius are central to the story, but the essential tensions and conflicts emerge from each company’s dueling collaborators. For Malone, Intel’s founding trinity of Noyce, Moore, and Grove collaboratively created “the world’s most important company”—so labeled by the author for its ability to transform markets through technology better than any other enterprise.
For Kane, Apple’s prowess at innovation is in postmortem decay because the ghost of genius past doesn’t inspire creative collaboration; it simply spooks the living. Apple post–Steve Jobs might as well be Disney post–Walt Disney or Polaroid post–Edwin Land—the situation isn’t hopeless, but it has less hope. “We conducted the experiment,” Kane quotes Oracle founder and Jobs friend Larry Ellison saying. “I mean, it’s been done. We saw Apple with Steve Jobs. We saw Apple without Steve Jobs. We saw Apple with Steve Jobs.... Okay, I’ll say it publicly…they will not be nearly so successful because he’s gone.”
The reportage in both books is impressively comprehensive. But Malone delivers insight with a style and intensity that Kane never quite matches. The reasons are revealing: First, Malone seems to understand and respect his protagonists, their technologies, and their company in ways that Kane does not. So, where he sounds edgy, she sounds cynical. Second, and more importantly, Malone grasps how and why Intel’s superiority transcends its separate parts. Kane, by contrast, treats every slice of her Apple thoroughly, but individually. She gives a polite nod to the Jobs–Ive design ethos and solidly tells stories spanning Apple’s fraught Foxconn relationship and Tim Cook’s assiduously cultivated low profile, but it’s the reader who must put them together.
Malone’s The Intel Trinity aspires to be as definitive, entrepreneurial, energetic, technical, and, yes, brash as the company it chronicles. And it largely succeeds. You cannot read this sweeping and detailed history without coming away with a better grasp of what it takes to successfully achieve capital-intensive innovation on a global basis.
Intel becomes as much a character in the book as its leaders—a rhetorical feat that Kane, alas, can’t match. Malone renders Intel’s rigor, relentlessness, and ruthlessness exhaustively, and at times exhaustingly; there’s no soulless corporation here. This is an enterprise that is culturally committed to constantly reinventing itself and the silicon circuitry that digitally defines postmodern life.
Did Intel make mistakes? Yes, and they could be huge—technically, strategically, and organizationally. But Malone documents a collective leadership approach that, in the finest tradition of the software its products enable, has learned how to turn a bug into a feature. “That Intel has survived all of these decades and stayed at the top of the most complex and dynamic industry imaginable is testament to this remarkable ability—by a singular combination of ferocious competitiveness, uncompromising management, and seemingly bottomless reservoirs of employee commitment and energy—to not only recover from mistakes but capitalize on them,” he writes. “Intel, more than any [Silicon] Valley company before or since, had learned to learn.”
That’s not just cant. Malone revels in each new generation of silicon ingenuity and the market disruptions it brings. His book revolves around what Intel learns and, just as importantly, what it knows. Intel worships and invests in the exponential ideology of its cofounder Gordon Moore. Nearly 50 years on, the company has yet to violate Moore’s Law, and as Malone points out, it’s almost impossible to overstate the organizational impact and cultural influence of Intel’s faithful adherence to it.
Like the microprocessors they pioneered and made ever more sophisticated, Noyce, Moore, and Grove were and are multilayered and complex. And they are reasonably, nay, generously, rendered by Malone: Each is heroically ingenious and entrepreneurial; each is riven with flaws tempered and ameliorated by the other members of the trinity. Malone also captures their internal rivalries and resentments: Noyce hated interpersonal confrontation; Grove relished it and thought Noyce the lesser for ducking it; Moore mediated. And who, by the way, did Steve Jobs turn to for mentoring? Bob Noyce. (Jobs was never nonconfrontational, but one imagines that Noyce’s influence enhanced his appreciation of technical talent.)
These are fundamentally good people, and, as Malone implicitly observes, they’re even better technologists. Yet the Intel they built wasn’t quite an extension of themselves. “Were you to create an amalgam of Noyce, Moore, and Grove, you would not end up with the stereotypical Intel employee,” observes Malone. “Yet were you to parse those Intel employees’ traits, you would find all of them in the three founders—Andy’s combativeness and plainness, Gordon’s competence and confidence, and Bob’s competitiveness and vision. It’s a powerful combination, but not a complete one. In Intel’s stripped-down and polished soul, there is little of Grove’s wit, Moore’s humility, or Noyce’s maverick entrepreneurship—and that is a pity. It is as if Intel took those pieces from the Trinity it needed to win, not those it might have used to be its best.”
This distinction is vital. Entrepreneurial genius—individual or collective—is a constant, torturous struggle between “being your best” and winning. Greatness without triumph isn’t truly great, but winning without being the best you can be is also a failure on some level. In this, Kane and Malone are aligned: The craving for real, recognized, and undeniable greatness is as important to the successors of Intel’s trinity as it is to the successors of Steve Jobs. Nobel Prize–winning economist Herbert Simon’s notion of “satisficing” is anathema at both companies. The benchmark of true success at Intel and Apple is nothing less than setting today’s and tomorrow’s global standards. Leadership means your rivals mimic you.
Being indisputably the best and the winner gives rise to another excruciating leadership reality: Astonishing success invites an astonishing amount of litigation and regulation. Being an Intel or an Apple means that lawyers become partners and peers in your global struggle for greatness. And that means that rule of law and administrative caprice increasingly define and dominate top management discourse.
Take the lawsuits and legal hostilities out of The Intel Trinity and Haunted Empire and both books shrink by nearly half. Intel’s history is studded with intellectual property disputes, licensing deals gone awry, regulatory threats, and omnipresent investigations, and Apple’s litigious zeal is a narrative centerpiece of Kane’s book.
Apple’s legal wrangling with Samsung is practically a book within a book. As Kane tells it, Samsung had little shame in being as great a high-tech imitator as an innovator. But she misses what may be the salient point: Perhaps Samsung is even more ruthless and relentless than Apple. Can’t the conflict between greatness and winning be resolved by being great at winning? Maybe that is Samsung’s ethos.
The possibility that the Google Boys and then CEO Eric Schmidt betrayed Steve Jobs’s trust and confidences by launching their Android ecosystem also spawned years of lawsuits (finally resolved by a truce in May 2014). And, of course, Apple was enmeshed in litigation around its foray into e-books. From Apple’s pinnacle of global greatness, the view is clear: litigators and regulators as far as the iPad can see.
Of course, it’s impossible to know whether a Jobs-less Apple will come up with a killer product that will exorcise the demanding ghost of its founder. Similarly, the pace of technological disruption in silicon is such that tomorrow’s Intel may be no more dominant than today’s General Motors.
But the most compelling executive takeaway from these two books is not the transcendence of individual genius. It is the role of organizations as multipliers of genius. That is, the books show us how leaders can create environments and processes that consistently and tirelessly make the entrepreneurial whole qualitatively greater than the sum of its parts. Intel’s trinity built a company around that ethos; by contrast, Apple was a company built around Steve Jobs’s vision and highly constrained collaborative sensibility. And I wouldn’t bet against either company.
- Michael Schrage is a contributing editor of strategy+business and holds appointments at MIT’s Sloan School of Management and London’s Imperial College. He is the author of Who Do You Want Your Customers to Become? (Harvard Business Review Press, 2012) and The Innovator’s Hypothesis: How Cheap Experiments Are Worth More Than Good Ideas (MIT Press, 2014).