Planning for the Unexpected
Typical crisis planning focuses on specific potential shocks. But how do you prepare for an unforeseen “asymmetric” threat — one that comes out of nowhere, with no rule book to follow?
If you’re a government or business leader, you probably spend much of your time trying to anticipate threats and preparing for them. (In a 2018 PwC survey of chief executives on crisis management, 73 percent of respondents said that they expect to be hit by at least one crisis in the next three years.) You know that heading off a crisis is less difficult and expensive than trying to fix the damage after the fact. Yet for all the time and money spent on crisis prevention, companies and communities are still regularly blindsided by terrible events that somehow slip through the cracks.
Consider, for example, factors that affected the preparedness for two well-known, relatively recent disasters:
• When Hurricane Maria, the most powerful storm to hit Puerto Rico in nearly a century, made landfall on September 20, 2017, it was known that rescue and recovery would be difficult. But the statistically improbable circumstances of the storm season exacerbated the crisis. Within the previous month, Hurricane Harvey had struck Texas and Louisiana, and Irma had ravaged Florida. This left rescue and recovery agencies overburdened and exhausted. Relief efforts lacked the staff, supplies, private contributions, and administrative attention they might otherwise have had.
• The October 1, 2017, massacre in Las Vegas circumvented the conventional security measures for large outdoor concerts. Nobody planned for a scenario in which a heavily armed shooter would begin firing from the window of an adjacent high-rise hotel. The attack resulted in more than 900 casualties, including 58 deaths and 422 people wounded by gunfire, making it the deadliest mass shooting by an individual in U.S. history.
In hindsight, the causal logic of a crisis — how one occurrence led to another, and then another — is often clear. That does not mean it could have been easily predicted, let alone prevented. The combination of factors at play gives the event a high level of asymmetry: the seemingly low probability of the event, set against both the high cost of preparing for it and the immense cost and destruction if it occurs. (The military term asymmetric threat, which refers to attacks by small groups on large countries, is just one type of asymmetric event described in this article.)
Most catastrophes have some asymmetric aspect. The devastation rarely happens the way people think it will: It isn’t possible to keep track of all the factors or anticipate how they will combine. With destructive wildfires, for example, factors such as wind velocity, the materials of the underbrush, and the temperature at night interact in ways that physicists don’t fully understand, and that make the blaze unpredictable. In other natural disasters, escape routes can be unexpectedly blocked — as they were during the eruption of the Kilauea volcano in Hawaii in May 2018. Even in deliberately generated threats, such as some of the cyber-attacks of early 2018, there are surprises. Millions of home Wi-Fi routers in the U.S. and Europe were targeted, a move that wasn’t widely anticipated.
This unexpectedness holds even for major geopolitical crises, including wars. As of late 2018, for example, many major corporations undoubtedly have contingency plans in place for war on the Korean peninsula, or for more open trade war between China and the United States. They know events like these are unlikely, but if they occur, they could have global impacts on a scale not seen since the 1940s — disrupting trillions of dollars of trade, shutting down sea lanes, causing massive casualties, and displacing millions.
Unfortunately, most contingency planning is predicated on recognized signals of pending conflict: breakdowns in diplomacy, the mass movement of troops, the deployment of naval assets, and so on. All of these involve substantial lead times. The plans are based on having time to prepare. But a real war, on the Korean peninsula or elsewhere, could start accidentally and suddenly, triggered by some small factor, in an asymmetric way. Global businesses, with their multifaceted international value chains, might be among those affected first by any resulting conflagration.
Threat Ecosystems and Meta-Readiness
Call them black swan events, surprises, blind spots, or asymmetric threats. The challenge for organizations is to find a strategic way to mitigate these often lethal, always unpredictable risks without “boiling the ocean” with multiple analyses, wasting their money trying to prevent a wide variety of potential crises, or having multiple departments develop separate crisis-prevention functions.
Even though the precise time, place, form, and effects of these events can’t be foreseen, such events can be better prepared for. This requires a fundamental shift in mindset: from a focus on battling specific threats to a threat-agnostic approach. When you take this approach, you focus on what might be called meta-readiness: preparing your own innate ability to handle any type of crisis that emerges. You develop the ability to judge when a crisis is imminent; to respond to it swiftly and effectively; to receive and send critical information at the speed of business relevance; and to take whatever actions are required in the moment — with flexibility and the kind of organizational muscle memory that comes from multiple rehearsals.
In short, you build up your capabilities to manage the chaos that follows any large-scale upheaval.
There are at least four broad classes of asymmetric threats:
- Unprotected infrastructure
- Vulnerable technology
- Underestimated disasters
- Innovative geopolitical attacks
Each of these threat ecosystems has its own way of surprising people. Each can be spurred, through a combination of factors, to spin off a sudden tornado-like disaster. The best stance is to recognize their dynamics, then look closely at your company’s missing capabilities and the critical path for closing those gaps.
1. Unprotected infrastructure. A natural disaster, a terrorist attack, or simply prolonged deterioration can degrade the operation and efficiency of embedded large-scale infrastructure — including public systems for transportation and power, and private systems such as commercial ports and financial exchanges. Also vulnerable are many of the legal measures and regulations put in place to protect these infrastructures. These systems, upon which society depends, have been shown to be vulnerable to such a complex combination of factors that organizations and governments struggle to identify them. Given limited resources, businesses prepare and governments regulate for the risks that they can see. But those are often not the right ones to focus on.
Consider the artery of a nation: the U.S. power grid — a highly complex, highly decentralized enterprise that is vulnerable to attack. More than 3,300 utilities deliver power from natural gas, coal, nuclear, hydroelectric, and other power plants through 200,000 miles of high-voltage transmission lines; more than 5 million miles of distribution lines bring power to hundreds of millions of U.S. homes and businesses. A structural breakdown or a coordinated attack — which, according to the U.S. Department of Homeland Security, Russian cyber actors may already be positioned to accomplish — could cause power losses across large portions of the United States lasting from hours to weeks. In Puerto Rico, Hurricane Maria left residents without power for as long as 10 months. Weather-related outages, coupled with aging infrastructure, are estimated to have cost the U.S. economy an inflation-adjusted $18 billion to $33 billion per year between 2003 and 2012.
Beyond the substantial direct economic costs of a large-scale failure in the grid, the damage could cascade to other critical nodes of a region’s infrastructure. Without easy access to electric power, nearly everything would sooner or later be disrupted: banking, the Internet, the stock market, the water supply, the food supply, sewage, roads, hospitals, military operations, fuel, and air traffic control.
The catastrophic earthquake that struck Haiti in January 2010, for example, damaged roads and buildings essential for escape and recovery, including the City Hall of Port-au-Prince, the Presidential Palace, the National Assembly building, and the headquarters of the United Nations Mission in Haiti. It also destroyed or severely damaged virtually all vital infrastructure — including telephone and other communication systems, hospitals and other medical facilities, and airports and seaports — essential to response and recovery. And because the quake struck during the afternoon, when business, government, and NGO offices were open, many experienced professional crisis managers were killed. These factors, which were not anticipated, gravely compounded the effects of the earthquake.
2. Vulnerable technology. Society’s dependence on the Internet (and, increasingly, on connected devices in the Internet of Things) makes it exceedingly vulnerable to asymmetric threats. Paradoxically, the Internet itself evolved out of an attempt to forestall an asymmetric threat. The original Internet, called the “ARPANET,” after the U.S. Defense Department’s Advanced Research Projects Agency (ARPA), was an effort to make critical communications and control technology invulnerable to nuclear attack through distributed networking, or “packet switching.” In fits and starts, the ARPANET extended its reach, until ultimately it became open to all, including those who see advantage in selectively undermining it.
As companies surrender more and more of their operations (and even agency) to automated systems, they tend to expect that the technology will always work as designed. This expectation becomes an Achilles’ heel — a vulnerability that will continue to be exploited by individual hackers and, increasingly, by sophisticated nation-states. And the odds are in their favor: With most digital technology, sabotage costs far less, and has a far greater ROI, than prevention. Even low-tech, low-resource mechanisms such as off-the-shelf ransomware acquired on the Dark Web can disrupt the basic technology that undergirds virtually every institution’s operations.
The corrosive effect of repeated cyber theft and identity theft leaches down to the consumer level as well; it becomes a constant, unwelcome fact of life. A 2017 PwC consumer survey on cybersecurity reveals that only 10 percent of consumers feel they have complete control over their personal information, and that consumer trust in both businesses and the government with respect to protection of their personal data is fading. Businesses, meanwhile, are struggling to find the right balance between missing (or not disclosing) breaches and sending too many false positive warnings to consumers. The sheer scale and frequency of these kinds of threats are so potentially terrifying that individuals (and organizations) can become numb to them — a passive attitude that further wears down their resistance when a breach happens.
Other technologies are also vulnerable. Breakdowns in autonomous vehicles, in dams and water management systems, and in health-related technologies are far more manageable when failure is anticipated. When technological failure is treated as unacceptable or impossible, the threat becomes more serious.
3. Underestimated disasters. Sometimes the worst-case scenario actually happens. And, typically, human beings are biased against foreseeing and preventing it, because of economic concerns, liability issues, a lack of long-term memory, or simply denial or rationalization.
The most common case in this category is a potential threat that leaders are aware of — for example, a natural disaster such as a hurricane, flood, drought, earthquake, tsunami, or wildfire — whose damage is easy to underestimate. At the same time, leaders may overestimate their capacity to handle it. This perception, rather than the threat itself, poses the greatest risk. This category, however, does offer a great opportunity for learning, preparedness, and future mitigation, on the part of both the public and private sectors.
When compounded by the first two asymmetric threats — unprotected infrastructure and vulnerable technology — underestimated disasters can expand to inconceivable magnitude.
In March 2011, when the magnitude 9.0 Tōhoku earthquake and tsunami struck near Fukushima, Japan, it was seen at first as manageable. But there were other factors at play at the Daiichi nuclear power plant nearby. These included litigation fears and cultural barriers to candor, which made it difficult to communicate clear information about, for example, evacuation routes; a rigid hierarchical structure, starting at the prime minister’s office, which constrained decision making; insufficient safety guidelines and response preparations; and technological systems, including the diesel generator–based backup cooling system, that failed to operate. Another human error: a tendency to “fight the last war” — in this case, the 1995 Kobe earthquake, which at magnitude 7.2 was far less powerful.
These factors, along with the original decision to build the plant in an unsuitable location, contributed to a crisis of epic proportions — “a profoundly manmade disaster that could and should have been foreseen and prevented,” in the words of the chairman of the Fukushima Nuclear Accident Independent Investigation Commission. If there is one useful way to see the Fukushima tragedy, it is as a cautionary tale, spotlighting as few others could the fundamental lessons of good crisis management.
4. Innovative geopolitical attacks. In discussing cyber-attacks, it is important to distinguish between two variants. The first is cyber theft: crimes, including the taking of identity information and intellectual property (IP), that are intended to achieve economic gain or competitive advantage. The other kind is even more pernicious: cyber sabotage, generally committed in order to destroy the electronic infrastructure on which digital society depends. Either variant can be committed by independent hackers, terrorist groups, or governments, taking advantage of the shaky reliability of digital technology. The asymmetric nature of these threats reflects the continuous technical R&D that goes into them, as well as the unpredictable nature of their impact and the response to them.
Russian President Vladimir Putin famously stated that whichever nation leads in artificial intelligence will rule the world, and that wars of the future might be determined by drone, rather than weapon, supremacy. Although it would be an exaggeration to suggest that digital technology will completely supplant weaponry as a determining factor in victory, wars are already being won with lines of code rather than bullets. Cyber-threats will increasingly be designed as part of coordinated military action.
Like other forms of espionage, state-based cyber-attacks are intended to grow a country’s overall capacity to influence events in other countries. An attack against a country’s power grid, for instance, can distract the attention of its government and degrade or delay its response. The 2015 hacking of the U.S. Office of Personnel Management, an “advanced persistent threat” most likely carried out by a nation-state, gathered highly sensitive personal information — including security clearance forms, Social Security numbers, and, in some cases, fingerprints — of more than 20 million past and present government employees, contractors, family members, and even credentialed members of the press.
Similarly, industrial espionage today is often conducted by state actors with the power to penetrate firewalls undetected, and steal “crown jewel” IP. If they do so, they enable their countries’ own enterprises to skip one or several generations of R&D. By the time the original organization realizes what has been lost, it is often too late.
This means that businesses counting on a technological edge should assume that edge could disappear faster than expected. It also means that any company considering an acquisition today should investigate the Dark Web for signs of chatter about stolen IP as part of its due diligence.
Adopting a Meta-Readiness Approach
We’ve established that asymmetric threats are unpredictable and prohibitively costly to protect against. Nobody has the resources to prepare for every eventuality, and, as the military aphorism puts it, no plan survives first contact with the enemy.
But that doesn’t mean you are helpless. It’s less important to develop the precise response to every particular threat than it is to establish your own broad categories of threat response. Adopting a threat-agnostic approach, in fact, is key. That means working to reduce potential adverse outcomes to a negligible level by careful planning, stress-testing, and red-teaming (hiring an independent group of attackers to test your defenses).
Below are some guidelines to follow.
First, acknowledge that one day you are likely to encounter an existential threat that you didn’t expect, and set up an ethic of continuous crisis-response improvement. Put in place a meta-readiness plan in which your overall readiness — not any particular crisis readiness — is the goal. Be sure your CEO (who will bear ultimate responsibility for the management and outcome of the crisis) drives it.
Second, look at the universe of risk through the lens of the four broad threat categories laid out in this article. Create one plan for each of these broad categories, rather than trying to parse out every specific threat within them.
Third, for each larger threat category, assess your current crisis-response capability. How buttoned up are your processes, procedures, contact lists, and phone numbers? Who will have what authority to make the crucial decisions needed on the ground? What resources will you need, where are they, and how quickly and safely can they be deployed? Are your systems of data analytics and desktop architecture up to snuff? What lessons were learned from past experience? Codify them if possible.
Many organizations quickly discover they have gaps at this stage. Fill in those gaps, adding whatever tools or technologies are best suited today. Then hold wargames simulating an unexpected crisis (ideally one designed from the point of view of a hostile actor, with no advance warning to the responding team). Who makes the key decisions at the moment of greatest stress? After the drill, reassess the gaps, take stock of the resources you have and need, evaluate how well your team and tools performed, and redefine roles and responsibilities to handle the situation better. Then set up another wargame to rehearse and test your new approach; one simple way to track what each drill surfaces is sketched below.
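To make the "codify them if possible" step concrete, here is a minimal sketch (our illustration, not a prescribed implementation) of one way to record readiness gaps surfaced by a wargame so they can be tracked from one drill to the next. The ReadinessGap and Wargame structures, the field names, and the sample entries are all hypothetical.

```python
"""Hypothetical sketch: codifying wargame findings as trackable records."""

from dataclasses import dataclass, field
from datetime import date
from typing import List

# The four broad threat categories described in the article.
CATEGORIES = [
    "unprotected_infrastructure",
    "vulnerable_technology",
    "underestimated_disasters",
    "innovative_geopolitical_attacks",
]


@dataclass
class ReadinessGap:
    """One capability gap surfaced during a drill or assessment."""
    category: str        # one of CATEGORIES
    description: str     # what was missing or failed under stress
    owner: str           # who is accountable for closing the gap
    identified_on: date
    closed: bool = False


@dataclass
class Wargame:
    """A single no-notice exercise and the gaps it exposed."""
    scenario: str
    held_on: date
    gaps: List[ReadinessGap] = field(default_factory=list)


def open_gaps(games: List[Wargame]) -> List[ReadinessGap]:
    """Return every gap not yet closed, as input to the next drill."""
    return [g for game in games for g in game.gaps if not g.closed]


if __name__ == "__main__":
    drill = Wargame(
        scenario="Regional grid outage plus ransomware on backup systems",
        held_on=date(2018, 11, 1),
        gaps=[
            ReadinessGap(
                category="vulnerable_technology",
                description="No offline copy of the crisis contact list",
                owner="IT continuity lead",
                identified_on=date(2018, 11, 1),
            )
        ],
    )
    for gap in open_gaps([drill]):
        print(f"[{gap.category}] {gap.description} -> owner: {gap.owner}")
```

Whatever form the record takes, the point is that lessons from each drill persist beyond the people in the room and feed directly into the next rehearsal.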
Fourth, build up your business intelligence capability. Business intel can not only serve as an early warning radar — a wider look at threat or enforcement trends in your industry or region — it can also help you optimize your existing resources to meet the threat head-on. Both structured and unstructured data are available (within legal constraints). And here’s another reason to increase this capability: Regulators are already mining the same data. So having a mechanism to “tune” your data is as essential to compliance as it is to mitigating some of the asymmetric threats we have described here.
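As a simple illustration of what such an early warning radar might look like at its most basic, the following sketch scans incoming text items (news feeds, regulatory notices, internal reports) for signals tied to the four threat categories. The watchlists, the scan function, and the sample feed are hypothetical; a real business intelligence capability would draw on far richer sources and analytics, within legal constraints.

```python
"""Hypothetical sketch: a keyword-based early warning scan over incoming text."""

from collections import Counter
from typing import Dict, Iterable, List

# Hypothetical watchlists keyed to the article's four threat categories.
WATCHLISTS: Dict[str, List[str]] = {
    "unprotected_infrastructure": ["grid outage", "port closure", "substation"],
    "vulnerable_technology": ["ransomware", "zero-day", "data breach"],
    "underestimated_disasters": ["hurricane", "wildfire", "earthquake"],
    "innovative_geopolitical_attacks": ["sanctions", "state-sponsored", "espionage"],
}


def scan(items: Iterable[str]) -> Counter:
    """Count how many incoming items mention each threat category."""
    hits: Counter = Counter()
    for text in items:
        lowered = text.lower()
        for category, keywords in WATCHLISTS.items():
            if any(keyword in lowered for keyword in keywords):
                hits[category] += 1
    return hits


if __name__ == "__main__":
    sample_feed = [
        "Regulator warns of state-sponsored espionage targeting suppliers",
        "Ransomware campaign hits logistics providers in two regions",
        "Hurricane watch issued for coastal manufacturing hub",
    ]
    for category, count in scan(sample_feed).most_common():
        print(f"{category}: {count} recent signal(s)")
```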
Fifth, look at the way your own internal practices contribute to (or create) asymmetric threats. Consider the growing risk of enforcement actions due to breakdowns in antibribery or other fraud-fighting controls. Many marketing departments gather and analyze voluminous amounts of data using advanced technology — but don’t necessarily cross-link that data with, say, internal controls or compliance. Other companies implement aggressive internal sales incentives, but fail to design controls to check for abuses.
Finally, build your ongoing capability for managing asymmetric crises. Test again and again until the muscle memory for how to handle an unexpected crisis has been acquired by your full management team. Then regularly schedule further testing, so as to keep both team members and tools fresh.
Planning for the Unexpected
One of the benefits of adopting a threat-agnostic approach is that it champions flexibility and resilience over rigid protocols that may not survive the crisis (as the Fukushima example illustrates). Your planning process itself will serve you well under fire, because it forces the organization to think about how to deal with the unexpected rather than the expected. The process (and the capacity to respond to new information as it comes in) matters more than the piece of paper that a plan produces.
We believe that the management of a company has a fiduciary responsibility to build its meta-readiness and manage asymmetric threats as a key aspect of its overall approach to managing risk.
The reputation with which you emerge from a crisis will depend more on how seriously you anticipated the possibility of a disaster and how well you responded than it will on how well you predicted the precise threat. And even if it feels as though you are living in constant uncertainty, unable to predict your future, there is one certainty you can hold on to: Adopting the right stance and set of processes can almost guarantee you a better outcome. All things considered, those are good odds.
Author profiles:
- Paul Wolfowitz is a visiting scholar at the American Enterprise Institute and chairman of the U.S.–Taiwan Business Council. He served as the 10th president of the World Bank, the U.S. ambassador to Indonesia, the U.S. deputy secretary of defense, and the dean of the Paul H. Nitze School of Advanced International Studies at Johns Hopkins University.
- Kristin Rivera is a partner with PwC US, based in San Francisco. She leads PwC’s global forensics team of more than 3,400 forensics specialists worldwide.
- Glenn Ware is a principal with PwC US, based in Washington, D.C. He leads the firm’s global anti-corruption, intelligence, and threat practice.
- Also contributing to this article was Dyan Decker, a principal with PwC US and U.S. forensics leader.