Published: January 19, 2011 / Spring 2011 / Issue 62

A Corporate Climate of Mutual Help

One electric utility company I studied, Alpha Power — I can’t reveal its real name — was under pressure from regulators to improve its environmental record. Management told employees, “Every oil spill on every sidewalk must be reported immediately and cleaned up.” A lot of electrical workers said, “That’s not me. I’m not a janitor. I splice big, heavy cables.” Alpha responded that this was an order, not an option, and that workers would be trained in cleaning up spills safely.

Some electrical workers quit, but most were retrained. After about five years, the workers were asked, “How do you feel about Alpha’s environmental policies?” They answered, “It’s the right thing to do. We should be cleaning up the environment.” That wasn’t what they’d said five years earlier. But once they embraced the behavior, the values caught up.

S+B: Cultural change can’t be as easy as just demanding that people change their behaviors. What if people only pretend to comply?
SCHEIN: That’s why the role of management is so critical. Culture is multifaceted, and every company has many subcultures. At the top, there might be an executive subculture, trained in finance, which wants good numbers above all else. There’s also probably an engineering subculture, which assumes that crises can be prevented only with fail-safe, redundant systems that kick in automatically. There are other subcultures for middle management, supervisors, the union, and marketing. Every company combines those subcultures in very different ways that have become ingrained over decades. In any change program, when you encounter resistance, you then have to ask, “Is this just an individual resisting, or are group norms at play, based in a particular subculture?”

For example, when a transformer exploded at Alpha, tests showed high levels of airborne PCBs [polychlorinated biphenyls], a dangerous chemical to which firefighters, health workers, and others in the community were exposed. Alpha was criticized for not revealing these high PCB levels upon first discovering them.

But an engineer had tested that transformer every year for 20 years and never found PCBs. When the explosion occurred, he didn’t immediately believe the data. As an engineer, he wanted to be certain that the data was accurate, since it ran counter to his decades of testing. So he waited for the samples to come back from the lab before he said anything. It turned out that there were PCBs in the transformer’s sealed sound protection struts. The PCBs were released during the explosion but would have been undetectable otherwise.

Alpha may have looked malfeasant — failing to rapidly report an environmentally dangerous situation that the company clearly knew about — but in reality the delay was a consequence of a strong engineering subculture. The company could have punished the engineer, but that wouldn’t have changed the culture. If Alpha wanted to be safe and environmentally responsible, it had to demand behavioral change. So the company sent out a strong, coercive message that spoke to the heart of the engineering subculture: “You are required to report any observed environmental event or unsafe situation immediately, before analyzing it. Sound the alarm the minute you think there is a dangerous situation. Don’t wait until you’ve figured it out.” Alpha’s leaders got very specific about what environmental responsibility and safety meant in terms of the behavior they expected from their employees.

Resistance might also come from other subcultures, including that of the top executives. In some crises, like the Challenger space shuttle disaster in 1986, I’ve heard people argue that the engineers weren’t competent. In fact, they raised concerns about the O-ring at least twice in advance, but they were overruled and stopped squawking. Should they have held their ground? Marc Gerstein, who wrote Flirting with Disaster: Why Accidents Are Rarely Accidental [with Michael Ellsberg; Union Square Press, 2008], argues that the fault lay not with the engineers but with the NASA culture. People need to be able to raise concerns, and persist in raising them, in a way that results-driven cultures like NASA’s can accept.