Friday, May 28, 2010

“...high-tech catastrophe is embedded in the fabric of day-to-day life”

Human-designed systems are becoming increasingly complex, yet they are downright amateurish when compared to the complexity of Nature's incredibly integrated design linking and locking the biosphere, atmosphere, hydrosphere and lithosphere in an intricate regenerative dance.

Given this, shouldn't systems thinking be part of core curricula from elementary school through college? Unfortunately, as Bucky Fuller pointed out time and again, our educational system is more concerned with telling children what to think than with teaching them how to think. Or as ecological geneticist Wes Jackson put it, we reward cleverness over wisdom. (GW)

Drilling for Certainty

By David Brooks
New York Times
May 27, 2010

In the weeks since the Deepwater Horizon explosion, the political debate has fallen into predictably partisan and often puerile categories. Conservatives say this is Obama’s Katrina. Liberals say the spill is proof the government should have more control over industry.

But the real issue has to do with risk assessment. It has to do with the bloody crossroads where complex technical systems meet human psychology.

Over the past decades, we’ve come to depend on an ever-expanding array of intricate high-tech systems. These hardware and software systems are the guts of financial markets, energy exploration, space exploration, air travel, defense programs and modern production plants.

These systems, which allow us to live as well as we do, are too complex for any single person to understand. Yet every day, individuals are asked to monitor the health of these networks, weigh the risks of a system failure and take appropriate measures to reduce those risks.

If there is one thing we’ve learned, it is that humans are not great at measuring and responding to risk when placed in situations too complicated to understand.

In the first place, people have trouble imagining how small failings can combine to lead to catastrophic disasters. At the Three Mile Island nuclear facility, a series of small systems happened to fail at the same time. It was the interplay between these seemingly minor events that led to an unanticipated systemic crash.

Second, people have a tendency to get acclimated to risk. As the physicist Richard Feynman wrote in a report on the Challenger disaster, as years went by, NASA officials got used to living with small failures. If faulty O-rings didn’t produce a catastrophe last time, they probably won’t this time, they figured.

Feynman compared this to playing Russian roulette. Success in the last round is not a good predictor of success this time. Nonetheless, when things seem to be going well, people unconsciously adjust their definition of acceptable risk.

Third, people have a tendency to place elaborate faith in backup systems and safety devices. More pedestrians die in crosswalks than while jaywalking. That’s because they have a false sense of security in crosswalks and are less likely to look both ways.

On the Deepwater Horizon oil rig, a Transocean official apparently tried to close off a safety debate by reminding everybody the blowout preventer would save them if something went wrong. The illusion of the safety system encouraged the crew to behave in more reckless ways. As Malcolm Gladwell put it in a 1996 New Yorker essay, “Human beings have a seemingly fundamental tendency to compensate for lower risks in one area by taking greater risks in another.”

Fourth, people have a tendency to match complicated technical systems with complicated governing structures. The command structure on the Deepwater Horizon seems to have been completely muddled, with officials from BP, Transocean and Halliburton hopelessly tangled in confusing lines of authority and blurred definitions of who was ultimately responsible for what.

Fifth, people tend to spread good news and hide bad news. Everybody wants to be part of a project that comes in under budget and nobody wants to be responsible for the reverse. For decades, a steady stream of oil leaked out of a drill off the Guadalupe Dunes in California. A culture of silence settled upon all concerned, from front-line workers who didn’t want to lose their jobs to executives who didn’t want to hurt profits.

Finally, people in the same field begin to think alike, whether they are in oversight roles or not. The oil industry’s capture of the Minerals Management Service is actually a misleading example because that agency was so appallingly corrupt. Cognitive capture is more common and harder to detect.

In the weeks and hours leading up to the Deepwater Horizon disaster, engineers were compelled to make a series of decisions: what sort of well-casing to use; how long to circulate and when to remove the heavy drilling fluid or “mud” from the hole; how to interpret various tests. They were forced to make these decisions without any clear sense of the risks and in an environment that seems to have encouraged overconfidence.

Over the past years, we have seen smart people at Fannie Mae, Lehman Brothers, NASA and the C.I.A. make similarly catastrophic risk assessments. As Gladwell wrote in that 1996 essay, “We have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life.”

So it seems important, in the months ahead, to focus not only on mechanical ways to make drilling safer, but also, more broadly, on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture — to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest.

This isn’t just about oil. It’s a challenge for people living in an imponderably complex technical society.
