David Brooks has a good column today on the Deepwater Horizon disaster that sums up a significant problem my last post touched on: modern life is made possible by complicated technological-bureaucratic systems, and these systems can go south quickly and surprisingly. Part of the problem is that they're complex and not managed well. That's par for the course. But the trickier thing is our collective expectations: we (and often the people running these systems) expect them to just work, and those expectations are badly miscalibrated:
Over the past years, we have seen smart people at Fannie Mae, Lehman Brothers, NASA and the C.I.A. make similarly catastrophic risk assessments. As [Malcolm] Gladwell wrote in that 1996 essay, “We have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life.”
So it seems important, in the months ahead, to not only focus on mechanical ways to make drilling safer, but also more broadly on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture — to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest.
This is about right. But not exactly.