[Image: A controlled burn of oil in the Gulf of Mexico, May 19]

David Brooks has a good column today on the Deepwater Horizon disaster that sums up a significant problem my last post touched on: modern life is made possible by various complicated technological-bureaucratic systems, and these systems can go south quickly and surprisingly. Part of the problem is that they’re complex and not managed well. That’s par for the course. But the trickier part is our collective expectations: we (and often the people running these systems) expect them to just work, and those expectations are way wrong:

Over the past years, we have seen smart people at Fannie Mae, Lehman Brothers, NASA and the C.I.A. make similarly catastrophic risk assessments. As [Malcolm] Gladwell wrote in that 1996 essay, “We have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life.”

So it seems important, in the months ahead, to not only focus on mechanical ways to make drilling safer, but also more broadly on helping people deal with potentially catastrophic complexity. There must be ways to improve the choice architecture — to help people guard against risk creep, false security, groupthink, the good-news bias and all the rest.

This is about right. But not exactly.

It’s been a scant few weeks since the story about unintended acceleration in various Toyota models reached its apogee. Already it’s gone through a furious, though predictable, media arc – shocking revelations, public fear, congressional hearings, expressions of outrage, abject apologies from the company CEO, debates about damage to the Toyota brand, and even an alarming – though unresolved and possibly faked – acceleration incident while all this was happening.

My question is, WTF just happened? Because the statistics tell us that, essentially, nothing did.

Six million cars have been recalled, and the reports of Toyotas experiencing sudden, uncontrolled acceleration number in the dozens. Robert Wright, who drives a Toyota Highlander, did the math and concluded that there isn’t much to worry about. You’re far more likely to be killed in an ordinary car accident than by the acceleration defect:

My back-of-the-envelope calculations (explained in a footnote below) suggest that if you drive one of the Toyotas recalled for acceleration problems and don’t bother to comply with the recall, your chances of being involved in a fatal accident over the next two years because of the unfixed problem are a bit worse than one in a million — 2.8 in a million, to be more exact. Meanwhile, your chances of being killed in a car accident during the next two years just by virtue of being an American are one in 5,244.
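To make the comparison concrete, here is a quick sketch of the arithmetic using only the two probabilities Wright quotes (his underlying inputs aren’t reproduced here, so this simply restates his bottom line as a ratio):

```python
# Back-of-the-envelope comparison of the two probabilities quoted above.
p_defect_fatality = 2.8 / 1_000_000  # fatal accident from the unfixed defect, next two years
p_any_fatality = 1 / 5_244           # any fatal car accident, next two years, average American

ratio = p_any_fatality / p_defect_fatality
print(f"Ordinary driving is roughly {ratio:.0f} times more likely to kill you "
      f"than the unfixed acceleration defect.")  # roughly 68 times
```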

This doesn’t mean nothing is wrong.

James G. Rickards has an interesting op-ed in today’s Washington Post on the role of financial risk modeling in the banking crisis. This is an important topic on its own, but also has sweeping implications for environmental policy, disaster preparation, and other issues. As he explains it, Wall Street’s financial risk models (known as “value at risk”) aggregate the day-to-day risks of various securities:

What’s left is “net” risk that is then considered in light of historical patterns. The model predicts with 99 percent probability that institutions cannot lose more than a certain amount of money. Institutions compare this “worst case” with their actual capital and, if the amount of capital is greater, sleep soundly at night. Regulators, knowing that the institutions used these models, also slept soundly. As long as capital was greater than the value at risk, institutions were considered sound — and there was no need for hands-on regulation.
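The mechanics are easy to see in a toy version (a minimal sketch with made-up numbers, not any bank’s actual model): take a history of daily portfolio returns, find the loss that is only exceeded one day in a hundred, and call that the worst case.

```python
# Minimal sketch of historical value at risk, using synthetic data.
import numpy as np

rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0, 0.01, size=1_000)  # stand-in for a history of net daily returns
portfolio_value = 1_000_000_000                    # hypothetical $1B book

# 99% one-day VaR: the 1st-percentile daily return, expressed as a dollar loss.
var_99 = -np.percentile(daily_returns, 1) * portfolio_value
print(f"One-day 99% VaR: ${var_99:,.0f}")

# The model's promise: on 99 days out of 100, the loss stays below var_99.
# If capital exceeds that number, the institution "sleeps soundly," which is
# exactly the assumption Rickards goes on to criticize.
```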

But there’s a forest-for-the-trees problem here. Aggregating individual risks is fine in a relatively stable system. But what if the system itself becomes unstable? The risk model will not anticipate it. It’s a familiar problem from complexity science:

Think of a mountainside full of snow. A snowflake falls, an avalanche begins and a village is buried. What caused the catastrophe? The value-at-risk crowd focuses on each snowflake and resulting cause and effect. The complexity theorist studies the mountain. The arrangement of snow is a good example of a highly complex set of interdependent relationships; so complex it is impossible to model. If one snowflake did not set off the avalanche, the next one could, or the one after that. But it’s not about the snowflakes; it’s about the instability of the system. This is why ski patrols throw dynamite down the slopes each day before skiers arrive. They are “regulating” the system so that it does not become unstable.

The more enlightened among the value-at-risk practitioners understand that extreme events occur more frequently than their models predict. So they embellish their models with “fat tails” (upward bends on the wings of the bell curve) and model these tails on historical extremes such as the post-Sept. 11 market reaction. But complex systems are not confined to historical experience. Events of any size are possible, and limited only by the scale of the system itself. Since we have scaled the system to unprecedented size, we should expect catastrophes of unprecedented size as well. We’re in the middle of one such catastrophe, and complexity theory says it will get much worse.
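The fat-tails point is easy to illustrate numerically (this is my own synthetic sketch, not Rickards’ analysis, and it only shows the model-misspecification half of his argument, not the complexity-theory half): if returns actually come from a fatter-tailed distribution than the model assumes, the normal-based 99 percent “worst case” is breached more often than promised, and the single worst day is one the model treats as effectively impossible.

```python
# Synthetic illustration of fat tails vs. a normal-distribution risk model.
import numpy as np

rng = np.random.default_rng(1)

# "Real" world: fat-tailed Student-t returns (df=3), rescaled to a 1% daily std.
df = 3
returns = rng.standard_t(df, size=100_000) * 0.01 / np.sqrt(df / (df - 2))

# Risk model: a normal distribution fit to the same data.
mu, sigma = returns.mean(), returns.std()
z_01 = -2.326                    # 1st percentile of a standard normal
var_99 = -(mu + z_01 * sigma)    # the model's 99% "worst case" daily loss

breach_rate = np.mean(-returns > var_99)
worst_day = abs(returns.min() - mu) / sigma

print(f"promised breach rate: 1.0%   observed: {breach_rate:.1%}")
print(f"worst simulated day: {worst_day:.0f} sigma below the mean; "
      f"the normal model treats anything beyond ~5 sigma as essentially impossible")
```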

This problem – a reliance on computer models that cannot accurately anticipate catastrophe – isn’t restricted to finance. Our society bases much of its policy and financial decision-making on such “hard” forecasting numbers. Insurance companies use risk models to gauge likely hurricane losses. Less sophisticated, but still highly trusted, models guided the design of the pre-Katrina New Orleans levees, and they put the likelihood of a Katrina-sized storm surge at something comfortably small. Often these models are built on datasets with short histories, assembled in times of relative stability. But the capacity for a larger, systemic event to sweep in is always there; the fact that it is very hard to quantify doesn’t mean it isn’t real.
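One way to see the short-history problem (a back-of-the-envelope sketch of my own, not from the levee studies): an event with a 1-in-100 chance per year will usually be absent from a 20-year record, so a model calibrated on that record can look reassuringly calm.

```python
# Chance that a short record contains the rare event you most need to plan for.
record_years = 20
annual_prob = 1 / 100   # a "100-year" storm surge, flood, or crash

p_seen = 1 - (1 - annual_prob) ** record_years
print(f"Chance a {record_years}-year record contains a 100-year event: {p_seen:.0%}")  # ~18%
# Roughly four times out of five, the calibration data has never seen it.
```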

The problem now is that the world – its financial, energy, and food systems, and the physical environment itself – is changing rapidly. That means more unexpected events: floods, droughts, shortages, gluts, crashes. Government and private institutions need to recognize that the future won’t look like the past, acknowledge the limits of their current numbers, and prepare accordingly.

Also: Here’s Nassim Taleb’s take on the same topic.
