Tomorrow's Washington Post has an interesting article inside the front page that postulates why so many people in the Administration couldn't seem to anticipate the breach in the levees, the rise in gas prices, the Iraqi -insurgency- civil war.
Hindsight bias is when people make a judgment or choice and are later asked to recall that judgment. If, in the interim, they're told what the correct judgment would have been, their memory of their own judgment may become biased toward the new information. For instance, suppose a person was asked to estimate what share of the vote John McCain would get in the Michigan primaries. If, before the election, he estimated 30%, and then learned that the actual figure was 50%, he may later recall that his answer was 40%.
But it's more complex than just filling in the blanks of our recollection with new data. It's about the ability of humans to predict future uncertainties, which is what we ask of our intelligence apparatus as well as our political leaders.
Professor Nassim Nicholas Taleb has an interesting perspective on the concept of hindsight bias and our ability to predict the future. He defines events beyond the realm of normal expectations as "black swans."
"Much of what happens in history comes from 'Black Swan dynamics', very large, sudden, and totally unpredictable 'outliers', while much of what we usually talk about is almost pure noise. Our track record in predicting those events is dismal; yet by some mechanism called the hindsight bias we think that we understand them. We have a bad habit of finding 'laws' in history (by fitting stories to events and detecting false patterns); we are drivers looking through the rear view mirror while convinced we are looking ahead."
"Why are we so bad at understanding this type of uncertainty? It is now the scientific consensus that our risk-avoidance mechanism is not mediated by the cognitive modules of our brain, but rather by the emotional ones. This may have made us fit for the Pleistocene era. Our risk machinery is designed to run away from tigers; it is not designed for the information-laden modern world."
The Post article illustrates what happens when brains that are not equipped for processing information try to make judgments about the future.
In yet another experiment, Baruch Fischhoff, a psychologist at Carnegie Mellon University and a pioneer in the field of hindsight bias, found that Americans who made estimates about their danger after the Sept. 11, 2001, attacks recalled having made much lower estimates of risk a year later, after their fears failed to materialize.
Fischhoff testified about psychological factors in judgment at a meeting of the House intelligence committee last week.
While hindsight bias in the context of the Iraq war was real, the psychologist cautioned in an interview against misuse of the idea -- the argument by many supporters of the Bush administration that it was impossible to know ahead of time how the war would turn out.
"It's wrong for people who should be held accountable to hide behind hindsight bias and say this was totally unpredictable," Fischhoff said.
And here's the punchline.
Indeed, research by both Fischhoff and Arkes shows that people can fight the hindsight bias only when they honestly and systematically try to explain how different outcomes are possible. Such self-doubt is the exact opposite of how modern politics works: In the age of the blogosphere, certitude is king.
At its core, in other words, the hindsight bias is a form of overconfidence. Clearly acknowledging how you might be wrong is the only weapon against the error, Fischhoff said, but that is one thing politicians hate to do.
"Many people who are offended by the president are offended by his lack of deliberateness," Fischhoff concluded. "We want leaders who look deliberately at the evidence and are candid about the gambles they are taking. . . . A lot of unease in this country is consistent with people not feeling like they are being leveled with."
Back to Professor Taleb, who had some pointed things to say about the 9/11 Commission in the context of the 'black swan paradigm.' The Commission, as he saw it, had three flaws.
The first flaw is the error of excessive and naïve specificity. By focusing on the details of the past event, we may be diverting attention from the question of how to prevent future tragedies, which are still abstract in our mind. To defend ourselves against black swans, general knowledge is a crucial first step.
The second is one I like to call "the Blame it on Clinton" problem. Professor Taleb says it
is also a prime example of the phenomenon known as hindsight distortion. To paraphrase Kierkegaard, history runs forward but is seen backward. An investigation should avoid the mistake of overestimating cases of possible negligence, a chronic flaw of hindsight analyses. Unfortunately, the hearings show that the commission appears to be looking for precise and narrowly defined accountability.
And if the Commission had truly been set up correctly, perhaps we wouldn't have been led into further complicating the bureaucracy and institutionalizing the very practices that produced the flawed thinking in the first place. Thus, the Professor concludes
The third flaw is related. Our system of rewards is not adapted to black swans. We can set up rewards for activity that reduces the risk of certain measurable events, like cancer rates. But it is more difficult to reward the prevention (or even reduction) of a chain of bad events (war, for instance). Job-performance assessments in these matters are not just tricky, they may be biased in favor of measurable events. Sometimes, as any good manager knows, avoiding a certain outcome is an achievement.
The greatest flaw in the commission's mandate, regrettably, mirrors one of the greatest flaws in modern society: it does not understand risk. The focus of the investigation should not be on how to avoid any specific black swan, for we don't know where the next one is coming from. The focus should be on what general lessons can be learned from them. And the most important lesson may be that we should reward people, not ridicule them, for thinking the impossible. After a black swan like 9/11, we must look ahead, not in the rear-view mirror.
Now, I'm not going to sit here and say that all this "I told you so" is overhyped. Liberals were thinking outside the box and looking at all of the data, and I know that I, personally, was very concerned about terrorism on our soil prior to 9/11. After all, we had experienced quite a few acts of it:
- shootings outside the CIA headquarters in Langley, VA by a Pakistani gunman
- the bombing of a federal building in Oklahoma City
- the bombing in Olympic Park in Atlanta, Georgia, and subsequently at two abortion clinics
But I can say that until we begin taking steps to truly understand the threats against us and the people behind them, we're doomed to experience the "black swan event" again. Even after the 9/11 Commission completed its work, we have very little of that understanding, and what little we do have has gone unabsorbed by this Administration.