Close Calls and Human Biases
Sep 5th, 2012
Near misses receive different interpretations in different countries and cultural backgrounds. Some societies consider them the glorified achievements of intuitive demi-gods. In others, they are just plain indicators of near failure.
When near misses become repetitive, it can be assumed they are the result of a systemic flaw. Systemic flaws only require a small “twist of fate” to turn into an accident, possibly a disaster.
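To make that "twist of fate" concrete: if each recurrence of a near miss carries even a small, independent chance of escalating, the odds of an eventual accident climb rapidly with repetition. The following is a purely illustrative Python sketch; the 3% escalation probability is an assumed value chosen for illustration, not data from any real operation.

```python
# Illustrative sketch: probability that a repeated near miss eventually
# escalates into an accident, assuming each occurrence carries a small,
# independent chance of escalation. The 3% figure is an assumption for
# illustration only, not an observed escalation rate.

def accident_probability(n_near_misses: int, p_escalation: float = 0.03) -> float:
    """P(at least one accident) after n independent near misses."""
    return 1.0 - (1.0 - p_escalation) ** n_near_misses

for n in (1, 10, 25, 50, 100):
    print(f"{n:3d} near misses -> P(accident) = {accident_probability(n):.0%}")
```

Under these assumptions, the cumulative probability already exceeds 50% after 25 repetitions and 95% after 100, which is why a seemingly benign recurring event deserves attention long before it turns into a catastrophe.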
In an industrial operation we know of, people became accustomed to a loud bang coming from a particular device. That continued until one day the loud bang became a monster explosion that levelled the whole operation. It was only after the catastrophe that those same people connected the dots and understood that the bang had not been benign at all, but rather the symptom of a systemic flaw.
The world would be a better place if we stopped glorifying the very few cases where, against all odds, near misses turn into prestigious successes, while forgetting, through selection bias, the scores of failures whose information hinted at possible flaws. We humans are indeed apparently wired to:
- See patterns that do not necessarily exist.
- Poorly conceptualize long arcs of time.
- Perceive what agrees with our pre-existing expectations, and ignore things that disagree with our beliefs.
- Forget our losers and over-emphasize our winners.
- Be better at processing good news than bad.
- Be vulnerable to anecdotes that mislead or present false conclusions unsupported by data.
At Riskope we believe our mission is to support any human endeavour with rational methodologies that allow us to bypass these biases.