Cognitive biases, oversimplified assessments, vanishing flights and real life lessons
Mar 19th, 2014
It was a late afternoon in Tokyo, back in 2000. I was giving a presentation to a small group of executives from a Japanese airline. 9/11, Air France 447 and Malaysia Airlines MH370 had not occurred yet.
The subject, as I am sure you have already guessed, was risk management.
I was talking about probabilities of events and long chains of small deviations that can lead to accidents.
All of a sudden one of the executives interrupted the presentation, stood up and very nervously said (word for word): “You know, in my industry we do not really care about all of this stuff, probabilities and so on. We care about having perfect maintenance on our planes and trained pilots. After that only two things can happen: the plane takes off and then lands. We have reduced the chance of a crash to almost nil, so we do not care about probabilities.”
I was flabbergasted by the series of cognitive biases hidden behind those few sentences.
Off the top of my head, I can name a few:
- Normalcy Bias: the refusal to plan for, or react to, a disaster that has never happened before.
- Illusion of Control: the tendency of human beings to believe they can control, or at least influence, outcomes that they clearly cannot.
- Framing: approaching or describing a situation or issue too narrowly.
- Base Rate Fallacy: ignoring available statistical data in favor of particulars.
- Bias Blind Spot: the tendency not to compensate for one’s own cognitive biases.
I was also very aware that whenever we use event trees and other tools to determine probabilities of failure, we have to keep them as simple as possible while still depicting reality. As I listened, my mind drifted to the oversimplified event tree the executive was envisioning, with basically two branches:
1) The plane takes off and lands as expected.
2) It does not perform 1) and crashes.
The probability of 1) would be something like 99.9999%, and the probability of 2) would be (100 - 99.9999)% = 0.0001%, i.e. around 10⁻⁶ (one in a million).
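As an aside, that mental model fits in a few lines of code. The sketch below is mine, not the executive’s, written in Python with purely illustrative probabilities; its point is that the structure itself allows no outcome other than “lands” or “crashes”.

```python
# A minimal sketch of the executive's two-branch event tree.
# The probabilities are illustrative placeholders, not airline statistics.

from dataclasses import dataclass


@dataclass
class Branch:
    outcome: str
    probability: float


def check_tree(branches):
    """Event-tree branches must be exhaustive: their probabilities sum to 1."""
    total = sum(b.probability for b in branches)
    assert abs(total - 1.0) < 1e-9, f"branches are not exhaustive: {total}"
    return branches


# The oversimplified view: every conceivable failure mode is hidden
# inside a single catch-all "crashes" branch.
oversimplified = check_tree([
    Branch("takes off and lands as expected", 1 - 1e-6),
    Branch("crashes", 1e-6),  # the assumed one-in-a-million residual
])

for b in oversimplified:
    print(f"{b.outcome}: {b.probability:.6%}")
```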
I was already aware at the time that the nuclear industry had produced risk assessments (WASH-1400, ‘The Reactor Safety Study’) that defined “absolute probabilities” in the range of one total reactor meltdown per 20,000 reactor-years. Real life has shown that the actual rate of major accidents (INES Class 5+) to date is about one order of magnitude above that theoretical one.
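For what it is worth, that order-of-magnitude claim can be checked on the back of an envelope. The figures below are rough, commonly quoted counts (Three Mile Island, Chernobyl, the Fukushima units) and an approximate cumulative number of commercial reactor-years; treat them as illustrative assumptions, not data from this post.

```python
# Back-of-the-envelope comparison of theoretical vs. observed meltdown rates.
theoretical = 1 / 20_000   # WASH-1400-style estimate: 5e-5 per reactor-year

# Rough, assumed figures; the result is sensitive to how accidents
# (per site or per unit) and reactor-years are counted.
major_accidents = 5        # e.g. TMI-2, Chernobyl-4, Fukushima units 1-3
reactor_years = 15_000     # approximate cumulative operating experience

observed = major_accidents / reactor_years
print(f"theoretical: {theoretical:.1e} per reactor-year")
print(f"observed:    {observed:.1e} per reactor-year")
print(f"ratio:       about {observed / theoretical:.0f}x")
```

Depending on the counting convention, the ratio lands between a factor of a few and roughly ten, i.e. in the range referred to above.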
It took only one year for real life to demonstrate that another possible branch of the event tree for civil aviation is:
- Someone hijacks the plane and purposely crashes it into a high-rise or a government building.
We are now in 2014 and real life teaches us another lesson, adding one more branch to the event tree:
- A plane seems to crash, but it was apparently rerouted; it has vanished and we do not know why (yet?).
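Updating the little sketch from earlier, the tree that real life keeps writing looks less tidy. The probabilities below are again placeholders, and the last branch is the one that matters.

```python
# The earlier two-branch tree, extended with the branches real life added
# and an explicit residual for whatever has not been modelled yet.
# All probabilities are placeholders, not estimates.

extended_tree = {
    "takes off and lands as expected":           1 - 1.3e-6,
    "crashes (technical or human failure)":      1e-6,
    "hijacked and flown into a building":        1e-7,
    "rerouted and vanishes for unknown reasons": 1e-7,
    "unmodelled outcome (the residual)":         1e-7,
}

# However many branches we add, they must still be exhaustive.
assert abs(sum(extended_tree.values()) - 1.0) < 1e-9
```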
I wish I could talk with that executive again, and ask him what he thinks now. The moral of this story?
There are at least four I can think of right away:
- As risk assessors, we have to think about the unthinkable.
- Reality is almost certain to be more complex than yes/no.
- If we do not think about the unthinkable, rest assured that someone else, or nature itself, will, and will surprise us.
- We cannot bias or censor our studies, and we cannot oversimplify our models.
Tagged with: 'The Reactor Safety Study', 9/11, Air France 447, Base Rate Fallacy, Bias Blind Spot, cognitive biases, Framing, Illusion of Control, Malaysian Airline MH370, Normalcy Bias, probabilities of failure, WASH-1400
Category: Consequences, Crisis management, Hazard, Risk analysis, Risk management