Close Calls and Human Biases

Near misses receive different interpretations across countries and cultural backgrounds. Some societies celebrate them as the glorified achievements of intuitive demi-gods. In others, they are treated for what they are: plain indicators of near failure.
When near misses become repetitive, it can be assumed they are the result of a systemic flaw. Systemic flaws only require a small “twist of fate” to turn into an accident, possibly a disaster.
In an industrial operation we know of, people became accustomed to a loud bang coming from a particular device. That continued until one day the loud bang became a monstrous explosion that levelled the whole operation. Only after the catastrophe did those same people connect the dots. They understood that the bang was not benign at all, but rather the symptom of a systemic flaw.
The world would be a better place if we stopped glorifying those very few cases where, against all odds, near misses turn out to be prestigious successes, while forgetting the scores of failures caused by the "selective bias" we apply to information hinting at possible flaws. We humans are indeed apparently wired to filter out such warnings.
At Riskope we believe that our mission is to support any human endeavour with rational methodologies that allow these biases to be bypassed.