The Space Shuttle Challenger disaster in 1986 makes me as sad today as it did when I first saw the footage. I’ve always been in awe of the Shuttle and the complexities involved in every mission.
In his book ‘The Intelligence Trap’, David Robson recounts the disaster, in which the shuttle was destroyed by the failure of a faulty O-ring seal.
As Robson says:
“[this disaster] would have been tragic enough had it been a fluke.”
However, it was not a fluke.
The root cause was known and it had caused problems before. But on each previous occasion, the outcome was a near-miss rather than a catastrophe.
A double tragedy
The first tragedy was the disaster itself.
The second tragedy was that the organisation (in this case, NASA) didn’t use these near-misses as early warning signs of future catastrophe.
Because their outcomes had been benign, the near-misses were simply filed away as future “housekeeping matters”.
But as Richard Feynman (a member of the investigation team) noted:
“When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next”.
This was not their fault. It’s in our nature.
No-one is pointing the finger at the NASA engineers. Many had raised their concerns to senior management.
According to Feynman, the root cause of the organisation’s blindness to future danger was our “outcome bias”, which Robson describes as:
“[Our] focus on the actual consequences of a decision without even considering the alternative possible results.”
Our perception of future risk is influenced by past outcomes
We can see ‘outcome bias’ all around us.
I’ve previously talked about turkeys. Now let’s talk about drivers:
- Many drivers glance at their phones while driving. They know it causes a loss of attention, however temporary. They know it’s against the law.
- Someone you know probably has a habit of driving home from the pub after a couple of beers. They know alcohol influences their cognition. They know it’s against the law.
But every time they have done it, no-one has come to any harm.
Past outcomes lead them to believe that the future risk is non-existent.
But we know the risk is real, regardless of past outcomes.
Risk, risk, risk, risk, risk: Z..z..z..z
I talk to directors about risk all the time – IT risk, cybersecurity risk, outsourcing risk, fraud risk.
And I know when the topic of ‘risk’ comes up at board meetings, people’s eyes glaze over.
After all, many board members got to where they are because they were prepared to take risks in their earlier careers.
Take informed risks and learn from the near-misses
I am not advising against taking risks. But I am advising that:
- You ensure you are fully informed of the risks (especially the potential impact) before you decide on a course of action
- You consider the role of luck when a previous risk you took had a positive outcome
- You learn from near-misses, as you may not be as lucky next time
What got you here may not get you there.