January 28th was the anniversary of the Space Shuttle Challenger disaster. The Rogers Commission detailed the official account of the disaster, laying bare all of the failures that led to the loss of a shuttle and its crew. Officially known as the Report of the Presidential Commission on the Space Shuttle Challenger Accident – The Tragedy of Mission 51-L, the report is five volumes long and covers every possible angle, from how NASA chose its vendor to the psychological traps that plagued the decision making that led to that fateful morning. There are many lessons to be learned in those five volumes, and now I am going to share the ones that had the greatest impact on my approach to risk management. The first is the lesson of overconfidence.

In the late 1970s, NASA was assessing the likelihood and risk associated with the catastrophic loss of its new, reusable orbiter. NASA commissioned a study which, based on NASA's prior launch history, estimated the chance of a catastrophic failure at approximately one in every 24 launches. NASA, which was planning to fly several shuttles carrying commercial payloads to help pay for the program, decided that the number was too conservative. It then asked the United States Air Force (USAF) to repeat the study. The USAF concluded that the likelihood was one in every 52 launches.

In the end, NASA believed that, because of the lessons learned since the moon missions and the advances in technology, the true likelihood of a catastrophic event was 1 in 100,000 launches. Think about that: at a planned flight rate of roughly two dozen launches a year, it would be over 4,100 years before a catastrophic event was expected. In reality, Challenger flew 10 missions before its catastrophic event, and Columbia flew 28 missions before its own, during reentry, after its thermal protection system was damaged during liftoff. Over the life of a program that lasted 30 years, NASA lost two of its five shuttles.
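To make the arithmetic behind that 4,100-year figure explicit, here is a minimal sketch in Python. The 24-launches-per-year flight rate is my assumption, back-calculated from the figure itself; it is not stated in the report.

    # Back-of-the-envelope arithmetic behind the "over 4,100 years" claim.
    launches_per_failure = 100_000  # NASA management's 1-in-100,000 estimate
    launches_per_year = 24          # assumed planned flight rate (not from the report)

    years_to_expected_failure = launches_per_failure / launches_per_year
    print(f"Expected years before a loss: {years_to_expected_failure:,.0f}")
    # -> Expected years before a loss: 4,167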

Evaluating risk in an environment where the perceived risk is very low is a dangerous proposition, and basing all of your subsequent risk analysis on that false assessment leads to a lack of understanding and a culture that accepts wildly inappropriate probability estimates as fact. Even small mistakes in assessing and quantifying risk can compound into huge problems. It is important to stay grounded in as much fact as possible when assessing risk; there is no place in the process for guessing.
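To see how a per-launch estimate compounds over a program's life, here is a minimal sketch, assuming independent launches and using the shuttle program's actual total of 135 flights:

    # Probability of at least one catastrophic failure in n launches,
    # given a per-launch failure probability p: 1 - (1 - p) ** n.
    flights = 135  # total missions flown over the 30-year program

    estimates = {
        "NASA's original study (1 in 24)": 1 / 24,
        "USAF study (1 in 52)": 1 / 52,
        "NASA management (1 in 100,000)": 1 / 100_000,
    }

    for label, p in estimates.items():
        at_least_one = 1 - (1 - p) ** flights
        print(f"{label}: {at_least_one:.1%} chance of at least one loss")
    # -> NASA's original study (1 in 24): 99.7% chance of at least one loss
    # -> USAF study (1 in 52): 92.7% chance of at least one loss
    # -> NASA management (1 in 100,000): 0.1% chance of at least one loss

Under either empirical estimate, losing at least one orbiter over the program's life was close to a statistical certainty; only the 1-in-100,000 figure made the program look safe, and that is the figure NASA chose to believe.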

So, if you have ever uttered the phrase "it hasn't happened in the last 10 years, so why would it happen now?" and called it risk management, you are creating the culture that will deliver your own catastrophe.