An uneasy truth about the Challenger disaster
The story of Challenger in the public imagination could be summed up as "'heroic' engineers versus 'wicked' managers", which is a powerful myth, but unfortunately just a myth. The reality is more complex: the causes of the decision to launch rest in part upon the failure of the participating engineers to clearly communicate the risks involved. Yes, that's right, the engineers screwed up in the first instance.
In the ideal, technical risk communication is the presentation of complex results and conclusions in a clear and unambiguous fashion. For the Challenger launch decision it was anything but. The meeting to discuss delaying the launch due to cold temperature was itself late to start and then chaotic in execution. Among the problems:
- the use of slide charts to present complex data,
- said charts being made up in ‘real time’ prior to the meeting,
- the participants having to digest the information ‘in-meeting’,
- little attention paid to how the engineers presented data, with many slides hand-written,
- not one chart presented O-ring erosion and temperature together, and (perhaps most importantly)
- not one chart referred to temperature data on all flights.
In contrast, below is a single chart presenting the flight temperatures that O-rings had been exposed to over the flight history of the shuttle fleet. Such a chart was not presented at the launch review; had it been, both NASA and Morton Thiokol management would have been confronted with how far outside the envelope they were taking the launch. Now I'm not going to say that the engineers' failure to communicate risk to managers was the only reason that a launch decision might have been made, but in an environment where the risk has not been clearly articulated it is all too easy to let other organisational objectives take precedence; see Vaughan (1996) for a fuller treatment of these.
Having observed the engineering culture at length, I've concluded that engineers tend to fall into a déformation professionnelle: the view that the numbers and facts they work with are easily understood, universal in nature, and thus that any conclusions they might draw are self-evident truths. Holding such a world view, it's a short step to a belief that no greater effort is required in communication than simply showing other folk the numbers, or worse, just your conclusions. This is probably also why engineers tend to love PowerPoint, and why poor technical communication (you guessed it, this time using PowerPoint) was also a causal factor in the Columbia accident. The reality, most especially in the case of risk, is that the numbers are never just the numbers, and that effective technical communication is an essential element of risk management.
Postscript on the 30th Anniversary of the loss of Challenger
Reflecting on Challenger, there's one other factor that we should consider in this story, and that's the way in which risk was perceived. You see, the engineers weren't dealing with traditional risk as expressed through risk equals probability times consequence. No, what they were dealing with was deeper uncertainty about the design. Unfortunately, if you are accustomed to viewing risk through that Pascalian lens of probability theory, it's hard to grasp that there are other, deeper uncertainties with their attendant risks, and even harder to create a language that deals with them.
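The distinction can be sketched in a few lines of code. This is a minimal illustration of the "Pascalian" model the text names, nothing more; the function name and numbers are mine, not from any NASA process:

```python
# Classical risk: expected loss, probability of an event times its
# consequence. The names and values here are purely illustrative.

def expected_loss(probability: float, consequence: float) -> float:
    """Risk in the Pascalian sense: probability times consequence."""
    return probability * consequence

# With a known failure probability the arithmetic is trivial:
print(expected_loss(0.01, 100.0))  # a 1% chance of a 100-unit loss -> 1.0

# The Challenger engineers' problem was deeper: at the forecast launch
# temperature there was no flight experience from which to estimate a
# probability at all. The model simply has no input for "we don't know":
try:
    expected_loss(None, 100.0)  # an unknown probability breaks the model
except TypeError:
    print("cannot compute risk without a probability")
```

The point of the sketch is that the formula presupposes a defensible probability; deep uncertainty is precisely the situation where that presupposition fails.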
Rogers Commission, Report of the Presidential Commission on the Space Shuttle Challenger Accident, Chapter V: The Contributing Cause of The Accident, Washington, D.C., 1986.
Vaughan, D., The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University of Chicago Press, 1996.