Archives For Communication

Icicles on the launch tower (Image source: NASA)

An uneasy truth about the Challenger disaster

The story of Challenger in the public imagination could be summed up as 'heroic' engineers versus 'wicked' managers, which is a powerful myth but unfortunately just a myth. The reality is more complex, and the causes of the decision to launch rest in part upon the failure of the engineers participating in the launch decision to clearly communicate the risks involved. Yes, that's right: the engineers screwed up in the first instance.

In the ideal, technical risk communication presents complex results and conclusions in a clear and unambiguous fashion. For the Challenger launch decision it was anything but. The meeting to discuss delaying the launch due to cold temperatures was itself late to start and then chaotic in execution. Among the problems:

  • the use of slide charts to present complex data,
  • the charts being made up in 'real time' just prior to the meeting,
  • the participants having to digest the information 'in-meeting',
  • little attention paid to how the data was presented, with many slides hand-written,
  • not one chart presenting O-ring erosion and temperature together, and (perhaps most importantly),
  • not one chart referring to the temperature data for all flights.

In contrast, below is a single chart presenting the temperatures that O-rings had been exposed to over the flight history of the shuttle fleet. No such chart was presented at the launch review; had it been, both NASA and Morton Thiokol management would have been confronted with how far outside the experience envelope they were taking the launch. Now I'm not going to say that the engineers' failure to communicate risk to managers was the only reason that a launch decision might have been made, but in an environment where the risk has not been clearly articulated it is all too easy to let other organisational objectives take precedence; see Vaughan (1996) for a fuller treatment of these.

Having observed the engineering culture at length I've concluded that engineers tend to fall into the déformation professionnelle of believing that the numbers and facts they work with are easily understood and universal in nature, and thus that any conclusions they might draw are self-evident truths. From such a world view it's a short step to the belief that no greater effort is required in communication than simply showing other folk the numbers, or worse, just your conclusions. This is probably also why engineers tend to love PowerPoint, and why poor technical communication (you guessed it, using PowerPoint) was also a causal factor in the Columbia accident. The reality, most especially in the case of risk, is that the numbers are never just the numbers, and that effective technical communication is an essential element of risk management.

O-ring distribution

O-ring distribution chart showing the degree to which the Challenger launch temperature conditions lay outside the experience of previous launches
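The comparison the chart makes can be sketched in a few lines of code. This is a minimal sketch only: the prior-launch temperatures and the forecast value below are illustrative stand-ins for the published flight record, not the authoritative dataset.

```python
# Sketch of the check the missing chart would have made obvious:
# compare the forecast launch temperature against the temperature
# range of all previous shuttle launches. Values are illustrative,
# not the full flight record.
prior_launch_temps_f = [66, 70, 69, 68, 67, 72, 73, 70, 57, 63, 70,
                        78, 67, 53, 67, 75, 70, 81, 76, 79, 75, 76, 58]

forecast_temp_f = 29  # illustrative forecast for the launch window

coldest_prior = min(prior_launch_temps_f)
margin = coldest_prior - forecast_temp_f  # how far outside prior experience

print(f"Coldest previous launch: {coldest_prior} F")
print(f"Forecast sits {margin} F below all prior flight experience")
```

Trivial as it is, this is exactly the 'all flights, temperature and nothing else' view that no chart at the review provided.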

References

Rogers Commission, Report of the Presidential Commission on the Space Shuttle Challenger Accident, Chapter V: The Contributing Cause of The Accident, Washington, D.C., 1986.

Vaughan, D., The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, University of Chicago Press, 1996.


Why risk communication is tricky…

An interesting post by Ross Anderson on the problems of risk communication, in the wake of the savage storm that the UK has just experienced. Doubly interesting to compare the UK's disaster communication during this storm to that of the NSW government during our recent bushfires.

Continue Reading…

While I'm on the subject of visualising risk, the Understanding Uncertainty site run by the University of Cambridge's Winton Group gives some good examples of how visualisation techniques can present risk.

Matrix (Image source: The Matrix film)

Just updated my post on Decision Theory and the Risk Matrix with some material on the semiotics of colour and the advantages, as well as disadvantages, that its use in constructing a risk matrix brings.

Matrix (Image source: The Matrix film)

Why the risk matrix?

For new systems we generally do not have statistical data on accidents, and high-consequence events are, we hope, quite rare, leaving us with a paucity of information. So we usually end up basing any risk assessment upon low-base-rate data, and having to fall back upon some form of subjective (and qualitative) method of risk assessment.

Risk matrices were developed to guide such qualitative risk assessments and decision making, and the form of these matrices is based on a mix of decision and classical risk theory. The matrix is widely described in safety and risk literature and has become one of the less questioned staples of risk management.

Despite this there are plenty of poorly constructed and ill-thought-out risk matrices out there, in both the literature and standards, and many users remain unaware of the degree of epistemic uncertainty that the use of a risk matrix introduces. So this post attempts to establish some basic principles of construction, as an aid to improving the state of practice and understanding.
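To make the construction choices concrete, here is a minimal sketch of a qualitative risk matrix. The band names and the mapping from cell to risk level are illustrative assumptions, not drawn from any particular standard.

```python
# A minimal qualitative risk matrix sketch. Band names and the
# score-to-level mapping are illustrative assumptions only.
LIKELIHOOD = ["improbable", "remote", "occasional", "frequent"]      # ascending
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]    # ascending

def risk_level(likelihood: str, severity: str) -> str:
    """Map a (likelihood, severity) pair to a qualitative risk level.

    Uses the sum of the band indices as a crude ordering. That choice
    of ordering is exactly the sort of construction decision that
    introduces the epistemic uncertainty discussed above.
    """
    score = LIKELIHOOD.index(likelihood) + SEVERITY.index(severity)
    if score <= 1:
        return "low"
    if score <= 3:
        return "medium"
    return "high"

print(risk_level("frequent", "catastrophic"))  # high
print(risk_level("improbable", "negligible"))  # low
```

Note how much of the result is baked into the chosen bands and thresholds rather than into any data, which is the central point about epistemic uncertainty.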

Continue Reading…

For the STS 134 mission NASA has estimated a 1 in 90 chance of loss of vehicle and crew (LOCV) based on a Probabilistic Risk Assessment (PRA). But should we believe this number?
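One way to get a feel for what a 1 in 90 per-mission figure implies, if taken at face value, is to compound it over a campaign of missions. The campaign length below is illustrative, not a NASA figure.

```python
# If the per-mission probability of loss of vehicle and crew (LOCV)
# is taken at face value as 1 in 90, the chance of at least one loss
# over n independent missions is 1 - (1 - p)**n. The campaign length
# used here is illustrative.
p_locv = 1 / 90

def prob_at_least_one_loss(n_missions: int, p: float = p_locv) -> float:
    """Probability of one or more LOCV events over n missions."""
    return 1 - (1 - p) ** n_missions

print(f"Over 25 missions: {prob_at_least_one_loss(25):.1%}")
```

Of course this says nothing about whether the 1 in 90 estimate itself deserves belief, which is the question at hand.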

Continue Reading…

The Newcastle 2007 storm

In part one and part two of this post I looked at Drew Warne Smith and James Madden's article, "The science is in on sea-level rise: 1.7 mm", in terms of its worth as a logical argument.

We live under a government of men and morning newspapers.

Wendell Phillips

While Smith and Madden's argument turns out to be the usual denialist slumgullion, it does serve as a useful jumping-off point for a discussion of the role of the media in propagating such pernicious memes (1), and more broadly in communicating risk. Continue Reading…