Gregory (Scotland Yard detective): “Is there any other point to which you would wish to draw my attention?”
Holmes: “To the curious incident of the dog in the night-time.”
Gregory: “The dog did nothing in the night-time.”
Holmes: “That was the curious incident.”
What you pay attention to dictates what you’ll miss
The point the great detective was making was that the absence of something was itself the evidence the Scotland Yard man had overlooked. Holmes, using imagination and intuition, identified that this absence was in fact the vital clue. The plot device works marvellously well because almost all of us, like Detective Gregory, fail to recognise that such an absence is actually ‘there’ in any meaningful sense, let alone that it’s important.
The problem we face is that we tend to focus too much on the evidence in front of us while underestimating, or simply overlooking, the effect of factors that are not represented. In a famous study on the construction of fault trees, Fischhoff, Slovic and Lichtenstein (1978) showed that when people allocated probabilities to the branches of an original fault tree and were then given a ‘pruned’ version, in which some of the original branches had been removed and an ‘all others’ branch added, they consistently failed to re-allocate the probability of the missing branches to the catch-all branch, thereby biasing their probability estimates towards the surviving branches.
Silvera et al. (2005) interpret these results as evidence of an omission neglect bias, that is, a general insensitivity to missing information (in the case of a fault tree, alternative causal mechanisms). Of course, Fischhoff’s experiment gave participants a clue that there was missing information; if that clue is also absent then the omission neglect bias will be that much stronger.
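The arithmetic behind the pruning effect can be sketched in a few lines. This is an illustration with made-up branch names and numbers, not the study’s actual stimuli or data:

```python
# Sketch of the Fischhoff et al. (1978) pruning effect, with
# illustrative (invented) probabilities for a car fail-to-start tree.
full_tree = {
    "battery": 0.30,
    "fuel": 0.20,
    "ignition": 0.25,
    "starter": 0.15,
    "all_others": 0.10,
}

# Prune two branches. Their causes still exist in the world, so a
# normatively correct respondent folds their mass into "all_others".
pruned = ["ignition", "starter"]
removed_mass = sum(full_tree[b] for b in pruned)

correct = {b: p for b, p in full_tree.items() if b not in pruned}
correct["all_others"] += removed_mass  # 0.10 + 0.40 = 0.50

# What participants tended to do instead: spread the probability over
# the visible branches, inflating each surviving branch rather than
# crediting the catch-all with the missing causes.
visible = {b: p for b, p in full_tree.items() if b not in pruned}
total = sum(visible.values())
biased = {b: p / total for b, p in visible.items()}

print(f"correct all_others: {correct['all_others']:.2f}")  # 0.50
print(f"biased  all_others: {biased['all_others']:.2f}")   # 0.17
```

What isn’t drawn gets almost no probability: here the catch-all ends up with roughly a third of the weight it should carry.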
Here’s a small Gedanken that neatly demonstrates the problem in safety analyses. Below we have a fault tree for the failure to start of an emergency diesel generator, taken from AS/NZS HB89-2013. Let’s take a look at it and consider whether we’ve missed anything. Looks reasonable?
OK, so now mentally step back from the tree, look at the first level of events, and think about what’s not in the figure. Did you note that the causes of failure don’t include external events, such as flooding?
The point of this small example is not that we missed something, but that what was represented skewed our attention: even though we ‘know’ about flooding, we didn’t consider it. This misdirection effect is so strong, and so well known, that people make their living from it. We call them
safety analysts… er, I mean magicians.
Now let’s engage in another small Gedanken. Imagine that before introducing the fault tree above I’d prefaced it with the comment that the system in question was an emergency generator for a large industrial facility located on the edge of a bay. Would we be more likely to identify flooding as a missing causal factor? Perhaps. Finally, what if I stated that it was actually a nuclear power plant? Perhaps more so.
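To see what the missing leg costs us numerically, here’s a minimal sketch of an OR-gate top-event calculation. The branch names and probabilities are hypothetical, invented purely for illustration; the point is only the structural effect, that an omitted branch silently understates the top-event probability:

```python
from math import prod

def or_gate(probs):
    """P(top event) for independent basic events under an OR gate."""
    return 1 - prod(1 - p for p in probs)

# Hypothetical internal failure causes for the generator fail-to-start
# top event (not the AS/NZS HB89-2013 figure's actual values).
internal = {"fails_to_crank": 0.01, "fuel_system": 0.005, "control_logic": 0.002}
p_analysed = or_gate(internal.values())

# The branch the diagram misdirected us away from: site flooding.
p_flood = 0.01  # hypothetical external-event probability
p_complete = or_gate(list(internal.values()) + [p_flood])

print(f"tree as drawn:     {p_analysed:.4f}")
print(f"with flooding leg: {p_complete:.4f}")
```

The tree as drawn is internally consistent and computes cleanly, which is exactly what makes the understatement so easy to miss.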
This misdirection effect is not constrained to fault trees, by the way. Any diagram that purports to show all the causal factors or consequences, but doesn’t include a ‘here be dragons’ leg, will strongly misdirect attention away from what we don’t know and towards what we do. As another example, the graphical nature of the Goal Structuring Notation used in many safety cases again focuses attention on the contents of the argument and away from what might actually be missing.
Unfortunately engineers love to use such diagrams as a medium of communication, because they give a patina of objectivity and mathematical certitude to what is in reality a messy, subjective and value-laden process of getting to a decision (Porter 1995). When we’re dealing with high consequence systems, it’s Nassim Taleb’s unexpected and silent Black Swan events, lurking in the outer darkness, that we really have to worry about; for these, such diagrams may actually have negative value because they spawn ontological risk.
Fischhoff, B., Cost-benefit analysis and the art of motorcycle maintenance, Policy Sciences, 8(2), 177–202, 1977.
Fischhoff, B., Slovic, P., & Lichtenstein, S., Fault trees: Sensitivity of estimated failure probabilities to problem representations, Journal of Experimental Psychology: Human Perception and Performance, 3, 330–344, 1978.
Porter, T.M., Trust in Numbers: The Pursuit of Objectivity in Science and Public Life, Princeton University Press, Princeton NJ, 1995.
Silvera, D.H., Kardes, F.R., Harvey, N., Cronley, M.L., & Houghton, D.C., Contextual Influences on Omission Neglect in the Fault Tree Paradigm, Journal of Consumer Psychology, 15(2), 117–126, 2005.