Why more information does not automatically reduce risk
I recently re-read the article Risks and Riddles by Gregory Treverton on the difference between a puzzle and a mystery. Treverton’s thesis, taken up by Malcolm Gladwell in Open Secrets, is that there is a significant difference between puzzles, in which the answer hinges on a known missing piece, and mysteries, in which the answer is contingent upon information that may be ambiguous or even in conflict.
As an example, the location of Osama bin Laden was a puzzle: the sort of problem where, by searching for information, we can refine our picture and close in on the solution. But what al-Qaeda will be doing a year from now is a mystery, as the answer is contingent upon events past and future, some of which we may never know of, while vital information may actually be hidden within a mass of irrelevant data. Now safety is a complex, emergent and contingent attribute of a system and its environment. Often we find that the information pointing to an impending accident is hidden in the noise of day-to-day operations, only becoming clear in hindsight after an accident. So from Treverton and Gladwell’s perspective we could consider safety as more of a mystery than a puzzle.
The point that Gladwell and Treverton both make is that knowing what type of problem you face is essential in trying to solve it, or at least in recognising your limits. But I would add another dimension to their analysis: whether we identify a problem as a mystery or a puzzle is strongly shaped by our cultural background. Cultures that are intolerant of uncertainty will shy away from treating problems as mysteries, instead reinforcing an essentially reductionist world view in which problems can be solved by finding the ‘missing bit’. In safety this manifests as uncertainty avoidance and a habit of always hurrying to fix the last accident.
Unfortunately this approach may actually be counterproductive because, as we add layers of safeguards, we also increase complexity. We then find ourselves on the horns of a dilemma: to understand the system we have built, we would have to simplify the explanation so much that it becomes inaccurate, while to completely describe our complex system we would have to add so much detail that we could not encompass it. While puzzles are transmitter dependent, in that their solution turns on what you are told, mysteries are receiver dependent, and their solution turns on the skill of the listener in interpreting ambiguous, conflicting and hidden signals. In managing complex safety critical systems it turns out that the skill of understanding and interpreting the system’s data may be just as important as the system design itself.