The psychological basis of uncertainty
There’s a famous experiment in decision theory, devised by Daniel Ellsberg and known eponymously as the Ellsberg paradox, in which he showed that people overwhelmingly prefer a betting scenario in which the probabilities are known over one in which the odds are genuinely ambiguous, even when the ambiguous bet might offer a greater chance of winning.
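The effect is easiest to see with numbers. The sketch below is a hypothetical illustration (not from the original text) using the standard one-urn version of Ellsberg’s thought experiment: an urn holds 90 balls, 30 red and 60 that are black or yellow in an unknown mix. Betting on red has a known expected value; betting on black has an expected value that is only known as a range, even though under a symmetric prior its average equals the known bet’s.

```python
# Ellsberg's one-urn setup: 90 balls, 30 red, 60 black or yellow
# in an unknown proportion. Payout is 100 for a winning draw.
PAYOUT = 100
TOTAL, RED, BLACK_PLUS_YELLOW = 90, 30, 60

def expected_value(winning_balls: float) -> float:
    """Expected payout when `winning_balls` of the 90 are winners."""
    return PAYOUT * winning_balls / TOTAL

# Bet A: win on red -- the probability is known exactly (30/90).
ev_known = expected_value(RED)

# Bet B: win on black -- black could be anywhere from 0 to 60 balls,
# so only a range of expected values can be stated.
ev_ambiguous = [expected_value(black) for black in (0, BLACK_PLUS_YELLOW)]

print(f"known bet EV: {ev_known:.2f}")
print(f"ambiguous bet EV range: {ev_ambiguous[0]:.2f}..{ev_ambiguous[1]:.2f}")

# Under a symmetric prior over the mix (30 black on average), the
# ambiguous bet's average EV equals the known bet's -- yet most
# people still choose the known bet. Ambiguity itself carries a
# perceived cost beyond probabilistic risk.
avg_black = BLACK_PLUS_YELLOW / 2
assert expected_value(avg_black) == ev_known
```

The point of the sketch is that no difference in expected value explains the preference; what subjects avoid is the unknown distribution itself.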
This is important because it indicates that, when considering risk in the general sense, we can and do weigh both probability and uncertainty. That in turn provides a cognitive basis for the recognised preference of engineers for older, better understood technologies over newer, more poorly understood ones in safety applications. Their preference, it seems, is not ‘classical’ or probabilistic risk aversion, but rather an aversion to epistemic or ontological uncertainty and risk.
The question remains, though: if we have a theoretical foundation for a continuum of uncertainty and associated risk, why don’t our standard professional practices, ISO 31000 for example, tell us how to recognise it and deal with it? Perhaps this is a specific instance of the professional narrowing of focus that Veblen warned us of in his comment on professional disability.