In a recent NRCOHSR white paper on the Deepwater Horizon explosion, Professor Andrew Hopkins of the Australian National University argued that the Transocean and BP management teams visiting the rig on the day of the accident failed to detect the unsafe well condition because of biases in their audit practices.
Professor Hopkins gives examples where information ‘in the situation’ indicated that the well mud replacement was not under control, and he explains this lack of attention as the product of a bias towards auditing conditions rather than behaviour, combined with a further bias towards personnel safety rather than major hazard safety.
I have to admit that I found Professor Hopkins’s explanations slightly unsatisfactory, as the causal factors he identified are direct proximal behaviours. There is also an implicit presumption that at some level such biases were consciously arrived at, a presumption that I believe is unwarranted.
My belief is that the root causes of the ineffectiveness of the management team’s audit lie much deeper than those identified by Professor Hopkins, and I would suggest they reflect much more fundamental limits of human cognitive performance.
I would argue that the management team’s behaviour exhibits the classic symptoms of inattentional blindness. Briefly, many people incorrectly assume that we are fully aware of our environment all the time; in fact, attentional resources are severely limited, that which we don’t pay attention to is simply not noticed, and, worse, we are not even aware of its absence (1).
So what were the management team paying attention to that caused this blindness? Here I agree with Professor Hopkins that the team’s focus on OH&S issues was a major contributor, with current workplace safety regulations imposing rigorous OH&S obligations upon company management (2).
When, as a manager, you can be held responsible for injuries to personnel under OH&S legislation, when statistics for such accidents are recorded and made available to management, and when you are evaluated on your performance in reducing the rates of such accidents, it is very clear where your attention will be directed (3).
With such attention goes a converse inattentional blindness to other safety risks, such as those associated with the conduct of high-consequence hazardous operations, for which you are not directly held accountable and for which there is at best only indirect evidence of increasing risk.
What the concept of inattentional blindness tells us is that in these circumstances, with their attention firmly guided by legislative sanction and company accident rate statistics, the management team did not wilfully ‘overlook’ the lead indicators of the well blowout; they were simply unaware of them, due to the attentional channelling caused by the business and legislative environment in which they were operating (4).
1. The classic real-world demonstration of this effect is the number of motorists who turn into a lane and promptly collide with an ‘invisible’ cyclist: because they were looking for a car in the lane, they were inattentionally blind to any cyclist.
2. As an example, in Australia the current generation of workplace health and safety legislation imposes both a general duty of ‘due diligence’ upon officers of the workplace and criminal sanctions for breaches of the act, including individual fines of up to $300,000.
3. An organisational preoccupation that reaches the height of logical absurdity in the current Zero Harm fad.
4. The effects of cognitive limitations and biases upon the behaviour of line or operational staff are routinely considered in accident investigations in the nuclear, petrochemical and aviation industries. That the same limitations might affect management staff seems to be far less often considered, and as a result we have ended up with something of a double standard in evaluating human behaviour.