
OK, so it’s over. Three dead, and it could have been worse

So what have we learned from all this? Well, let’s start with the basics. This was a small event: by comparison, 9/11 killed more than 3,000 people, and the Bali bombings killed 202. In fact today’s attack was not even close to the Tokyo subway sarin attack, which killed a dozen people and severely injured around 50 more. So in the scheme of things this is small beer. And, as it so often turns out, it was carried out by someone who was fundamentally crazy as well as stupid. Contrary to what Hollywood, politicians and the security industry would have us believe, most terrorists are inept, and most terrorist schemes fizzle out.


My favourite line, “Engaging the sleigh in a dive is forbidden under all circumstances… children expect to hear sleigh bells, not a Stuka diving horn”. Click on the cover to read the whole manual.

Sleigh Pilot Notes (Image source: Air Council)


Or how do we measure the unknown?

The problem is that as our understanding and control of known risks increases, the remaining risk in any system becomes increasingly dominated by the ‘unknown’. The more integrity we demand of high integrity systems, the more we end up having to deal with residual risks that are unknown and unknowable. Well, at least they are the day before the accident. What we really need is a way to measure, express and reason about deep uncertainty, and by that I don’t mean tools like Pascalian calculus or Bayesian prior belief structures, but a way to measure and judge ontological uncertainty.

Even if we can’t measure ontological uncertainty directly, perhaps there are indirect measures? Perhaps we can infer something from the platonic shadows it casts on the wall, so to speak. Nassim Taleb would certainly say no; the unknowability of such events is, after all, the central thesis of his Ludic Fallacy. But I still think it’s worth thinking about, because while he might be right, he may also be wrong.
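To make the gap concrete, here’s a minimal Python sketch (the hypothesis space and numbers are entirely invented): a Bayesian prior belief structure can only redistribute belief over the failure modes someone thought to enumerate, so a mode outside that space carries an implicit prior of zero and no amount of updating will surface it. That unenumerated remainder is, roughly, the ontological uncertainty in question.

```python
# Minimal illustration: Bayesian updating only works over the hypotheses
# we thought to write down. All numbers here are invented for illustration.

def bayes_update(priors, likelihoods):
    """Return posterior beliefs over an enumerated hypothesis space."""
    unnorm = {h: priors[h] * likelihoods[h] for h in priors}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Known failure modes of a hypothetical component (epistemic uncertainty).
priors = {"overcharge": 0.5, "external_short": 0.3, "manufacturing_defect": 0.2}

# Evidence from a (hypothetical) test observation.
likelihoods = {"overcharge": 0.1, "external_short": 0.6, "manufacturing_defect": 0.3}

print(bayes_update(priors, likelihoods))

# The catch: a failure mode nobody enumerated ("internal_cell_short", say)
# has an implicit prior of zero, so no amount of updating can surface it.
# That unenumerated remainder is the ontological uncertainty in question.
```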

*With apologies to Nassim Taleb.


Boeing 787-8 N787BA cockpit (Image source: Alex Beltyukov CC BY-SA 3.0)

The Dreamliner and the Network

Big complicated technologies are rarely (perhaps never) developed by one organisation. Instead they’re a patchwork quilt of individual systems developed by domain experts, with the whole stitched together by a single authority or agency. This practice is nothing new; it’s been around since the earliest days of the cybernetic era, and it’s a classic tool that organisations and engineers use to deal with industrial scale design tasks (1). What is different is that we no longer design systems, and systems of systems, as loose federations of entities. We now think of and design our systems as networks, and thus our systems of systems have become a ‘network of networks’ exhibiting much greater degrees of interdependence.
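To put a rough number on that interdependence, here’s a purely illustrative Python sketch; the subsystem names and dependency links are invented, and the point is only that the networked design roughly doubles the cross-subsystem coupling of the federated one.

```python
# Illustrative only: compare cross-subsystem coupling in a loose federation
# versus a 'network of networks'. Subsystem names and links are invented.

def cross_links(dependencies):
    """Count dependency edges whose endpoints sit in different subsystems."""
    return sum(1 for src, dst in dependencies if src.split(".")[0] != dst.split(".")[0])

# Loose federation: each subsystem interfaces only with the integrating authority.
federated = [
    ("power.bus", "integrator.core"),
    ("avionics.fms", "integrator.core"),
    ("hydraulics.pump", "integrator.core"),
]

# Networked design: subsystems also depend directly on one another.
networked = federated + [
    ("avionics.fms", "power.bus"),
    ("hydraulics.pump", "power.bus"),
    ("power.bus", "avionics.fms"),
]

print("federated cross-subsystem links:", cross_links(federated))  # 3
print("networked cross-subsystem links:", cross_links(networked))  # 6
```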


In by the out door

787 Battery after fire (Image source: NTSB)

The NTSB have released their final report on the Boeing 787 Dreamliner Li-Ion battery fires. The report makes interesting reading, but for me the most telling point is summarised in conclusion seven, which I quote below.

Conclusion 7. Boeing’s electrical power system safety assessment did not consider the most severe effects of a cell internal short circuit and include requirements to mitigate related risks, and the review of the assessment by Boeing authorized representatives and Federal Aviation Administration certification engineers did not reveal this deficiency.

NTSB/AIR-14/01 (p. 78)

In other words, Boeing got themselves into a position where the ‘assumed worst case’ in their safety assessment was a good deal less severe than the reality. That failure to imagine the worst meant that when they aggressively optimised the battery design for weight, rather than for thermal performance, the risks they were actually running were unwittingly far higher than they believed.
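To see how an optimistic worst case distorts the whole assessment, here’s a toy Python sketch; the severity weights and probabilities are invented for illustration and are not figures from the NTSB report.

```python
# Toy risk comparison. Severity weights and probabilities are invented
# for illustration; they are not figures from NTSB/AIR-14/01.

SEVERITY = {"cell_venting": 2, "single_cell_fire": 5, "propagating_thermal_runaway": 10}

def risk(probability_per_fh, severity_label):
    """Crude risk index: event probability per flight hour times severity weight."""
    return probability_per_fh * SEVERITY[severity_label]

p_internal_short = 1e-7  # assumed likelihood of a cell internal short (illustrative)

assumed = risk(p_internal_short, "cell_venting")                 # the analysed worst case
actual = risk(p_internal_short, "propagating_thermal_runaway")   # the worst case that occurred

print(f"assumed worst-case risk index: {assumed:.1e}")
print(f"actual worst-case risk index:  {actual:.1e}")
print(f"risk understated by a factor of {actual / assumed:.0f}")
```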

The first principle is that you must not fool yourself, and you are the easiest person to fool

Richard P. Feynman

I’m also thinking that Boeing’s behaviour is consistent with what McDermid et al. call probative blindness. That is, the safety activities that were conducted were intended to demonstrate compliance with regulatory requirements rather than to actually determine what hazards existed and what risk they posed.

… there is a high level of corporate confidence in the safety of the [Nimrod aircraft]. However, the lack of structured evidence to support this confidence clearly requires rectifying, in order to meet forthcoming legislation and to achieve compliance.

Nimrod Safety Management Plan 2002 (1)

As the quote from the Nimrod programme deftly illustrates, safety analyses are often (2) conducted simply to confirm what we already ‘know’: that the system is safe. They are non-probative, if you will. In these circumstances the objective is compliance with the regulations, rather than the generation of evidence that might show our system to be unsafe. In such circumstances doing more or better safety analysis is unlikely to prevent an accident, because the evidence will not cause beliefs to change; belief, it seems, is a powerful thing.
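The point about beliefs that refuse to move can be made concrete with a back-of-the-envelope Bayes calculation. The numbers below are invented purely to illustrate the shape of the problem, not drawn from the Nimrod or 787 programmes: if you start out almost certain the system is safe, an analysis that would ‘pass’ whether or not the system were safe leaves that belief essentially untouched.

```python
# If you start out almost certain the system is safe, a compliance-style
# analysis that would pass whether or not the system is safe tells you
# almost nothing. All numbers below are invented for illustration.

def posterior_safe(prior_safe, p_pass_if_safe, p_pass_if_unsafe):
    """P(safe | analysis passed), by Bayes' rule."""
    evidence = prior_safe * p_pass_if_safe + (1 - prior_safe) * p_pass_if_unsafe
    return prior_safe * p_pass_if_safe / evidence

prior = 0.99  # corporate confidence going in

# Non-probative analysis: it 'passes' regardless of the true state of the system.
print(posterior_safe(prior, p_pass_if_safe=0.99, p_pass_if_unsafe=0.95))  # ~0.9904

# Probative analysis: designed so an unsafe system would very likely fail it.
print(posterior_safe(prior, p_pass_if_safe=0.99, p_pass_if_unsafe=0.10))  # ~0.9990
```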

The Boeing battery saga also illustrates how much regulators like the FAA actually rely on the technical competence of those being regulated, and how fragile that regulatory relationship is when it comes to dealing with the safety of emerging technologies.

Notes

1. As quoted in Probative Blindness: How Safety Activity Can Fail to Update Beliefs About Safety, A. J. Rae, J. A. McDermid, R. D. Alexander, M. Nicholson (IET SSCS Conference, 2014).

2. Actually, in aerospace I’d assert that it’s normal practice to carry out hazard analyses simply to comply with a regulatory requirement. As far as the organisation commissioning them is concerned, the results will simply tell them what they already know: that the system is safe.

Here’s a short tutorial I put together (in a bit of a rush) about the ‘mechanics’ of producing compliance findings as part of the ADF’s airworthiness regime. Hopefully this will be of assistance to anyone faced with the task of making compliance findings, managing the compliance finding process, or dealing with the ADF airworthiness certification ‘beast’.

The tutorial is a mix of how to think about and judge evidence, drawing upon legal principles, and how to use practical argumentation models to structure the finding. No Dempster-Shafer logic yet; perhaps in the next tutorial.
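For a feel of what structuring a finding with a simple argumentation model might look like, here’s a hypothetical Python sketch; the claim, evidence items, weights and verdict rule are all invented, and far cruder than anything in the tutorial.

```python
# Hypothetical sketch of a compliance finding as a simple claim/evidence
# argument. Structure, weights and the verdict rule are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Evidence:
    description: str
    supports: bool   # does this item support or rebut the claim?
    weight: float    # judged strength of the evidence, 0..1

@dataclass
class Finding:
    claim: str
    evidence: list = field(default_factory=list)

    def verdict(self):
        support = sum(e.weight for e in self.evidence if e.supports)
        rebut = sum(e.weight for e in self.evidence if not e.supports)
        return "compliant" if support > rebut and rebut == 0 else "open"

finding = Finding(claim="Requirement X of the airworthiness standard is met")
finding.evidence.append(Evidence("Qualification test report covers requirement X", True, 0.8))
finding.evidence.append(Evidence("Analysis assumptions not yet independently reviewed", False, 0.3))

print(finding.verdict())  # 'open' until the rebutting evidence is resolved
```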

Anyway, hope you enjoy it. :)