Archives For Sociology of Engineering

Waaay back in 2002 Chris Holloway wrote a paper that used a fictional civil court case involving the hazardous failure of software to show that much of the expertise and received wisdom of software engineering was, by the standards of the US federal judiciary, junk science and, at best, opinion based.

Rereading the transcripts of Philip Koopman and Michael Barr in the 2013 Toyota spaghetti monster case, I am struck both by how little things have changed and by how far the actual state of the industry can be from the state of the practice, let alone the state of the art. Life recapitulates art I guess, though not in a good way.

On being a professional

Currently dealing with some software types who, god bless their wooly socks, are less than enthusiastic about dealing with all this ‘paperwork’ and ‘process’, which got me to thinking about the nature of professionalism.

While the approach of ‘Bürokratie über alles’ doesn’t sit well with me I confess, on the other side of the coin I see the ‘girls just wanna have fun’ mentality of many software developers as symptomatic of a lack of professionalism amongst the programming class. Professionals in my book intuitively understand that the ‘job’ entails three parts, the preparing, the doing and the cleaning up, in a stoichiometric ratio of 4:2:4. That’s right, any job worth doing is a basic mix of two parts fun to eight parts diligence, and that’s true whether you’re a carpenter or a heart surgeon.

Unfortunately the field of computer science seems to attract what I can only call man-children, those folk who, like Peter Pan, want to fly around Never Land and never grow up. Which is OK if you’re coding Java beans for a funky hipster website, not so great if you’re writing an embedded program for a pacemaker, and so in response we seem to have process*.

Now as a wise man once remarked, process really says you don’t trust your people, so I draw the logical conclusion that the continuing process obsession of the software community simply reflects an endemic lack of trust, due to the aforementioned lack of professionalism, in that field. In contrast I trust my heart surgeon (or my master carpenter) because she is an avowed, experienced and skillful professional, not because she’s CMMI level 4 certified.

*I guess that’s also why we have the systems engineer. 🙂

…and the value of virtuous witnesses

I have to say that I’ve never been terribly impressed with IEC 61508, given it purports to be so arcane that it requires a priesthood of independent safety assessors to reliably interpret and sanction its implementation. My view is that if your standard is that difficult then you need to rewrite the standard.

Which is where I would have parked my unhappiness with the general 61508 concept of an ISA, until I remembered a paper written by John Downer on how the FAA regulates the aerospace sector. Within the FAA’s regulatory framework there exists an analog to the ISA role, in the form of what are called Designated Engineering Representatives or DERs. In a similar independent sign-off role to the ISAs, DERs are paid by the company they work for to carry out a certifying function on behalf of the FAA.


From Les Hatton, here’s how, in four easy steps:

  1. Insist on using R = F x C in your assessment. This will panic HR (People go into HR to avoid nasty things like multiplication.)
  2. Put “end of universe” as risk number 1 (Rationale: R = F x C. Since the end of the universe has an infinite consequence C, then no matter how small the frequency F, the Risk is also infinite)
  3. Ignore all other risks as insignificant
  4. Wait for call from HR…

A humorous note, amongst many, in an excellent presentation on the fell effect that bureaucracies can have upon the development of safety critical systems. I would add my own small corollary: when you see warning notes on microwaves and hot water services, the risk assessment lunatics have taken over the asylum…
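
For the arithmetically inclined, here is a minimal sketch in Python (mine, not Hatton’s, with numbers invented purely for illustration) of why step 2 works: once an infinite consequence is admitted, R = F x C returns an infinite risk for any non-zero frequency, and every other entry on the risk register is dwarfed by comparison.

```python
import math

def risk(frequency: float, consequence: float) -> float:
    """Naive quantitative risk score: R = F x C."""
    return frequency * consequence

# An absurdly small frequency paired with an infinite consequence...
end_of_universe = risk(frequency=1e-100, consequence=math.inf)

# ...versus a mundane, entirely finite workplace hazard.
dropped_spanner = risk(frequency=0.1, consequence=1_000.0)

print(end_of_universe)                    # inf
print(dropped_spanner)                    # 100.0
print(end_of_universe > dropped_spanner)  # True -- risk number 1, ignore the rest
```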

Just finished reading the excellent paper A Conundrum: Logic, Mathematics and Science Are Not Enough by John Holloway on the swirling currents of politics, economics and emotion that can surround and affect any discussion of safety. The paper neatly illustrates why the canonical rational-philosophical model of expert knowledge is inherently flawed.

What I find interesting as a practicing engineer is that although everyday debates and discussions with our peers emphasise the subjectivity of engineering ‘knowledge’, as engineers we all still like to pretend, and behave, as if it were not.