The HAL effect

09/09/2009

The enigmatic face of HAL

Automating our cultural biases

In 2001: A Space Odyssey, HAL, the omniscient and omnipresent onboard computer, develops a cybernetic psychosis and murders his human crew… Oops, says the software design team, that wasn’t supposed to happen.

As it turns out, the reason for the psychosis is that HAL has been required by mission control to keep the true purpose of the Discovery’s mission secret, resulting in aberrant behaviour. When the crew discuss shutting him down (because of that behaviour) HAL unilaterally decides that the humans are the real weak link in the mission and murders all but one of them.

I’m sorry Dave, I’m afraid I can’t do that…

The HAL 9000 computer, 2001: A Space Odyssey

The ghost in the machine

Of course computers can’t have psychotic breaks. Well yes, but any system designed to interact with humans reflects a set of assumptions made by other humans as to how such interactions should take place. So, with our current generation of aircraft flight control systems, is it time to consider the possibility of ‘cybernetic psychology’ as a potential cause of accidents? In Outliers Malcolm Gladwell posits that culture can have dire effects upon communication between just the human members of an aircraft crew. This got me thinking about the whole question of embedded cultural assumptions in automation.

Cultural context

These assumptions are in turn based on the cultural context of the designers. In cultures with a high Power Distance Index (1) (Hofstede 2004) subordinate crew will defer to the pilot in command, leading to a failure to question hazardous behaviour by the commander (2). In low PDI cultures the questioning of superiors by subordinates is tolerated, and even encouraged, as a means of ensuring that decisions and actions are cross checked for safety. Logically one would expect a high PDI culture to build software that implemented a strict hierarchy of control and decision making in the cockpit (3), while a low PDI culture would develop software that emphasised a more equal, team and cross checking based approach to decision making.
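To make the contrast concrete, here is a toy sketch in Python. The function names, the 10 degree disagreement threshold and the pitch commands are all invented for illustration; this is not any real flight control law, just one way the two design philosophies might surface in command arbitration.

    from dataclasses import dataclass

    @dataclass
    class Command:
        source: str       # "crew" or "automation"
        pitch_deg: float  # demanded pitch attitude

    def resolve_high_pdi(crew: Command, automation: Command) -> Command:
        # High PDI assumption: authority is hierarchical - the automation,
        # as custodian of the mission rules, simply overrules the crew.
        return automation

    def resolve_low_pdi(crew: Command, automation: Command) -> Command:
        # Low PDI assumption: cross checking rather than hierarchy - the
        # crew command stands, but a large disagreement is flagged for both
        # parties to resolve rather than silently overruled.
        if abs(crew.pitch_deg - automation.pitch_deg) > 10.0:
            print("CAUTION: crew and automation disagree - cross check")
        return crew

    crew = Command("crew", pitch_deg=18.0)
    auto = Command("automation", pitch_deg=5.0)
    print(resolve_high_pdi(crew, auto).source)   # automation
    print(resolve_low_pdi(crew, auto).source)    # crew (with a caution)

The point of the sketch is simply that the same conflict produces a different outcome depending on which set of cultural assumptions the designer baked in.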

But what happens if the automation designers come from a high PDI culture? For example, if the developers were French, who have a high PDI (68), while the pilots are Australian, with a low PDI (36), would this difference lead to a breakdown of coordination between crew and automation? Alternatively, if the crew and automation are both French (high PDI for both crew and automation), would this lead the crew to defer to the automation and trust its decisions in situations where they should not?

Tolerant of uncertainty?

Another cultural dimension that Hofstede identified is the degree to which a culture teaches its members to tolerate uncertainty. In a high Uncertainty Avoidance (U/A) culture we would expect a strong emphasis on reducing uncertainty through strict safety laws, rules and safety measures. Again, to take up our preceding example, we would expect aircraft flight control automation designed by a team within a culture such as France (a U/A of 86) to emphasise strict safety rules and protocols as a way of reducing uncertainty. On the other hand, the same software designed by a US team, with a lower U/A (46), would tend to place much less emphasis upon such rules and protocols, as air crew would be expected to be able to deal with far more ambiguous situations.
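Again purely as an illustration (the airspeed threshold, function names and messages below are invented, not taken from any real system), a high U/A design might enforce the rule itself while a low U/A design might warn and leave the ambiguity to the crew:

    def low_speed_response_high_ua(airspeed_kt: float) -> str:
        # High U/A assumption: every contingency is covered by a rule and
        # the rule is enforced; the system acts, the crew is told afterwards.
        if airspeed_kt < 120.0:
            return "AUTO: thrust increased per low speed protection rule"
        return "AUTO: within limits, no action"

    def low_speed_response_low_ua(airspeed_kt: float) -> str:
        # Low U/A assumption: ambiguity is left to the crew; the system
        # advises, but the decision (and the authority) stays with the pilot.
        if airspeed_kt < 120.0:
            return "ADVISORY: airspeed low - crew to assess and respond"
        return "ADVISORY: within limits"

    print(low_speed_response_high_ua(110.0))
    print(low_speed_response_low_ua(110.0))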

Conclusions

The above is in no way an argument that if you build automation in culture ‘X’ you’ll end up with ‘Y’; local effects such as a company’s culture and the biases of specific individuals will still play a part. But perhaps we should always be ready to critically question whether designing within the set of implicit assumptions of a cultural norm will really deliver safer systems.

Notes

1.  Power Distance Index is a measure of how much a particular culture values and respects authority (Hofstede 2004).

2.  In Outliers Gladwell points to Korean Air Lines’ (KAL) poor accident record during the 1990s as an example of a high PDI culture struggling with such a problem.

3.  The HAL 9000 class computer is of course the logical conclusion of the high PDI design approach, with humans left as the ‘smart hands’ of the automation.

Further reading

Gladwell, M., Outliers, Little, Brown & Co., 2008.

Hofstede, G., Hofstede, G-J., Cultures and Organizations: Software of the Mind, New York: McGraw-Hill U.S.A., 2004.

2 responses to The HAL effect

  1. 

    I am amazed by this. I never heard anybody speak of it before. Sadly, there are no people nearby to whom I could talk about this 8-(

    • 
      Matthew Squair 24/05/2011 at 9:16 am

      Glad you liked it. This is my attempt to explain why, for example, two sets of engineers (Boeing and Airbus) working on the same problem could come up with two different philosophies on how the automation and crew of their aircraft should interact. Boeing takes a pilot centric view with soft protection limits, while Airbus places firm constraints (bondage and discipline) on how the crew can fly the aircraft. If you want an even more striking example of cultural issues, compare the way in which the Russian space program treats its cosmonauts versus the American. This was a huge culture shock for American guest astronauts on Mir.
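
      Purely as a sketch of that soft versus hard limit contrast (the bank angle figure and function names are invented, certainly not either manufacturer’s actual control laws), a hard limit clips the pilot’s command outright while a soft limit can be deliberately overridden:

          BANK_LIMIT_DEG = 67.0  # illustrative protection threshold only

          def hard_limit(commanded_bank_deg: float) -> float:
              # "Firm constraints": the command is clipped at the limit and
              # the pilot cannot exceed it, however hard they push.
              return max(-BANK_LIMIT_DEG, min(BANK_LIMIT_DEG, commanded_bank_deg))

          def soft_limit(commanded_bank_deg: float, pilot_insists: bool) -> float:
              # "Pilot centric": the system resists (control feel, alerts) but
              # a deliberate pilot input beyond the limit is ultimately honoured.
              if abs(commanded_bank_deg) > BANK_LIMIT_DEG and not pilot_insists:
                  return BANK_LIMIT_DEG if commanded_bank_deg > 0 else -BANK_LIMIT_DEG
              return commanded_bank_deg

          print(hard_limit(80.0))        # 67.0 - the limit always wins
          print(soft_limit(80.0, True))  # 80.0 - the pilot's deliberate input stands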
