Just because you can, doesn’t mean you ought
An interesting article by John Kaag and Sarah Kreps on the moral hazard posed by the use of drone strikes, and how the debate on their use confuses facts with values. To say that drone strikes are effective and nearly consequence free, at least for the perpetrator, does not lead to the conclusion that they are ethical and that we should carry them out. Nor does the capability to attack safely and with focused lethality mean that we will in fact make better ethical decisions. The moral hazard that Kaag and Kreps identify is that ease of use can all too easily become the justification for use. My further prediction is that with the increasing automation and psychological distancing of the kill chain, this tendency will inevitably grow. Herman Kahn is probably smiling now, wherever he is.
However, I disagree with the authors' statement that technology is itself value free. Some technologies, the pre-eminent example being nuclear weapons, inherently dictate an organisational and societal response. If you have the bomb, then you'll end up with an organisation and culture looking much like Strategic Air Command, because the bomb's extreme lethality and low tolerance for error inherently dictate a rigid, hierarchical, secretive and authoritarian nuclear priesthood to safeguard it. If you don't have such a culture, or allow it to decay, then 'this' can happen.
There is, I think, behind the specifics of this case a broader theme: the proponents of new technologies consistently confuse what they can do with what they ought to do. Technologists seem to have a near universal tendency to focus on the left side of technology, that is, how well the technology works, and confuse that with the right side problem, that of how the system interacts with the rest of the world. In my view this is because technologists find dealing with technology so much easier when they can treat their systems as closed and free of ethical context. Such technical problems, shorn of disturbing ethical ramifications, can, as someone once remarked, be so very sweet. And of course, if you do start worrying about the ethical compass of your actions, then you can also find yourself ending up like Robert Oppenheimer.