Copied in full from a piece by Adam Rawnsley posted to Danger Room.
Britain’s Ministry of Defence would like British policymakers to start discussing the ethical development and use of unmanned aerial systems — before the Rise of the Machines, that is.
As first reported by The Guardian, a new study published by the U.K.’s Ministry of Defence warns that tackling the ethics of drones is important to do now, before we’re up to our ears in robots. “The UK Approach to Unmanned Aircraft Systems,” written by the Ministry’s in-house think tank, nudges British defense planners to question whether their growing reliance on unmanned systems will make war too remote (and frequent) or allow robots to take on responsibilities that may be better suited for humans.
“It is essential that, before unmanned systems become ubiquitous (if it is not already too late) that we consider this issue and ensure that, by removing some of the horror, or at least keeping it at a distance,” it warns, “we do not risk losing our controlling humanity and make war more likely.”
The report also points to U.S. drone strikes in Yemen and Pakistan as proof of how unmanned systems have made the use of force likelier in places where commanders may have otherwise opted out. “That these activities are exclusively carried out by unmanned aircraft, even though very capable manned aircraft are available, and that the use of ground troops in harm’s way has been avoided, suggests that the use of force is totally a function of the existence of an unmanned capability,” it argues.
Whether or not we’d be hitting targets in Yemen in their absence, it’s clear that acquisitions of unmanned systems and the technology that powers them are moving fast. Today, the Defense Department has at least 7,000 drones — so many that one in 50 “troops” in Afghanistan isn’t even human. Research and development is already underway on everything from ambulance drones that treat the wounded and ferry them to hospitals to a next-generation killer drone that can take off and land on aircraft carriers.
The authors of the report envision a future battlefield in which both complexity and the desire to save on manpower costs will drive demand for more autonomy in unmanned systems. With a shrinking role for humans in the operation of systems equipped to mete out lethal force, it’s important that the U.K. quickly develop policy on “acceptable machine behaviour in future.” Perhaps to prod that discussion along, the authors invoke the Terminator film franchise to do some provocative wondering aloud.
“There is a danger that time is running out,” it asks. “Is the technological genie already out of the ethical bottle, embarking us all on an incremental and involuntary journey towards a Terminator-like reality?”