
Saying 'No!' to Lethal Autonomous Targeting


Plans to automate killing by using robots armed with lethal weapons have been a prominent feature of US military roadmaps since 2004. The idea is a staged move from 'man-in-the-loop' to 'man-on-the-loop' to full autonomy. While this may yield considerable military advantages, the policy raises ethical concerns about potential breaches of International Humanitarian Law, including the Principle of Distinction and the Principle of Proportionality. Current applications of remotely piloted robot planes, or drones, offer lessons in how automated weapons platforms could be misused by extending the range of legally questionable, targeted killings by security and intelligence forces. Moreover, the alleged moral disengagement of remote pilots will only be exacerbated by the use of autonomous robots. Leaders in the international community need to address the difficult legal and moral issues now, before current development efforts reach mass proliferation.

Keywords: discrimination; distinction; drones; ethics; military robotics; proportionality; robotics

Document Type: Research Article

Affiliations: Department of Computer Science, University of Sheffield, UK

Publication date: December 1, 2010
