International Law Studies

Abstract

Legal debate and broader discussion concerning the degree of human control required in the employment of autonomous weapons, including autonomous cyber capabilities, remain controversial and ongoing. These discussions, particularly those taking place among States Parties to the 1980 Certain Conventional Weapons Convention, reveal a complete lack of consensus on the requirement of human control and distract from the more important question with respect to autonomy in armed conflict: under what conditions could autonomous weapons "select" and "attack" targets in a manner that complies with the law of armed conflict (LOAC)?

This article analyzes the specific LOAC rules on precautions in attack, as codified in Article 57 of Additional Protocol I, and asserts that these rules do not require human judgment in targeting decisions. Rather, they prescribe a particular analysis that must be completed by those who plan or decide upon an attack prior to the exercise of force, including decisions made by autonomous systems without meaningful human control. To the extent that autonomous weapons and weapons systems using autonomous functions can be designed and employed in such a way as to comply with all required precautions, they would not violate the LOAC. A key element in determining whether such weapons and weapons systems can meet these requirements must be a rigorous weapons review process.
