Killer robots in the battlefield and the alleged accountability gap for war crimes

17 Apr 2018

Event report

The lecture on ‘Killer robots in the battlefield and the alleged accountability gap for war crimes’ was held by Professor Paola Gaeta on 10 April 2018 at the Graduate Institute of International and Development Studies (IHEID) in Geneva. Gaeta’s preliminary remark was that lethal autonomous weapons systems (LAWS) are also described as ‘man-out-of-the-loop’ systems: once such a weapon has been activated, no further human action is required and the system acts autonomously. She also characterised such systems as ‘fire-and-forget’ mechanisms, because human operation is not required after launch.

Her intervention focused mainly on the role of accountability and responsibility when the deployment of such weapons results in serious criminal offences on the battlefield, such as targeting civilians or carrying out disproportionate attacks. Who should be held accountable for such crimes?

First, she analysed the responsibility of the robot. She considered the anthropocentric nature of criminal law: it is directed at individuals and pursues different goals, namely retribution, deterrence, incapacitation, rehabilitation, and restoration. In the case of robots, the application of criminal law would be problematic for three reasons.

  1. Vis-à-vis the identification of the criminal act. Under criminal law, the definition of ‘act’ implies human qualities: the act has to be goal-oriented on the basis of past experience and has to be carried out with self-consciousness.
  2. Vis-à-vis the ‘mens rea’ element. A criminal act is defined as such if there is the element of guilty mind. However, in the case of robots this is problematic because they are inherently incapable of having a guilty mind as they cannot distinguish between what is good and bad.
  3. Vis-à-vis punishment. Under some national legislations, legal persons can be held accountable for criminal acts. Besides being highly disputed, the punishment inflicted under such rules ultimately targets individuals. In the case of killer robots, some authors have claimed that the complete and definitive deactivation of the machine could serve as a sort of capital punishment. However, Gaeta questioned the rationale of such punishments.

Second, she moved to the responsibility of the military user. She considered the following:

  1. Irrelevance of the commander’s responsibility. If we were to look at LAWS simply as lethal machines operated by an individual, accountability and liability could be established, since the robots would be no more than weapons operated by a human commander. LAWS, however, are extremely complex machines capable of making decisions autonomously, in ways the human supervisor cannot foresee. This generates an accountability gap that makes command responsibility irrelevant, because liability would be difficult to establish where the commander does not foresee the robot’s actions.
  2. Difficulty in establishing the guilty mind of the operator. Gaeta considered that fulfilling the ‘guilty mind’ requirement for war crimes in the conduct of hostilities when deploying LAWS depends mainly on the applicable legal framework. Under the Rome Statute of the International Criminal Court (ICC), the applicable test is ‘intent’, namely that the operator deliberately and intentionally wants to kill civilians. Under this standard, the accountability gap would persist, because the killing of civilians could be the result of the robot’s decision rather than the commander’s. However, under the jurisprudence of the International Criminal Tribunal for the Former Yugoslavia (ICTY), the applicable test would be ‘recklessness’. In that case, the operator could be held accountable because they willingly took the risks associated with a machine that operates in an unforeseeable way.

Third, Gaeta mentioned the possible responsibility of the company and reaffirmed that the criminal responsibility of legal persons is still disputed.

She concluded by additionally considering state responsibility. She stated that under the current legal framework the state would surely be responsible, because state responsibility is based on the result: civilian casualties caused by the actions of a state’s army would suffice.

Finally, Gaeta considered that the use of LAWS shortens the chain of command and could thus make the commander potentially responsible for the crimes committed by the machine. However, accountability gaps remain in the current legal frameworks, and it is therefore necessary to develop appropriate regulations.
