The human role in LAWS: What could control look like?

16 Apr 2018 02:00h

Event report

The event ‘The Human Role in LAWS: What could control look like?’ took place on 9 April 2018 during the meeting of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS) at the United Nations Office in Geneva (UNOG).

The session was opened by Mr Ian McLeod, who presented the third ‘Focus on’ report by the International Panel on the Regulation of Autonomous Weapons (iPRAW). He explained that iPRAW is an independent scientific advisory panel, funded by the German Federal Foreign Office, with the goal of assisting the discussion on autonomous weapons from a researcher’s perspective. The group, formed in March 2017, includes robotics engineers, lawyers, former military personnel, and political scientists. The work of the panel focuses on the legal, ethical, and operational challenges posed by the use of LAWS. In particular, its work has considered six aspects:

  • International humanitarian law

  • State of technology and operations

  • Computational systems within the system of LAWS

  • Autonomy and human control

  • Ethics, norms and public perception

  • LAWS risks and opportunities: recommendations to the GGE

More specifically, the third report focuses on the human role in, and human control over, the use of LAWS. He explained that the panel first considered the problematic definition of autonomy. The concept is usually discussed as a spectrum, with respect to the specific functions that could be delegated to a machine. It follows that calling a system ‘autonomous’ is misleading: we can only speak of autonomous functions, not of autonomous systems per se.

Mr Marcel Dickow considered how autonomy is actually embedded in weapons. He maintained that, from a technical point of view, a system does not present a continuum of autonomy; rather, it offers the possibility of switching between two modes (i.e. fully automated or manual). He argued that we cannot speak of ‘autonomy’ without also considering ‘control’. In particular, he distinguished two types of control:

  • Control by design: technical requirements built into the machine that enable human control (e.g. the interface)

  • Control in use: procedural requirements that enable human control

To maintain control of LAWS, the system’s operation should follow a two-step approach. First, the human must be able to understand the situation and its context, including the state of the weapon system as well as the environment (situational understanding). Second, the option to intervene appropriately and regain human control should be available at all times (intervention).

Considering these four elements (control by design, control in use, situational understanding, and intervention), Dickow illustrated a spectrum of options for human control, ranging from the highest level of human involvement (i.e. when frequent situational understanding is needed and human action is required to initiate an attack) to the lowest (i.e. when there is no immediate situational understanding and no immediate option for intervention).

Ms Anja Dahlmann focused on the scenario requiring the least human control. She described this situation as ‘boxed autonomy’: operation based on precautionary decisions taken in advance, without immediate situational understanding. In this scenario, all safeguards and information gathering would have been completed beforehand, and the machine would follow predefined failure modes if it detected problems by itself.

The session was closed by the German ambassador, Mr Michael Biontino, who offered a word of caution about such technology: computational mechanisms depend on the quality of their data and can fail, which is why human judgement is a crucial component of the targeting cycle.