UN GGE on LAWS: Day 1 (morning)

26 Mar 2019 01:00h

Event report

The first session of the 2019 United Nations Group of Governmental Experts on Lethal Autonomous Weapons Systems (UN GGE on LAWS) was opened on 25 March 2019 and chaired by Mr Ljupčo Jivan Gjorgjinski (Minister Counsellor, Chargé d’affaires a.i. of North Macedonia). The first part of the meeting was dedicated to the adoption of the agenda, the confirmation of the rules of procedure, and the organisation of the work of the GGE. On the organisation of the work, two amendments were strongly proposed: first, the removal of the clarifying questions from the agenda items, on the argument that they would reduce the time available for substantive discussions; and second, the shortening of the time allotted to discussions on the human element. The first amendment was contested by some delegations, which argued that delegations could be pragmatic and decide for themselves when and in which circumstances to address the questions. With regard to the second amendment, other delegations stressed the importance of the human element and the need for more, not less, time to discuss it. There was also a proposal to add a footnote to the organisation of work explaining that the clarifying questions are indicative questions from the chair, not subject to consensus. This proposal was opposed by one delegation, which suggested instead inserting the questions in an additional Food for Thought Paper. The chair stressed that this had been the initial purpose of the questions, which would eventually be discussed in dedicated sessions during the meeting.

The session then moved to agenda item 5(c), Review of the potential military applications of related technologies in the context of the GGE’s work, with the following guiding questions:

  • How and to what extent is human involvement in the use of force currently exercised with existing weapons that employ or can employ autonomy in their critical functions, over different stages of their life cycle?  
  • How is responsibility ensured for the use of force with existing weapons that employ or can employ autonomy in their critical functions? Relevant existing weapons could include types of:
    ◦ Air defence weapon systems with autonomous modes or functions;
    ◦ Missiles with autonomous modes or functions;
    ◦ Active protection weapon systems with autonomous modes or functions;
    ◦ Loitering weapons with autonomous modes or functions;
    ◦ Naval or land mines with autonomous modes or functions;
    ◦ ‘Sentry’ weapons with autonomous modes or functions.

Interventions from the delegations drew on the experiences of High Contracting Parties, written contributions from previous sessions, and the following working papers:

The paper submitted by the Russian Federation highlights the benefits of LAWS with regard to reducing the overall application of weapons, increasing the accuracy of weapon guidance, and lowering the rate of unintentional strikes against civilians and civilian objects. Furthermore, the Russian Federation underlined the possible uses of LAWS for the destruction of military facilities; the protection and safekeeping of critical infrastructure (atomic power plants, dams, bridges, and so on); the elimination of terrorist groups; and the protection of civilians. Moreover, Russia argued that the existing automated systems used in military apparatus should not fall into a ‘special’ category requiring restrictions or bans. It further argued that it is the degree of automation that allows a system to operate in ‘dynamic combat situations and in various environments while ensuring an adequate level of discrimination and accuracy’. Russia’s position stressed that compliance with international humanitarian law (IHL) is driven by the degree of automation, and that existing international frameworks are already applicable and already limit automated weapons systems, and therefore do not need to be updated. Among these limits: the indiscriminate and disproportionate use of LAWS, as well as their use against civilians or without precautions taken to protect civilians, is unacceptable; any military use of LAWS should be conducted in compliance with the principle of proportionality between military necessity and the damage caused; and the decision on whether and how to use LAWS is made by a person planning the military operation and developing scenarios for the use (mission) of these systems.

Australia stressed its position on the use of military force and the implementation of a ‘system of control’ which ‘incrementally builds upon itself, embedding controls into military processes and capability at all stages of their design, development, training and usage’. In Australia’s view, this would ensure respect for the principles of accountability and responsibility.

In its working paper, Japan stressed, with regard to the definition of LAWS, the need to deepen the discussion on the notion of lethality and the forms of human control. Moreover, with regard to the scope of rules, Japan highlighted that fully autonomous weapons systems with lethality do not allow meaningful human control. International law and ethics, as well as the principles of IHL, should be factored into the development of LAWS, and any violation of IHL should be attributed ‘to States or individual persons as is the case with conventional weapons systems’. Finally, Japan underlined the importance of information sharing and confidence-building measures, which are necessary for ensuring transparency.

Crucial aspects that were recalled and re-stressed included the need for LAWS to comply with the existing legal frameworks of international law (IL) and IHL, and especially with the Martens Clause. New technologies complement military activities and have the potential to change armed conflicts. Many European delegations recalled meaningful human control as the only element able to ensure compliance with the principles of IHL. To this end, it was proposed to promote awareness training for policymakers and businesses with the aim of encouraging responsible innovation. In line with this view, a delegation re-proposed the adoption of a political declaration outlining principles such as the necessity of human control in the use of force, the importance of human accountability, and the elements of transparency and technology review. Finally, the need for trust between humans and machines was acknowledged, recalling the role of Article 36 of Additional Protocol I (AP I) in addressing the deployment of new technologies.

Delegations recognised that the use of LAWS can have beneficial aspects in complementing military activities. It was argued that discussions of possible options must include how these technologies can enhance the protection of civilians and civilian objects. Nonetheless, as other delegations pointed out, the disarmament machinery and arms control on LAWS can lead to an arms race among states and non-state actors. Additional concerns were raised with regard to the biases LAWS can perpetuate. For instance, artificial intelligence (AI) can be heavily biased, and such algorithmic biases can arise at all stages of development. In addition, a delegation highlighted that discussions and decisions on LAWS should not amplify existing asymmetries.

A final intervention from the floor stressed the need to look at the development of AI systems with impartiality and objectivity, so that the discussions do not sacrifice the development of science and technology.