Group of Governmental Experts on Lethal Autonomous Weapons Systems (GGE LAWS)

25 Mar 2019 to 29 Mar 2019
Palais des Nations, 1211 Geneva 10
Geneva, Switzerland

Event reports:
Stefania Grottola

This session focused on item 5(a): An exploration of the potential challenges posed by emerging technologies in the area of LAWS to international humanitarian law (IHL). The discussion was guided by the following questions:

  • Does autonomy in the critical functions of weapons systems challenge the ability of states or parties to a conflict, commanders, and individual combatants to apply IHL principles on the conduct of hostilities (distinction, proportionality, precautions) in carrying out attacks in armed conflict?  

  • Does autonomy in the critical functions of weapons systems challenge the maintenance of combatant and commander responsibility for decisions to use force?  

  • What is the responsibility of states or parties to a conflict, commanders, and individual combatants in decisions to use force involving autonomous weapons systems, in light of the principles of international law (IL) derived from established custom, from the principles of humanity, and from the dictates of public conscience (Martens Clause)?  

  • How can legal reviews of weapons with autonomous functions contribute to compliance with IHL? What are past or potential challenges in conducting weapons reviews of weapons with autonomy in their critical functions, and how can these challenges be addressed?

The session further discussed the application of existing legal frameworks to the deployment of LAWS, with a focus on the implementation of Art. 36 of Additional Protocol I to the Geneva Conventions, 1977 (API); on the possible need to negotiate new legal frameworks to meet the challenges posed by autonomous technologies; and on the recurring issue of meaningful human control.

Additional points were raised, arguing that responsibility for the use of systems exists regardless of the autonomy of the weapon being used. From the floor, the USA explained its working paper CCW/GGE.1/2019/WP.5 - Implementing International Humanitarian Law in the Use of Autonomy in Weapon Systems: existing IHL frameworks apply to the use of LAWS; nonetheless, the delegation acknowledged that emerging technologies in the area of LAWS could strengthen the implementation of IHL by reducing the risk of civilian casualties and facilitating the investigation or reporting of incidents involving potential violations. Another delegation pushed the discussion a step further, arguing that the existing legal frameworks of IL and IHL should be complemented by criminal law as well, and recalled the importance of trust and the role of confidence building measures (CBMs). Another intervention from the floor complemented this view by explaining that while IHL represents a good basis for discussions on the use of LAWS, additional legal instruments need to be developed to meet the specificities of the new technology involved. The regulation of such systems through legally binding instruments was proposed by the Non-Aligned Movement and Other States Parties to the CCW, in accordance with the working paper CCW/GGE.1/2018/WP.1 - General Principles on Lethal Autonomous Weapons Systems, submitted in April 2018.

Moreover, some aspects of LAWS were re-stressed with regard to the features of lethality, autonomy, and machine learning. On the last point, the chair underlined that the greatest risks arise from the use of datasets that have not been reviewed.

Delegations recalled the achievements of previous sessions in agreeing that international humanitarian law (IHL) applies to LAWS, with a particular focus on how Art. 36 API provides for the legal review of new systems and weapons, a practice followed even by countries that have not ratified the Protocol. Expanding on the point, delegations recalled the need to establish mechanisms for sharing information and best practices to address the challenges posed by LAWS. Furthermore, a delegation proposed the creation of a compendium of best practices on the use of LAWS in compliance with IHL. An interesting proposal was additionally put forth with the aim of strengthening Art. 36 API: the establishment of an annual reporting mechanism on the development of LAWS, and the creation of a checklist, toolkit, or guiding principles to refer to. Nonetheless, a delegation raised concerns about such legal reviews: algorithms may produce different results in different environments, and there is therefore a crucial need to test them in realistic contexts and to rely on certifications. Indeed, the self-learning capabilities of these systems pose important challenges for system reviewers. Finally, weapons systems always have a margin of error, for which only the human being deploying the weapon can be held accountable. Another delegation reiterated that IHL is highly context-dependent and that the critical functionalities of a system should therefore be assessed case by case. Regarding the wording ‘critical functionalities’, an intervention from the floor contested its abstract use, arguing that it might create misunderstandings over its meaning.

In order to ensure the full applicability of IHL, delegations stressed the crucial importance of human control as the only variable able to ensure respect for the principles of distinction, proportionality and precaution, accountability, and responsibility. Following this line, a delegation said that it is currently impossible for a machine to replicate the human experience and capacity for understanding a conflict situation, while another argued that LAWS do not have the ability to make proportionate decisions or to respect and comply with ethical values. The necessity of meaningful human control in the use and development of LAWS was also stressed. It was explained that whether in a narrow human-in-the-loop situation (in which the human action is related to the deployment of one system) or a wider human-in-the-loop situation (in which the human actor is in control of a broader range of systems), there is a crucial need to always be able to control, modify, or abort the deployment of the weapon. To ensure this, new weapons reviews need to satisfy high standards of predictability and reliability. On this last point, delegations stressed the importance of a multi-dimensional approach, as well as the need to develop AI systems with a holistic approach.

A last intervention underlined the qualitative measurement and judgement needed to comply with the principles of proportionality, distinction, and precaution at the core of IHL, which can be assured only by human commanders and combatants. In addition, it stressed the need for control by design in the development of new weapon systems, and for control in the use of those systems, which is essential for ensuring that the conduct of hostilities complies with IHL. The intervention reiterated the necessity of always having human supervision and the ability to intervene; of predictability and reliability features embedded in the systems; and of the possibility of imposing operational constraints at all times.

 

Cedric Amon

This session considered item 5(e): Possible options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of LAWS in the context of the objectives and purposes of the CCW, without prejudging policy outcomes and taking into account past, present, and future proposals.

The discussion was guided by the following questions:

  • What are the advantages and disadvantages of the proposed approaches to ensuring compliance with IHL and responsibility for decisions on the use of weapons systems and the use of force?

    • legally binding instrument;

    • political declaration;

    • guidelines, principles or codes of conduct; and

    • improving implementation of the existing legal requirements, including legal reviews of weapons.

  • Given that these options are not necessarily mutually exclusive, and given the common goal of ensuring compliance with IHL and maintaining human responsibility for the use of force, what are possible next steps to be taken by the GGE?  

  • How can the GGE build upon the areas of convergence captured in the Possible Guiding Principles agreed in 2018? How can those principles be operationalised?

A delegation supplemented its points from the previous day and added concrete examples in order to foster a better understanding of its positions. Some delegations considered that the GGE should establish standards for LAWS (e.g. tolerable levels of unpredictability and autonomy). In their opinion, this would only be possible if the GGE were composed of technical experts and if the delegations had the appropriate technical understanding of these issues. The delegation further believed that more discussions would be useful to identify how IHL applies to newly developed weapons. It explained that in its national reviews, IHL assessment is already implemented in weapons acquisition processes, wherein weapon reviews test, among other criteria, effectiveness and survivability. An example from 2015 about the performance testing of the AN/TPQ-53 (a counter-battery radar which can direct artillery fire to the place from which enemy mortar attacks originate) was then provided; a link to the review will be provided to interested parties. The delegation explained that the acquisition of this system also entailed a particular legal review in the process. The assessment includes the ability to distinguish between civilians and military forces and objects. The data used for that system was collected between 2012 and 2015.

The review found that the radar acquired targets quite well but had difficulties distinguishing between mortars and shells. False targets triggered warnings in situations where no threat was present. The regulations of the Department of Defense (DoD) ensure that the weapon operates within a tolerable margin of error.

Another delegation explained that two criteria are used to determine the lawfulness of weapons: their intended use and proportionality. The lawfulness of LAWS therefore depends on the operational context in which they work. However, today’s conflicts are mostly fought in cities and involve many civilians; compliance with IHL would therefore be extremely difficult for automated systems that do not rely on human oversight. Additionally, the delegation noted that the law applies to humans and not to machines. Moreover, according to the principles of international law, states remain responsible for the development and use of autonomous systems. In this view, the lawfulness of weapons has to be determined by their intended use, and additional legal review systems would be necessary in order to adequately assess LAWS.

Other delegations welcomed the proposal calling for the start of negotiations on a binding instrument.

A delegation reminded the participants of the experience of the ban on cluster munitions, which became part of IHL in 2010 and has had effects even on countries that did not ratify the treaty. The delegation noted that IHL is not static, and underlined that prior to the ban on cluster munitions, no treaties preventively prohibiting the use of specific weapons had been signed. It called upon states to undertake weapons reviews and to improve review processes worldwide. It further noted that before the treaty banning cluster munitions, IHL was not enough to cover that type of weapon; it is therefore very unlikely that IHL would be enough to cover LAWS, or any new and unknown systems. Art. 36 on its own was deemed insufficient, and a negotiating mandate for the GGE was thus fully welcomed.

An intervention from the floor emphasised that weapon reviews are necessary but not sufficient, noting that it will become increasingly difficult for commanders to understand systems and to assess whether the use of LAWS is lawful in a specific context. Art. 36 reviews can be improved through additional processes.

The Chair explained that a report summarising the issue is expected, and that the issue would be discussed further in the following days.

 

Stefania Grottola

This side event, organised by the Campaign to Stop Killer Robots, was moderated by Ms Mary Wareham (Human Rights Watch) and focused on public views on fully autonomous systems. The event featured contributions by Dr Thompson Chengeta (International Committee for Robot Arms Control), Ms Alena Popova (Founder, Ethics and Technology), and Ms Liz O’Sullivan (Activist and Operational Leader).

Chengeta talked about the notion of human control and the importance of establishing its degree. He structured his presentation around two main questions: what makes human control a necessity, and what determines the level of human control that should be exercised? With regard to the first question, he explained that from military, philosophical, ethical, and judicial viewpoints, human control is seen as necessary. With regard to the second question, what determines the level of human control, he argued, is a sum of different parts. First, there are state obligations determined by jus ad bellum (‘right to war’) and the laws on the use of force. Second, there is the production stage, in which control by design is implemented during development; he stressed that it is at the production stage that human control is first enabled. Third, human control must be present while using weapons in order to comply with jus in bello (international humanitarian law (IHL)). In the last stage, termed ‘after use’, the result of an action has to reflect the intention of its initiator. Following these concepts, he outlined the four stages of human decision-making, from the production to the use of weapons:

  • Human decision-making during the production of the weapon: civil/business liability

  • Human decision-making when state authority uses the weapon: state responsibility

  • Human decision-making during the initial command to deploy: command responsibility

  • Human decision-making during targeting: individual responsibility

Finally, he concluded that states should support a legally binding system that would ban the development and use of autonomous weapons systems.

O’Sullivan stressed the dangers of delegating critical functions to algorithms, and argued that lethal autonomous weapons (LAWs) are weapons of mass destruction rather than conventional ones. She justified her argument with several points. First, she explained that algorithms perpetuate the biases embedded in the data used to train them. Furthermore, some training data is incomplete, for example with respect to the recognition of persons with disabilities. As a result, the deployment of such technologies raises error rates when encountering civilians and the wounded. Second, any targeting and attack function is inherently vulnerable to accidents and hacking. Third, artificial intelligence (AI) yields the black box dilemma, i.e., engineers are not always able to understand why a machine has made a particular decision; in cases related to military activities, this makes ensuring accountability even more difficult. Fourth, she explained that machines based on AI and machine learning will never be able to make decisions upon a moral framework sufficient to pass judgements regarding human life. Therefore, a machine will never be able to understand when a civilian becomes a combatant by taking part in hostilities, when a soldier is surrendering, or similar situations that require subtle human control and judgement.

Stefania Grottola

The session focused on item 5(d): Characterisation of LAWS, in order to promote a common understanding of concepts and characteristics relevant to the objectives and purposes of the Convention. Focusing on the application of IHL, the session was guided by the following clarifying questions, proposed by the chair:

  • Which characteristics of autonomous weapons systems would be important from the point of view of IHL and the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (CCW) specifically?
  • Is autonomy an attribute of a weapon system as a whole, or should it be attached to different tasks of weapons systems?
  • Is the environment of deployment, specific constraints on the time of operation, or the scope of movement over an area important from an IHL/CCW perspective?
  • Is a differentiation between anti-personnel and anti-materiel weapons meaningful from an IHL/CCW perspective?

The session was characterised by discussions on three main points: the need for human control in order to ensure compliance with international law (IL) and IHL; the distinction between civilian and military applications of new technologies; and the need for a definitive or working definition of LAWS.

Delegations recalled the need for a human-centric approach, underlining human control as a necessary element. As explained by a delegation, the use of systems that do not have human control mechanisms is prohibited by existing legal frameworks; IHL therefore prohibits the use of LAWS because they do not have the capacity to apply the principles of distinction, proportionality, and precaution. This view was complemented by the argument that human control is an indispensable variable to avoid the dehumanisation of war. In addition, delegations stressed the need to review new systems in accordance with Art. 36 API, and argued that systems which are not in compliance with IL and IHL should not be deployed. In this context, the contribution of the International Committee of the Red Cross (ICRC) should be highlighted. The ICRC proposed a distinction between weapons systems in which the human actor chooses the objective, and weapons systems in which the exact time and location of an attack are determined by the weapon according to the environment (found to a limited extent in air defence systems). Especially in the latter case, it is essential to ensure meaningful human control.

An unprecedented characteristic of LAWS is their reliance on technologies which might also be useful in civilian activities. As a result, some delegations reiterated the need to distinguish between military and civilian uses of new technologies, arguing that the development of new technologies for civilian uses should not be restricted.

However, the main topic of discussion was whether a definition of LAWS is crucial for the future work of the GGE, and which elements should characterise such a definition. While a delegation argued for the necessity of a clear and universally agreed understanding of what LAWS are, other delegations pointed out that a definition is not indispensable, and that its absence does not prevent positive outcomes from the future work of the UN GGE on LAWS. Recalling the need for a clear understanding of the topic of discussion, a delegation proposed to agree on a working definition, whereas others recalled that, just as the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) does not include a definition of nuclear weapons, the same logic could be applied to LAWS, especially considering that overly narrow technical definitions can soon become obsolete given the disruptive and rapid evolution of new technologies. Following this line, it was argued that definitions have to be pragmatic rather than abstract. A final intervention from the floor argued that negotiating a binding definition of LAWS is a premature step. Nonetheless, the need for a better understanding of these systems was widely shared, and many interventions highlighted elements that characterise LAWS. The following were the most recurrent:

  • Concept of full autonomy: Refers to the notion of a system able to operate independently, without the intervention, decision, or command of a human actor. The concept of autonomy was questioned by one delegation on the grounds that it is a relative term; as a result, the delegation proposed to change the wording to ‘weapon systems having autonomous features or functions’. On this point, it was underlined that the notion of a fully autonomous system has limited utility.
  • Designed to conduct military tasks without a human operator: Refers to the idea that these systems are able to run through a targeting cycle, with the final intention of applying lethal force, without any human intervention (described in the working paper CCW/GGE.1/2019/WP.4 - Food for Thought Paper, submitted by Belgium, Ireland, and Luxembourg).
  • Design, location, and target not known to the human operator: Refers to the idea of machine learning systems which are able to adapt to complex environments and select the appropriate actions to accomplish their mandate. With regard to this variable, a delegation proposed three additional elements: self-mobility, self-direction ability, and self-determination ability.
  • Lethality: Represented one of the most contested elements. While some delegations stressed lethality as an indispensable variable in the definition of LAWS, a delegation argued against treating it as a prerequisite in the characterisation of LAWS, on the grounds that non-lethal systems can have lethal effects and implications when deployed.
  • Indiscriminate feature: Implies a loss in the principle of distinction and proportionality under IHL.
  • Ability to redefine the mission without human intervention (explicitly described in the Food for Thought Paper): Refers to the notion of learning from the complex environment in which the system has been deployed and taking decisions for the accomplishment of the mission without human intervention.
  • Lack of a chain of command once the system is deployed.
  • Reliability and predictability: Refers to the need for systems to satisfy high standards.
  • Ability to impose constraints once the weapons are deployed: Refers to the possibility of taking control of the system at any time, even after deployment; in other words, maintaining human control over the system at all times.

Stefania Grottola

The first session of the 2019 United Nations Group of Governmental Experts on Lethal Autonomous Weapons Systems (UN GGE on LAWS) was opened on 25 March 2019 and chaired by Mr Ljupčo Jivan Gjorgjinski (Minister Counsellor, Chargé d’affaires a.i. of North Macedonia). The first part of the meeting was dedicated to the adoption of the agenda, the confirmation of the rules of procedure, and the organisation of the work of the GGE. On the latter, there was a strong proposal for two amendments: first, the removal of the clarifying questions from the agenda items, on the argument that they would reduce the time available for substantial discussions; and second, the shortening of the time allocated to discussions on the human element. The first amendment was contested by some delegations, who argued that delegations could be pragmatic and decide when and in which circumstances to address the questions. With regard to the second amendment, other delegations stressed the importance of the human element and, therefore, the need for more time to discuss it. There was also a proposal to add a footnote to the organisation of work explaining that the clarifying questions are indicative questions from the chair, not subject to consensus. The proposal was rejected by one delegation, which suggested inserting the questions in an additional Food for Thought Paper. The chair stressed that this was the initial goal of the questions, which would eventually be discussed in dedicated sessions during the meeting.

The session then moved to item 5(c) Review of the potential military applications of related technologies in the context of the GGE’s work with the guiding questions:

  • How and to what extent is human involvement in the use of force currently exercised with existing weapons that employ or can employ autonomy in their critical functions, over different stages of their life cycle?  
  • How is responsibility ensured for the use of force with existing weapons that employ or can employ autonomy in their critical functions? Relevant existing weapons could include types of:
    • air defence weapon systems with autonomous modes or functions;
    • missiles with autonomous modes or functions;
    • active protection weapon systems with autonomous modes or functions;
    • loitering weapons with autonomous modes or functions;
    • naval or land mines with autonomous modes or functions;
    • ‘sentry’ weapons with autonomous modes or functions.

Interventions from the delegations were based on the experiences of High Contracting Parties, written contributions from previous sessions, and the following working papers:

A paper submitted by the Russian Federation highlighted the benefits of LAWS with regard to decreasing the application of weapons, increasing the accuracy of weapon guidance, and lowering the rate of unintentional strikes against civilians and civilian objects. Furthermore, the Russian Federation underlined the possible uses of LAWS for the destruction of military facilities; the protection and safekeeping of critical infrastructure (atomic power plants, dams, bridges, and so on); the elimination of terrorist groups; and the protection of civilians. Moreover, Russia argued that the existing automated systems used in military apparatus should not fall into a ‘special’ category requiring restrictions or bans. It further argued that it is the degree of automation that allows a system to operate in ‘dynamic combat situations and in various environments while ensuring an adequate level of discrimination and accuracy’. Russia’s position stressed that compliance with international humanitarian law (IHL) is driven by the degree of automation, and that existing international frameworks already apply to, and limit, automated weapons systems, and therefore do not need to be updated. Among these limits: the indiscriminate and disproportionate use of LAWS, as well as their use against civilians or without precautions taken to protect civilians, is unacceptable; any military use of LAWS should be conducted in compliance with the principle of proportionality between military necessity and the damage caused; and the decision on whether and how to use LAWS is made by a person planning the military operation and developing scenarios for the use (mission) of these systems.

Australia stressed its position on the use of military force and the implementation of a ‘system of control’ which ‘incrementally builds upon itself, embedding controls into military processes and capability at all stages of their design, development, training and usage’. In Australia’s view, this would ensure respect for the principles of accountability and responsibility.

A paper submitted by Japan addressed the definition of LAWS, stressing the need to deepen the discussion on the notion of lethality and on forms of human control. Moreover, with regard to the scope of rules, Japan highlighted that fully autonomous weapons systems with lethality do not allow meaningful human control. International law and ethics, as well as the principles of IHL, should be taken into account in the development of LAWS, and any violation of IHL should be attributed ‘to States or individual persons as is the case with conventional weapons systems’. Finally, Japan underlined the importance of information sharing and confidence building measures, which are necessary for ensuring transparency.

Crucial aspects that were recalled and re-stressed included the need for LAWS to comply with the existing legal frameworks of international law (IL) and IHL, and especially with the Martens Clause. New technologies complement military activities and have the potential to change armed conflicts. Many European delegations recalled the need for meaningful human control as the only element able to ensure compliance with the principles of IHL. To this end, it was proposed to promote awareness training for policymakers and businesses, with the aim of encouraging responsible innovation. Underlining this view, a delegation re-proposed the adoption of a political declaration outlining principles such as the necessity of human control in the use of force, the importance of human accountability, and the elements of transparency and technology review. Finally, the need for trust between humans and machines was acknowledged, recalling the role of Art. 36 API in addressing the deployment of new technologies.

Delegations recognised that the use of LAWS can have beneficial aspects in complementing military activities. It was argued that discussions of possible options must include discussions on how these technologies can enhance the protection of civilians and civilian objects. Nonetheless, as other delegations pointed out, the disarmament machinery and arms control on LAWS can lead to an arms race among states and non-state actors. Additional issues were stressed with regard to the biases LAWS can perpetuate: artificial intelligence (AI) can be heavily biased, and such algorithmic biases can arise at all stages of development. In addition, a delegation highlighted that discussions and decisions on LAWS should not amplify existing asymmetries.

A final intervention from the floor stressed the need to look at the development of AI systems with impartiality and objectivity, so as not to sacrifice the development of science and technology in the discussions.

The Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems (GGE LAWS) will meet on 25-29 March 2019, in Geneva, Switzerland.

The group was established following a decision taken in 2016 by the High Contracting Parties to the Convention on the Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (Convention on Certain Conventional Weapons - CCW), and is mandated to examine issues related to emerging technologies in the area of lethal autonomous weapons systems (LAWS) in the context of the objectives and purposes of the CCW. 

The group has an open-ended nature, and is open to all High Contracting Parties and non-State Parties to the CCW, international organisations, and non-governmental organisations. It is expected to submit a report to the Meeting of the High Contracting Parties to the Convention.

For more information, visit the group webpage.

 
