In 2013, the Meeting of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (Convention on Certain Conventional Weapons – CCW) agreed on a mandate on lethal autonomous weapons systems (LAWS). It mandated its Chairperson to convene an informal Meeting of Experts 'to discuss the questions related to emerging technologies in the area of lethal autonomous weapons systems in the context of the objectives and purposes of the Convention'. Such meetings of experts were convened three times, in 2014, 2015, and 2016, and produced reports which fed into the meetings of the High Contracting Parties to the Convention.
In 2016, at the Fifth CCW Review Conference, the High Contracting Parties decided to establish an open-ended Group of Governmental Experts on emerging technologies in the area of LAWS (GGE on LAWS) to build on the work of the previous meetings of experts. The group first convened in 2017 and was reconvened in 2018 and 2019. In its 2019 report, the group recommended that its work continue in 2020 and 2021.
GGE in 2020 and 2021
In November 2019, the CCW Meeting of High Contracting Parties decided that the GGE's work would continue in 2020 and 2021; the group is expected to meet for a total of ten days in 2020 and between ten and twenty days in 2021. In this period, the group is to explore and agree on possible recommendations on options related to emerging technologies in the area of LAWS, in the context of the objectives and purposes of the Convention.
The Group’s recommendations will be reported for consideration at the 2020 Meeting of High Contracting Parties and the 2021 Sixth Review Conference.
GGE at a glance
Mandate
- The GGE was mandated to examine issues related to emerging technologies in the area of LAWS in the context of the objectives and purposes of the Convention on Certain Conventional Weapons.
Composition
- The group is open to all High Contracting Parties to the CCW, as well as to states not party to the Convention, international organisations, and non-governmental organisations.
Issues discussed
- Characterisation of LAWS in order to promote a common understanding of concepts and characteristics relevant to the objectives and purposes of the CCW
- Potential challenges posed by emerging technologies in the area of LAWS to international humanitarian law (IHL)
- The human element in the use of lethal force; aspects of human-machine interaction in the development, deployment and use of emerging technologies in the area of LAWS
- Potential military implications of related technologies
- Options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of LAWS
Guiding principles
The guiding principles developed by the GGE were endorsed by the CCW Meeting of the High Contracting Parties in 2019. The full version of the principles can be found in the report of the 2019 GGE and the report of the 2019 CCW Meeting of High Contracting Parties.

GGE’s work as reflected in annual reports
Report of the 2019 GGE
At its 2019 session, the group adopted the guiding principles affirmed in 2018 as a basis for its work. It also identified an additional principle, which states that human-machine interaction should ensure that the potential use of LAWS is in compliance with applicable international law, in particular IHL.
Other conclusions outlined in the group’s report included:
- The potential use of weapons systems based on emerging technologies in the area of LAWS must be conducted in accordance with applicable international law, in particular IHL and its requirements and principles.
- States, parties to armed conflict, and individuals remain at all times responsible for adhering to their obligations under applicable international law, including IHL.
- Human judgement is essential to ensure that the potential use of weapons systems based on emerging technologies in the area of LAWS is in compliance with international law, and in particular IHL.
- A weapons system based on emerging technologies in the area of LAWS must not be used if it is of a nature to cause superfluous injury or unnecessary suffering, or if it is inherently indiscriminate, or is otherwise incapable of being used in accordance with the requirements and principles of IHL.
- Identifying and reaching a common understanding among High Contracting Parties on the concepts and characteristics of LAWS could aid further consideration of the aspects related to emerging technologies in the area of LAWS.
- Human responsibility for the use of weapons systems based on emerging technologies in the area of LAWS can be exercised in various ways across the life-cycle of these weapon systems and through human-machine interaction.
- During the design, development, testing and deployment of weapons systems based on emerging technologies in the area of LAWS, the risks inter alia of civilian casualties, as well as precautions to help minimise the risk of incidental loss of life, injuries to civilians, and damage to civilian objects must be considered.
- Research and development of autonomous technologies should not be restricted solely on the grounds that such technologies could be used in weapons systems; it remains important, however, to promote the responsible innovation and use of such technologies.
The group continued discussions, without reaching an agreement, on possible policy options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of LAWS. The four possible categories put forward in 2018 were reiterated.
The GGE also took note of multiple issues that would require further consideration and review, such as:
- Possible bias in the data sets used in algorithm-based programming relevant to emerging technologies in the area of LAWS.
- Different potential characteristics of emerging technologies in the area of LAWS.
- Developing a shared understanding on the concept of the human element in the use of emerging technologies in the area of LAWS.
- Further clarifications on the type and degree of human-machine interaction required in the development, deployment, and use of emerging technologies in the area of LAWS.
Report of the 2018 GGE
In its report of the 2018 session, the GGE outlined a series of emerging commonalities, conclusions and recommendations.
- A set of possible principles to guide the work of the GGE was developed. Among them: IHL continues to apply to the development and use of LAWS; human responsibility must be retained when it comes to decisions on the use of weapons systems; and risk assessments and mitigation measures should be part of the design, development, testing, and deployment cycle of emerging technologies in any weapons systems.
- The report also summarised the group’s discussions on the human element in the use of lethal force, on aspects of human-machine interaction in the development, deployment, and use of emerging technologies in the area of LAWS, and on the potential military implications of related technologies.
- When it comes to possible policy options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of LAWS, the report outlines the four proposals discussed within the group:
  - A legally binding instrument stipulating prohibitions and regulations on LAWS
  - A political declaration that would outline principles such as the necessity of human control in the use of force and the importance of human accountability, with elements of transparency and technology review
  - Further discussions on the human-machine interface and the application of existing international legal obligations
  - No further legal measures, based on the view that IHL is fully applicable to potential LAWS
Report of the 2017 GGE
At the end of its November 2017 meeting, the GGE adopted a report outlining several conclusions and recommendations. Among these:
- IHL applies fully to all weapons systems, including the potential development and use of LAWS.
- Responsibility for the deployment of any new weapon systems in armed conflicts remains with states, which must ensure accountability for lethal action by any weapon system used by their forces in armed conflict in accordance with applicable international law, including IHL. The human element in the use of lethal force needs to be further considered.
- Given the dual nature of technologies in the area of intelligent autonomous systems, the Group's work should not hamper progress in or access to civilian research and development and use of these technologies.
- There is a need to keep potential military applications of related technologies under review in the context of the Group’s work.
- There is a need to further assess the aspects of human-machine interaction in the development, deployment, and use of emerging technologies in the area of LAWS.
- There should also be further discussions on possible options for addressing the humanitarian and international security challenges posed by emerging technologies in the area of LAWS.
GGE Meetings
GGE meetings in 2020:
- 21–25 September 2020 | First session of the GGE on LAWS
- 2–6 November 2020 | Second session of the GGE on LAWS
Previous GGE meetings (2017, 2018, and 2019):
- 20–21 August 2019, Geneva | Second meeting of the 2019 GGE on LAWS
- 25–29 March 2019, Geneva | First meeting of the 2019 GGE on LAWS
- 27–31 August 2018, Geneva | Second meeting of the 2018 GGE on LAWS
- 9–13 April 2018, Geneva | First meeting of the 2018 GGE on LAWS
- 13–17 November 2017, Geneva | Meeting of the 2017 GGE on LAWS
Additional resources
- Searching for meaningful human control: The April 2018 meeting on lethal autonomous weapons systems, by Barbara Rosen Jacobson, DiploFoundation (April 2018)
- Lethal Autonomous Weapons Systems: Mapping the GGE debate, by Barbara Rosen Jacobson, DiploFoundation (November 2017)
- Defending the boundary: Constraints and requirements on the use of autonomous weapon systems under international humanitarian and human rights law, by the Geneva Academy of International Humanitarian Law and Human Rights (May 2017)
- Artificial intelligence: Lethal autonomous weapons systems and peace time threats, by Regina Surber, ICT4Peace Foundation