The human-machine relationship: Lessons from military doctrine and operations
CCW Group of Governmental Experts on Lethal Autonomous Weapons Systems – First 2018 Meeting
9 Apr 2018 11:00h - 14 Apr 2018 01:30h
On 10 April 2018 at the Convention on Certain Conventional Weapons (CCW) in Geneva, Switzerland, the session ‘The Human-Machine Relationship: Lessons from Military Doctrine and Operations’ was moderated by Ms Merel Ekelhof, PhD Researcher, VU University Amsterdam. The session aimed to increase transparency about targeting practices in military operations, and it discussed issues related to ‘sufficient’ human control, building on existing doctrine and knowledge acquired from military operations. Panellists began by noting that the views expressed were their own and did not reflect the positions of their governments. Ekelhof then outlined NATO’s six-phase deliberate targeting cycle, explaining that targeting is formalised in doctrine as guidance for military operations.
Lt Col Matthew King, chief, Air and Space Law, Headquarters Air Force Operations and International Law Directorate, discussed legal considerations in the law of war. The principles of military necessity, distinction, proportionality, and precautions in attack help determine whether military action may be taken. King stated that the rules of engagement encompass two broad facets of target prosecution: validation and execution. This involves considering whether there is a valid target and appropriate authority; collateral damage is also estimated, and other feasible precautions are examined.
Dr Larry Lewis, director, Center for Autonomy and AI, CNA, articulated the importance of assessing risk to civilians in using lethal autonomous weapon systems (LAWS). He emphasised the notion of ‘meaningful human control’ and noted that broader human involvement throughout the process may be better than having humans decide only on final engagement. He cited autonomous systems as a viable means of reducing fratricide and civilian casualties. Observing that humans are fallible, he shared two examples of failed missions that resulted from poor judgement and led to unwanted deaths. Lewis discussed the Patriot missile, a system that detects ballistic missiles and prompts the operator to approve an automatic missile launch. In one incident, after the system incorrectly classified an oncoming aircraft as ‘hostile’, the operator approved the launch, killing the pilot. In the other, a Special Forces Predator drone falsely identified a vehicle filled with civilians as an ambush. Lewis pointed to the misreading of, and acting upon, autonomously gathered data as the cause of these fatal mistakes.
The speakers shed light on the difficulty of making such decisions by involving the audience in hypothetical war missions with LAWS. This was followed by a Q&A, where discussion continued on the balance between meaningful human control, human error, and reliance on autonomous systems.