Defending the boundary: Constraints and requirements on the use of autonomous weapon systems under international humanitarian and human rights law
June 2018
Policy Reports
Summary
The debate over autonomous weapon systems (AWS) raises ethical, humanitarian, legal, and security concerns, because these systems can detect, select, and attack targets without human intervention. Advances in artificial intelligence may make the deployment of AWS feasible within a few years, potentially transforming the conduct of warfare. A small number of states are developing AWS in order to react faster to threats, process large volumes of data, and better protect their own forces. At the same time, concerns persist over how to maintain meaningful human control so as to ensure legal compliance and avoid harmful consequences. International law does not currently prohibit AWS as such, but states agree that any use of AWS must comply with international law, in particular the Geneva Conventions. Opinions differ, however, on the legality of AWS and on the need for binding international regulation. Some advocate a ban on AWS, arguing that such systems cannot satisfy the requirements of international humanitarian law (IHL); others propose non-binding measures or maintain that existing law is sufficient. The issue remains contentious, with calls for urgent international action to address the implications of AWS.