Prohibition of Lethal Autonomous Weapons

The importance of prohibiting lethal autonomous weapons

Weapon systems that select and engage targets without substantial human control or oversight are unacceptable and should be prevented. AI should not be developed or used to create lethal autonomous weapons systems — that is, weapons that can select and engage targets without human intervention.

All states have a responsibility to protect humanity from the dangers of lethal autonomous weapons by fully prohibiting them. It is essential to retain meaningful human control over the use of force, which is at once a legal necessity, an ethical imperative, and a moral obligation.

Lethal autonomous weapons, which can operate without human intervention, raise serious ethical concerns and pose a threat to human life. Critics argue that such systems also raise concerns about reliability and predictability.

For instance, Human Rights Watch is skeptical that fully autonomous weapons can meet the standards of international humanitarian law, including the principles of proportionality and distinction. This skepticism rests in part on international law that prohibits an autonomous weapon system if it behaves unpredictably or if its intended performance is unreliable. Existing law also forbids the use of autonomous weapon systems designed to exterminate a group of people based on their race, nationality, or any other grounds.

There are ongoing efforts to prohibit the development and use of such weapons. Private firms, ordinary citizens, legislators, policymakers, and domestic and international organizations have all supported the call to ban fully autonomous weapons.

Moreover, since 2018, U.N. Secretary-General António Guterres has repeatedly urged states to ban weapon systems that can target and attack people without human oversight, calling them “morally repugnant and politically unacceptable.”

Various countries, including those pursuing these military and technological developments, acknowledge that regulation of autonomous weapon systems should rest on a two-pronged approach: prohibiting systems that do not permit adequate human control, and mandating human control for systems that are not prohibited.

Examples of harmful outcomes resulting from lethal autonomous weapons

Lavender: An AI targeting system used by the Israeli army. The system assigns people in Gaza ratings reflecting their suspected affiliation with Palestinian armed groups in order to label them as military targets, and places those with higher ratings on a kill list. According to +972 Magazine, during the first weeks of the war in Gaza, which started on October 7, 2023, Lavender marked approximately 37,000 Palestinians as suspected militants for possible air strikes. Although the system has a reported 10% margin of error, the army would spend only about 20 seconds reviewing each target before authorizing a bombing. The Israeli army also relies on two other AI systems to track targets: “Where’s Daddy?” and “The Gospel.”

“Where’s Daddy?” tracks targeted individuals so that bombings can be carried out once they have entered their family residences. “The Gospel” generates lists of buildings and other structural targets to be attacked.

Recommendations for prohibiting lethal autonomous weapons

Embed Human Control in Autonomous Weapons: Meaningful human control should be a requirement for these weapons. This means the human operator must be able to ensure ongoing legal compliance and must retain moral agency over the use of force.

Role of States: States should adopt new legally binding rules regulating autonomous weapon systems to ensure that adequate human control and judgment are retained in the use of force. This will require prohibiting certain forms of autonomous weapon systems and strictly regulating others. As numerous civil society organizations and states have argued over the last decade, an urgent and effective international response is needed to address the serious risks these systems pose.

Design and Use: The design and use of autonomous weapon systems that are not prohibited should be regulated, including through limits on the types of targets they may engage.

International Law: International law must continue to evolve to maintain and strengthen protections in the face of changing military practice and technology. High Contracting Parties should take immediate action to adopt new rules.