Are AI-DSS a ‘means or method of warfare’ requiring legal review?

A confronting 2023 +972 Magazine article brought Artificial Intelligence (AI) Decision Support Systems (DSS) into stark relief by describing the Israel Defence Forces' (IDF) use of AI DSS to inform human targeting decisions against Hamas in the ongoing conflict in Gaza. The IDF's Lavender AI DSS allegedly processed data on thousands of suspected Hamas individuals, producing strike recommendations that were reviewed by humans before strikes were carried out, with a reported error rate of around 10%.

The IDF's use of AI DSS to inform human decision making on an industrial scale has been the subject of intense scrutiny (see here and here) and is occurring against the backdrop of the increased development of AI DSS by the defence industry.

An AI DSS designed to inform targeting decisions engages fundamental rules of international humanitarian law (IHL), including distinction, proportionality and precautions in attack. While humans ultimately make the final decisions, and so are legally responsible for them, there are important questions as to whether a human decision maker can rely on the recommendations provided by an AI DSS and make reasonable decisions in compressed timeframes.

Article 36 of Additional Protocol I to the Geneva Conventions of 1949 states:

‘In the study, development, acquisition or adoption of a new weapon, means or method of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law’.

This blog considers whether an AI DSS may be classified as a ‘means or method of warfare’ triggering a legal review requirement for States party to Additional Protocol I.

Article 36 creates an express legal obligation to determine the legality of a 'new weapon, means or method of warfare'; however, it does not define this phrase or require States to adopt a particular legal review process. Accordingly, States must determine their own national definition of 'new weapon, means or method of warfare' through internal directives or policies. The importance of this cannot be overstated, as the definition determines what a State considers to be the lawful subject of the legal review obligation.

Most national definitions focus on conventional weapons that are designed to harm or kill combatants, or to damage or destroy military objectives. There is less clarity as to whether an AI DSS, in isolation from a weapon system, constitutes a means or method of warfare.

No State has publicly disclosed its view as to whether an AI DSS is a 'means or method of warfare'. However, there is an increasingly consistent view among scholars (see here, here and here) and the ICRC that AI DSS that inform targeting decisions should be the subject of legal reviews.

Article 36 Legal supports the view that an AI DSS designed to inform targeting decisions should be the subject of a legal review, both because it may be characterised as a 'means of warfare' for the purposes of Article 36 and because the AI performs functions that are regulated by IHL rules. Accordingly, a State is required to ensure that the system can perform its function in a way that complies with the State's IHL and other international law obligations.

Having arrived at this position, the question is how to conduct a legal review of an AI DSS. There is uncertainty as to whether the traditional legal review process (explained here) is sufficient to determine the legality of AI DSS and autonomous weapon systems (AWS). There are practical, technical and policy issues that a reviewing State will need to consider in establishing a national legal review process that addresses AI DSS and AWS. In my view, the legal review process needs to be broadened, both to address the unique characteristics of AI DSS and AWS (including bias, accuracy, legal responsibility, reliability and predictability) and to ensure that such systems are designed to meet IHL requirements not only during acquisition but from design and throughout their lifecycle.
