By Kelley Sayler
From April 11 to 15, representatives from 95 states and numerous non-governmental organizations gathered in Geneva for the third Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) held under the UN Convention on Certain Conventional Weapons (CCW). As outlined by Chairman Michael Biontino, the meeting was intended to foster a shared understanding of the technical, ethical, and legal issues surrounding LAWS and to provide a forum for sharing national policies and weapons review processes. Rather than covering new ground, it largely served as a reprise of the CCW meetings held in 2014 and 2015.
While it was clear that states have made incremental progress in reaching a shared understanding of LAWS—for example, in contrast to last year’s discussion, no state appeared to be equating LAWS with drones—they did not establish a working definition of the systems being discussed. As a result, competing understandings of LAWS were in use throughout the week, with most states coalescing around one of two definitions, or some variation thereof. The first definition, advanced by the United Kingdom, focused on the complexity of the machine, such that the term “LAWS” would only apply to those systems capable of “understanding, interpreting, and applying higher level [commander] intent.” The second definition, which focused on the particular function(s) being automated, classified LAWS as systems that select and engage targets that have not been specifically authorized by a human. These differing understandings of LAWS, in turn, impacted states’ assessments of both existing and prospective autonomous weapons systems, though the vast majority of states held that LAWS do not currently exist—but may someday.
Regardless of their preferred definition of LAWS, states agreed on the importance of maintaining human involvement in decisions over the use of lethal force. As in past years, opinions regarding the appropriate degree of human involvement varied widely. For example, some states noted their support for the more restrictive “meaningful human control” concept developed by the advocacy community (which is seeking a ban on LAWS), while others preferred the less restrictive “appropriate levels of human judgment” standard established by U.S. Department of Defense Directive 3000.09 on autonomy in weapon systems, a standard that would presumably preserve a much greater degree of operational flexibility. Meanwhile, the U.K. promoted the concept of an “intelligent partnership,” in which humans would be involved in all targeting decisions under a framework of man-machine teaming. The 2016 discussion was thus productive in terms of solidifying high-level norms regarding the desirability of human involvement in the use of force. As a next step, states will seek to clarify the specific ways in which they intend to assess, measure, and ensure this involvement.
States also exhibited a growing interest in increasing the transparency of national review processes for evaluating the legality of new weapons systems. A number of countries, including Belgium, Sweden, and Israel, provided thorough accounts of their own review processes at the 2016 meeting and endorsed the development of best practices for weapons reviews and technical standards (the United States has previously shared details of its review process and, indeed, is the only nation with a formal, written policy on LAWS). As LAWS are likely to become a reality at some point in the future (by some accounts, these systems already exist), strengthening such measures will play a critical role in ensuring that future weapons systems are able to operate in compliance with international humanitarian law. This “compliance-based approach” to LAWS was viewed by many states as offering the most fruitful way forward for CCW discussions.
States concluded the 2016 Meeting of Experts on LAWS by agreeing to consider the creation of a Group of Governmental Experts (GGE), a more formal forum for discussions, at the December review conference. However, the move to a GGE, which must be approved by consensus, is far from assured, as at least one state has signaled its reluctance to participate in formal meetings. Overall, states demonstrated little urgency in addressing Chairman Biontino’s objective of establishing a working definition of LAWS during this year’s discussions, instead doubling down on disparate and, in some cases, irreconcilable national positions. States were similarly reluctant to define their own standards for “human involvement” in the use of force and made no attempt to adjudicate differences among the various national approaches. Given the consensus-based nature of the CCW, these differences hampered attempts to move forward on the Chairman’s agenda.
At present, only 14 states have registered their support for an outright ban or moratorium on LAWS, and none of those 14 will possess the technological ability to produce LAWS indigenously in the foreseeable future. Most states instead appear content to play wait-and-see, preserving their flexibility for the future by delaying any concrete decisions on LAWS. In 2017, we can expect more of the same.
Kelley Sayler (@kelleysayler) is an Associate Fellow at the Center for a New American Security (CNAS). She is a contributor to CNAS’ Ethical Autonomy Project.