Debating “Killer Robots” at the United Nations

The first United Nations Convention on Certain Conventional Weapons meeting to discuss LAWS on May 14, 2014. Credit: United Nations

By Dan Gettinger

Today, April 13, experts and delegates from around the world are gathering in Geneva, Switzerland for a discussion on Lethal Autonomous Weapons Systems (LAWS). The Meeting of Experts is organized under the United Nations Convention on Certain Conventional Weapons (CCW). Over the next five days, the representatives will attempt to work through some of the technical, legal, military, sociological, and ethical issues posed by the development of “killer robots.” At stake is a proposed preemptive ban on the development, production, and use of these weapons.

The call to ban “killer robots” is gaining traction among human rights lawyers and activists. On April 9, Human Rights Watch and Harvard Law School’s International Human Rights Clinic published a report urging all nations to support a ban on LAWS given the “significant hurdles to assigning personal accountability for the actions of fully autonomous weapons.” Various nations are also pushing for the ban as a means of preventing LAWS from reaching the battlefield. Since these highly sophisticated autonomous weapons have yet to be invented, a substantial portion of the deliberations at this meeting will be devoted to understanding and defining LAWS.

The results of this meeting will be assembled and presented in a report, authored by German Ambassador Michael Biontino, the chair of this Meeting of Experts, that will set the framework for future discussions on LAWS. Here’s what you need to know about the CCW Meeting of Experts on LAWS:



  • Article 36 of Additional Protocol I (1977) to the 1949 Geneva Conventions requires that a nation determine whether the employment of a new weapon would be prohibited by the Protocol or by any other rule of international law applicable to that nation. Article 36 falls under Part III of Additional Protocol I, which addresses methods and means of warfare. It follows Article 35, which bans weapons that “cause superfluous injury or unnecessary suffering.”
  • Directive 3000.09 lays out the U.S. Department of Defense’s policy on autonomous weapons. The directive requires that autonomous systems be designed so that commanders and operators can “exercise appropriate levels of human judgment over the use of force.” In other words, for an autonomous system to take lethal or kinetic action, a human must be in the loop. The directive allows the military to develop and use non-lethal autonomous systems, and it permits senior Pentagon officials to waive the restriction on lethal autonomous systems. Directive 3000.09 was signed on November 21, 2012 by Ash Carter, now Secretary of Defense, when he was serving as Deputy Secretary of Defense.


What Will be Discussed

  • On Monday and Tuesday, the delegates and experts at the conference will consider the technical issues raised by lethal autonomous weapons systems. A large portion of this discussion can be expected to tackle the roles and complexities of autonomous capabilities in technology. In his summary of the 2014 Meeting of Experts on LAWS, French Ambassador Jean-Hugues Simon-Michel noted that although autonomous capabilities had yet to be achieved, many “research activities were ongoing in this area.” In spite of the absence of a universal conception or definition of autonomy, Simon-Michel writes, it is possible to identify common traits of autonomous capabilities in LAWS, like the ability to select and engage targets. Read on: Presentation “On the Concept of Autonomy” at the 2014 Meeting of Experts by Dr. Raja Chatila // Centre National de la Recherche Scientifique.
  • On Tuesday and Wednesday, the conversation will turn to the characteristics of LAWS. Some of the issues to be discussed may include the level of meaningful human control over LAWS, the functional role of autonomy, the fact that these technologies can be used for both civilian and military applications, and the critical tasks and responsibilities that LAWS might assume on the battlefield. In a primer for the meeting, Ambassador Michael Biontino identified some of the key questions that might guide this discussion: Does the context and environment in which a weapon system is being used affect its being categorised as an autonomous system? How does the issue of dual-use technology impede differentiating between civilian and military applications? The issue of meaningful human control is a particularly salient one for the Holy See and German delegations; in his opening remarks to the 2014 Meeting of Experts, the German ambassador noted that “[F]or Germany, this principle of human control is the foundation of the entire international humanitarian law.” Read on: “Weapons, Technology, and Human Control” – Panel discussion at the United Nations Institute for Disarmament Research
  • The final in-depth discussion, which will take place on Wednesday and Thursday, will tackle the challenges that LAWS might pose to the law, specifically to International Humanitarian Law (IHL). This discussion will focus primarily on issues such as the accountability chain and the principles of distinction and proportionality. Other possible issues are how the level of autonomy in a machine might change the rules of engagement, and how the international community can implement legal reviews of LAWS. “The discussions focused on whether one could establish responsibility for violations of international law and whether such cases incurred the responsibility of subordinates, programmers or manufacturers,” writes French Ambassador Jean-Hugues Simon-Michel in his summary of the 2014 Meeting of Experts.

The Lockheed Martin Long Range Anti-Ship Missile (LRASM) is being developed jointly with the Defense Advanced Research Projects Agency. In this promotional video, Lockheed Martin shows how the LRASM will have autonomous targeting capabilities.

National Positions

At the outset of the Meeting of Experts, the representatives of each nation will submit opening statements. In some cases, a delegation will take this opportunity to identify the issues that are particularly important to its nation; in other cases, the statements are more perfunctory. At the 2014 meeting, most delegations did not outright endorse a ban on LAWS, though a number expressed serious reservations about the concept of autonomous weapons. A few nations, among them Cuba, Egypt, Pakistan, Ecuador, and the Vatican, explicitly called for immediate international action to preemptively restrict the development and use of LAWS.

Although not all the countries have published statements in anticipation of the 2015 Meeting of Experts, we have included statements from last year that offer an idea of where some delegations stand on certain issues.



  • 2014 Statement: “The fundamental question here seems to be: how much of a role for robots in human society are we ready to accept, taking as a measure the security and well-being of humankind as a whole?”
  • 2014 Statement: “While in the case of a war crime perpetrated by a human actor legal responsibility can be, at least in principle, established, it is fundamentally unclear how to resolve the issue once the autonomous decision of a machine is at the root of the crime.”




  • 2015 Statement: “Cuba supports the adoption of a legally binding international instrument which ordains a total prohibition of lethal autonomous weapons, especially those used against people.”
  • 2015 Statement: “The high cost of the technology required for lethal autonomous weapons can only be undertaken by developed countries, which further increases the asymmetry between rich and poor countries.”


Czech Republic


  • 2014 Statement (in Spanish): “The states should take actions to prevent creation and development, and block investments in the field of completely autonomous weapons systems, through national rules and laws, as well as an international protocol that prohibits the creation, development, and use of these systems.”


  • 2014 Statement: “[W]e support calls to pose a moratorium on the development of such lethal technology in order to allow serious and meaningful international engagement with this issue.”


  • 2014 Statement (in French): “Finally, we must keep in mind that the technologies in question are dual-use, capable of numerous civilian, peaceful, legitimate, and useful applications. In any case, research in this area should not be limited.”


  • 2014 Statement: “For Germany, this principle of human control is the foundation of the entire international humanitarian law. It is based on the right to life, on the one hand, and on the right to dignity, on the other. Even in times of war, human beings cannot be made simple objects of machine action.”

Holy See

  • 2014 Statement: “For the Holy See the fundamental question is the following: Can machines—well-programmed with highly sophisticated algorithms to make decisions on the battlefield that seek to comply with IHL—truly replace humans in decisions over life and death?”
  • 2014 Statement: “Decisions over life and death inherently call for human qualities, such as compassion and insight, to be present. While imperfect human beings may not perfectly apply such qualities in the heat of war, these qualities are neither replaceable nor programmable.”
  • 2014 Statement: “History shows that developments in weapons technology, from crossbows to drones, give the inventing side a temporary military advantage. The inevitable widespread proliferation of these weapons systems will fundamentally alter the nature of warfare for the whole human family.”


  • 2014 Statement: “From India’s point of view, we would like the CCW process to emerge strengthened from these discussions, resulting in increased systematic controls on international armed conflict embedded in international law in a manner that does not further widen the technology gap amongst states or encourage the use of lethal force to settle international disputes just because it affords the prospects of lesser casualties to one side, or that its use can be insulated from the dictates of public conscience.”
  • 2014 Statement: “Overall, the consideration of this issue is a test case of whether the CCW can respond meaningfully to evolving new technology as applicable to armed conflict in this century.”


  • 2014 Statement: “Although outside the scope of the CCW, the potential use and abuse of autonomous weapons beyond the battlefield, in law enforcement for instance, is also deserving of consideration.”



  • 2015 Statement: “Japan is of the view that LAWS should be discussed with a focus on various aspects of technology, ethics, law, and military affairs, and that it is not appropriate to draw any conclusions from only one of these aspects.”
  • 2015 Statement: “Even though LAWS refer to weapons not robotics in civil use, it is not easy to decide where to draw the line between technical components in military use and those in civilian use.”


  • 2014 Statement (in French): “In this regard, Mali, although it does not yet have a policy on LAWS, would like, here and now, to reaffirm its firm determination to remain mobilized on the issue.”


  • 2014 Statement (in Spanish): “We believe in the need to observe the development of new weapons technologies in the context of respecting the human right to life, and we express our concern at the prospect of lethal autonomous weapons systems that have the power to arbitrarily decide on the life and death of human beings.”

New Zealand


  • 2014 Statement: “First of all, let me underline that there are already a number of weapons systems already in use that are highly automatic but which operate within such tightly constrained spatial and temporal limits that meaningful human control is ensured.”
  • 2014 Statement: “At this stage, our main concern with the possible development of fully autonomous weapons systems is whether such weapons could be programmed to operate within the limitations set by international law. Including in particular with regard to the fundamental rules on distinction and proportionality.”


  • 2014 and 2015 Statement: “LAWS are by nature unethical, because there is no longer a human in the loop and the power to make life and death decisions are delegated to machines which inherently lack compassion, morality and intuition. This will likely make war inhumane.”
  • 2014 and 2015 Statement: “LAWS could easily be used in anonymous and clandestine operations as well as for targeted killings including in the territory of other states as is being witnessed in the use of armed drones. Like drones, civilians could be targeted and killed with LAWS through so-called signature strikes. The breaches of state sovereignty – in addition to breaches of International Humanitarian Law and International Human Rights Law – associated with targeted killing programmes risk making the world and the protection of life less secure with LAWS in the equation.”
  • 2014 and 2015 Statement: “The use of LAWS on the battlefield would amount to a situation of one-sided killing. Besides depriving the combatants of the targeted state the protection offered to them by the international law of armed conflict, LAWS would also risk the lives of civilians and non-combatants on both sides.”
  • 2014 and 2015 Statement: “We should not let the blind quest for the ultimate weapon, driven by commercial interests of the military-industrial complex, get the better of us.”

Republic of Korea

  • 2014 Statement: “In the military area, we are considering the utilization of robot technology to better protect soldiers exposed to serious risks. For instance, robots would be used to clear mines, remove improvised explosive devices (IEDs), and detect [Chemical, biological, radiological and nuclear weapons].”
  • 2014 Statement: “Korea is working to enact an ethics charter on the commercial service of robotic technology in accordance with the National Plan on Intelligent Robots. I expect the charter to contain provisions on ethical values and the code of conduct regarding the development, manufacture and use of robots.”

South Africa

  • 2014 Statement: “The lack of human intervention in “autonomous” weapons raises serious concerns and even those systems that are reportedly “semiautonomous,” do not necessary provide for sufficient time for humans to make an intervention.”


  • 2014 Statement (in Spanish): “There are those who point out that the behavior of these systems could be more predictable from a humanitarian standpoint, as they wouldn’t suffer from certain emotions that can exacerbate the actions of a combatant in the field, such as anger, hate, and panic. But this same reasoning–the exclusion of other emotions such as pity, solidarity, and empathy–could lend itself to the opposite conclusion.”


  • 2014 Statement: “We do not see any such systems yet in existence, nor can we foresee a situation in the near future in which a weapon system would be operated without any human control or oversight. However, as states we have an obligation to assess the legality of new weapons, and we therefore welcome this discussion.”


  • 2014 Statement: “One key question to also start addressing is whether a machine without intuition would ever be able to acquire a sufficient degree of “situational awareness” and could assess risks or conduct the qualitative assessments required by International Humanitarian Law. Other key questions will probably relate to the civilian use of these technologies and the potential dual-use applications.”

United Kingdom

United States

  • 2014 Statement: “Too often, the phrase “lethal autonomous weapons system” appears still to evoke the idea of a humanoid machine independently selecting targets for engagement and operating in a dynamic and complex environment. But that is a far cry from what we should be focusing on, which is the likely trend of technological development, not images in popular culture.”
  • 2014 Statement: “As we begin our discussion here, though, we must be clear on one point — we are here to discuss future weapons, or, in the words of the mandate of this meeting, “emerging technologies.” Therefore we need to be clear, in these discussions we are not referring to remotely piloted aircraft, which as their name indicates are not autonomous and therefore conceptually distinct from LAWS.”
  • 2014 Statement: “In order to assess the risk associated with any weapons system, states need a robust domestic legal and policy process and methodology. We think states may need to tailor those legal and policy processes when considering weapons with autonomous features. For that reason, as you know, after a comprehensive policy review, the United States Department of Defense issued DoD Directive 3000.09, ‘Autonomy in Weapons Systems,’ in 2012.”

Statements by Organizations at the 2014 Meeting of Experts

This post is part of a series of resources on the United Nations conference on lethal autonomous weapons systems. For a primer on the key issues that will be discussed in Geneva, click here. For our Multimedia Portal on technology and war, click here.


For updates, news, and commentary, follow us on Twitter.

