Autonomous Weapon Systems and the Limits of Analogy

in The Ethics of Autonomous Weapon Systems (Claire Finkelstein, Duncan MacIntosh & Jens David Ohlin, eds., forthcoming 2017)

Most people imagine autonomous weapon systems either as more independent versions of weapons already in use or as some kind of robotic soldier. But every analogy is false in some way, and all potential analogies for autonomous weapon systems—weapon, combatant, child soldier, animal—misrepresent crucial traits and limit our understanding of the technology, thereby impeding our ability to regulate it properly. As is often the case when law by analogy fails, what is needed is new law—at the very least, new regulations for autonomous weapon systems, but perhaps also a new legal regime for all unconventional warfighters.


A Meaningful Floor for "Meaningful Human Control"

30 Temple Int’l & Comp. L.J. 53 (2016) (invited workshop contribution)

The broad support for “meaningful human control” of autonomous weapon systems comes at a familiar legislative cost: there is no consensus as to what the principle requires. This paper describes attempts to define the concept; discusses the benefits of retaining imprecision in a standard intended to regulate new technology through international consensus; and argues for an interpretative floor grounded in existing humanitarian protections.


War, Responsibility, and Killer Robots

40 N.C. J. Int’l L. 909 (2015) (invited symposium contribution)

Although many are concerned that autonomous weapon systems may make war “too easy,” no one has discussed how their use may affect the constitutional war power. When conventional weaponry required boots on the ground, popular outrage at the loss of American lives incentivized Congress to check presidential warmongering. But as human troops are augmented and supplanted by robotic ones, it will become politically easier to justify using force, especially in short-term military engagements. Like drones and cyber operations, autonomous weapon systems will contribute to the growing concentration of the war power in the hands of the Executive, with implications for the doctrine of humanitarian intervention.


The Varied Law of Autonomous Weapon Systems

in NATO Allied Command Transformation, Autonomous Systems: Issues for Defence Policy Makers (Andrew P. Williams & Paul D. Scharre, eds., 2015)

What law governs autonomous weapon systems? Those who have addressed this subject tend to focus on the law of armed conflict. But the international laws applicable to the development or use of autonomous weapon systems are hardly limited to rules regarding the conduct of hostilities. Other legal regimes—including international human rights law, the law of the sea, space law, and the law of state responsibility—also bear on how states may lawfully create or employ autonomous weapon systems, resulting in a complex and evolving web of international governance.


Judicious Influence: Non-Self-Executing Treaties and the Charming Betsy Canon

Note, 120 Yale L.J. 1784 (2011) (cited in Brownlie's Principles of Public International Law 79-80 (James Crawford, ed., 8th ed. 2012))

Non-self-executing treaties are commonly, and inappropriately, dismissed as irrelevant in domestic law. This Note examines how judges employ the Charming Betsy canon to interpret ambiguous statutes to accord with U.S. international obligations, including those expressed in non-self-executing treaties. The Note concludes that this practice is justified and beneficial.


International Law and Targeted Killings in Pakistan

White Paper submitted to the Office of the Legal Adviser, U.S. Dept. of State (2010) (with Brian Finucane and Oona A. Hathaway)

This report evaluates the lawfulness, under international law, of the U.S. use of unmanned aerial vehicles to target non-state actors in Pakistan.