International Humanitarian Law


Law Review Articles 

Consent Is Not Enough: Respecting the Intensity Threshold in Transnational Conflict

165 U. Pa. L. Rev. 1 (2016) (with Oona A. Hathaway, Daniel Hessel, Julia Shu, and Sarah Weiner)

It is widely accepted that a state cannot treat a conflict with an organized non-state actor as an armed conflict until the violence crosses a minimum threshold of intensity. But can a host state consent to another state's use of force that the host state itself could not lawfully employ? And can an intervening state always presume that host state consent is valid?

This article argues that host state consent is limited and that intervening states cannot treat consent as a blank check. Accordingly, even in consent-based interventions, the logic and foundational norms of the international legal order require both the consent-giving and consent-receiving states to independently evaluate which legal regime governs—which will often turn on whether the intensity threshold has been met. If a non-international armed conflict exists, the intervening state may act pursuant to international humanitarian law; if not, its lawful actions are limited by its own and the host state’s human rights obligations.

War Torts: Accountability for Autonomous Weapons

164 U. Pa. L. Rev. 1347 (2016)

Unlike conventional weapons or remotely operated drones, autonomous weapon systems can independently select and engage targets. As a result, they may take actions that look like war crimes without any individual acting intentionally or recklessly. Absent such willful action, no one can be held criminally liable under existing international law.

Criminal law aims to prohibit certain actions, and individual criminal liability allows for the evaluation of whether someone is guilty of a moral wrong. Given that a successful ban on autonomous weapon systems is unlikely (and possibly even detrimental), what is needed is a complementary legal regime that holds states accountable for the injurious wrongs that are the side effects of employing these uniquely effective but inherently unpredictable and dangerous weapons. Just as the Industrial Revolution fostered the development of modern tort law, autonomous weapon systems highlight the need for “war torts”: serious violations of international humanitarian law that give rise to state responsibility.

The Killer Robots Are Here: Legal and Policy Implications

36 Cardozo L. Rev. 1837 (2015)

Notwithstanding increasing state interest in the issue, no one has yet put forward a coherent legal definition of autonomy in weapon systems, resulting in a confusing conflation of legal, ethical, policy, and political arguments. This article proposes that an “autonomous weapon system” be defined as “a weapon system that, based on conclusions derived from gathered information and preprogrammed constraints, is capable of independently selecting and engaging targets.” Contrary to the general consensus, such systems are not weapons of the future: they exist and are in use today.

This fact has two main implications: it undermines almost all legal arguments for a ban, as they are based on the false assumption that such weaponry could never be lawfully employed; and it significantly reduces the likelihood that a successful ban will be negotiated, as states will be reluctant to voluntarily relinquish otherwise lawful and uniquely effective weaponry. Accordingly, this article concludes with a discussion of how best to create successful international regulations.

Which Law Governs During Armed Conflict? The Relationship Between International Humanitarian Law and Human Rights Law

96 Minn. L. Rev. 1883 (2012) (with Oona A. Hathaway, Philip Levitz, Haley Nix, William Perdue, Chelsea Purvis, and Julia Spiegel)

The Law of Cyber-Attack

100 Calif. L. Rev. 817 (2012) (with Oona A. Hathaway, Philip Levitz, Haley Nix, Aileen Nowlan, William Perdue, and Julia Spiegel)

This article clarifies the definition of cyber-attack and describes how such attacks are currently regulated under the law of war, international treaties, and domestic criminal law. Concluding that existing legal regimes address only a small fraction of potential challenges raised by cyber-attacks, this article proposes how international and domestic law might be adapted or created to more effectively regulate them.


Other Academic Writing

Autonomous Weapon Systems and the Limits of Analogy

in The Ethics of Autonomous Weapon Systems (Claire Finkelstein, Duncan MacIntosh & Jens David Ohlin, eds.) (forthcoming 2017)

Most imagine autonomous weapon systems either as more independent versions of weapons already in use or as some kind of robotic soldier. But every analogy is false in some way, and all potential analogies for autonomous weapon systems—weapon, combatant, child soldier, animal—misrepresent crucial traits and limit our understanding of the technology, thereby impeding our ability to properly regulate it. As is often the case when law by analogy fails, what is needed is new law—at the very least, new regulations for autonomous weapon systems, but perhaps also a new legal regime for all unconventional warfighters.

A Meaningful Floor for "Meaningful Human Control"

30 Temple Int’l & Comp. L.J. 53 (2016) (invited workshop contribution)

The broad support for “meaningful human control” of autonomous weapon systems comes at a familiar legislative cost: there is no consensus as to what this principle requires. This paper describes attempts to define the concept; discusses the benefits of retaining imprecision in a standard intended to regulate new technology through international consensus; and argues for an interpretive floor grounded in existing humanitarian protections.

War, Responsibility, and Killer Robots

40 N.C. J. Int’l L. 909 (2015) (invited symposium contribution)

Although many are concerned that autonomous weapon systems may make war “too easy,” no one has discussed how their use may affect the constitutional war power. When conventional weaponry required boots on the ground, popular outrage at the loss of American lives incentivized Congress to check presidential warmongering. But as human troops are augmented and supplanted by robotic ones, it will be politically easier to justify using force, especially for short-term military engagements. Like drones and cyber operations, autonomous weapon systems will contribute to the growing concentration of the war power in the hands of the Executive, with implications for the doctrine of humanitarian intervention. 

The Varied Law of Autonomous Weapon Systems

in NATO Allied Command Transformation, Autonomous Systems: Issues for Defence Policy Makers (Andrew P. Williams & Paul D. Scharre, eds., 2015)

What law governs autonomous weapon systems? Those who have addressed this subject tend to focus on the law of armed conflict. But the international laws applicable to the development or use of autonomous weapon systems are hardly limited to rules regarding the conduct of hostilities. Other legal regimes—including international human rights law, the law of the sea, space law, and the law of state responsibility—may also be relevant to how states may lawfully create or employ autonomous weapon systems, resulting in a complex and evolving web of international governance.

International Law and Targeted Killings in Pakistan

White Paper submitted to the Office of the Legal Advisor, U.S. Dept. of State (2010) (with Brian Finucane and Oona A. Hathaway)

This report evaluates the lawfulness, under international law, of U.S. use of unmanned aerial vehicles to target non-state actors in Pakistan.


The Law of Consent-Based Interventions

Just Security (Oct. 13, 2016) (with Sarah Weiner)

Autonomous Weapon Systems and Proportionality

Völkerrechtsblog (Apr. 15, 2015)