The leading military powers contend that they will maintain effective control over the LAWS they deploy.f But even if we accept their sincerity, this totally misses the point. They have no means of ensuring that other states and non-state actors will follow suit.
More is at stake in these definitional debates than whether to preemptively ban LAWS. Consider a Boston Dynamics BigDog loaded with explosives and directed via GPS to a specific location, where it is programmed to explode. Unfortunately, during the time it takes to travel to that location, the site is transformed from a military outpost into a makeshift hospital for injured civilians. A strong definition of meaningful human control would require that the location be given a last-minute inspection before the explosives could detonate. BigDog, in this example, is a dumb LAW, which we should perhaps fear as much as speculative future systems with advanced intelligence. Dumb LAWS, however, do open up comparisons to widely deployed existing weapon systems, such as cruise missiles, whose impact on an intended target cannot be altered by military leaders once the missile has been launched. In other words, banning dumb LAWS quickly converges with other arms control campaigns, such as those directed at limiting cruise missiles and ballistic missiles.5 States will demand a definition of LAWS that distinguishes them from existing weapon systems.
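To make concrete what such a last-minute check would amount to, the following is a minimal sketch, in Python, of an authorization gate under a strong reading of meaningful human control: lethal action is permitted only on a fresh human go-ahead tied to a recent inspection of the target site. Every name and the freshness window here are hypothetical illustrations, not drawn from any deployed system.

from dataclasses import dataclass
from typing import Optional
import time

# Assumed freshness bound for a human go-ahead (hypothetical value).
AUTHORIZATION_WINDOW_SECONDS = 60.0

@dataclass
class HumanAuthorization:
    operator_id: str           # designated military operator who authorized
    target_verified_at: float  # when the operator last inspected the target
    granted_at: float          # when the go-ahead was issued

def may_detonate(auth: Optional[HumanAuthorization],
                 now: Optional[float] = None) -> bool:
    """Permit lethal action only on a fresh, real-time human go-ahead.

    A kill order delegated in advance (or no authorization at all) fails
    the check: the operator must have re-inspected the target within the
    freshness window.
    """
    if auth is None:
        return False  # no human in the loop: never detonate autonomously
    now = time.time() if now is None else now
    fresh = (now - auth.granted_at) <= AUTHORIZATION_WINDOW_SECONDS
    verified = (now - auth.target_verified_at) <= AUTHORIZATION_WINDOW_SECONDS
    return fresh and verified

# A kill order issued two hours before arrival is stale, so the gate refuses.
stale = HumanAuthorization("op-1", target_verified_at=0.0, granted_at=0.0)
assert may_detonate(stale, now=7200.0) is False

Under this reading, BigDog arriving at a site that has become a hospital would simply never detonate, because no operator re-verified the target in the final minute.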
Delegates at the CCW are cognizant that in the 1990s they failed to ban the dumbest, most indiscriminate autonomous weapons of all: anti-personnel mines. Nevertheless, anti-personnel landmines were eventually banned through an independent process that led to the Mine Ban Treaty, also known as the Ottawa Treaty; 162 countries have committed to fully comply with that treaty.g
A second failure to pass restrictions on the use of a weapon system whose ban has garnered popular support might damage the whole CCW approach to arms control. This knowledge offers the supporters of a ban a degree of leverage, presuming that: the ban truly has broad and effective public support; LAWS can be distinguished from existing weaponry that is widely deployed; and creative means can be forged to develop the framework for an agreement.
A 10-Point Plan
Many of the barriers to fitting a ban on LAWS into traditional approaches to arms control can be overcome by adopting the following approach.
1. Rather than focus on establishing a bright line or clear definition for lethal autonomy, first establish a higher-order moral principle that can garner broad support. My candidate for that principle is: Machines, even semi-intelligent machines, should not be making life and death decisions. Only moral agents should make life and death decisions about humans. Arguably, something like this principle is already implicit, but not explicit, in existing international humanitarian law, also known as the laws of armed conflict (LOAC).3 A higher-order moral principle makes explicit what is off limits, while leaving open the discussion of marginal cases in which a weapon system may or may not be considered to be making life and death decisions.
2. Insist that meaningful human control over a life and death decision requires real-time authorization from designated military personnel for a LAW to kill a combatant or destroy a target that might harbor combatants and non-combatants alike. In other words, it is not sufficient for military personnel merely to delegate a kill order in advance to an autonomous weapon, or merely to be “on the loop”h of systems that can act without a real-time go-ahead.
3. Petition leaders of states to declare that LAWS violate existing IHL. In the U.S., this would entail a Presidential Order to that effect.i,14
4. Review marginal or ambiguous cases to set guidelines for when a weapon system is truly autonomous and when its actions are clearly the extension of a military commander’s will and intention. Recognize that any definition of autonomy will leave some cases ambiguous.
5. Underscore that some present and future weapon systems will occasionally act unpredictably, and that most LAWS will be difficult, if not impossible, to test adequately.
6. Present compelling cases for banning at least some, if not all, LAWS. In other words, highlight situations in which nearly all parties will support a ban. For example, no nation should want LAWS that can launch nuclear warheads.
7. Accommodate the fact that there will be necessary exceptions to any ban. For example, defensive autonomous weapons that target unmanned incoming missiles are already widely deployed.j These include the U.S. Aegis Ballistic Missile Defense System and Israel’s Iron Dome.
8. Recognize that future technological advances may justify additional exceptions.
f See, for example, U.S. Department of Defense Directive 3000.09, entitled “Autonomy in Weapon Systems.” The Directive is dated November 21, 2012, and is signed by Deputy Secretary of Defense Ashton B. Carter, who was nominated as Secretary of Defense by President Obama on December 5, 2014; http://bit.ly/1myJikF
h “On the loop” is a term that first appeared in the “United States Air Force Unmanned Aircraft Systems Flight Plan 2009–2047.” The plan states: Increasingly humans will no longer be “in the loop” but rather “on the loop”—monitoring the execution of certain decisions. Simultaneously, advances in AI will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.
i Wallach, W. Establishing limits on autonomous weapons capable of initiating lethal force. Unpublished but widely circulated proposal, 2012.
j In practice, a weapon designed for defensive purposes might be used offensively, so the distinction between the two should emphasize the use of defensive weaponry to target unmanned incoming missiles.