for IHL has been a core strategy for
supporters of a ban.
Those among the more than 3,100 AI/Robotics researchers who signed the Autonomous Weapons: An Open Letter From AI & Robotics Researchersc are reflective of a broad consensus among citizens and even active military personnel who favor a preemptive ban.4
This consensus is partially attributable to speculative, futuristic, and
fictional scenarios. But perhaps even
science fiction represents a deep intuition that unleashing LAWS is not a
road humanity should tread.
Researchers who have waded into the
debate over banning LAWS have come
to appreciate the manner in which geopolitics, security concerns, the arcana
of arms control, and linguistic obfuscations can turn a relatively straightforward proposal into an extremely complicated proposition. A ban on LAWS
does not fit easily, or perhaps at all, into
traditional models for arms control.
If a ban, or even a moratorium, on the
development of LAWS is to progress, it
must be approached creatively.
I favor and have been a long-time
supporter of a ban. While a review of
the extensive debate as to whether
LAWS should be banned is well beyond the scope of this paper, I wish
to share a few creative proposals that
could move the campaign to ban LAWS
forward. Many of these proposals were
expressed during my testimony at the
CCW meeting in April and during a
side luncheon event.d Before introducing those proposals, let me first point
out some of the obstacles to fashioning
an arms control agreement for LAWS.
Why Banning LAWS Is Problematic
˲ Unlike most other weapons that
have been banned, some uses of LAWS
are perceived as morally acceptable, if
not morally obligatory. The simple fact
that LAWS can be substituted for and
thus save the lives of one’s own soldiers is the most obvious moral good.
Unfortunately, this same moral good
lowers the barriers to initiating new
wars. Some nations will be emboldened to start wars if they believe they
can achieve political objectives without
the loss of their troops.
˲ It is unclear whether armed military robots should be viewed as weapon systems or weapon platforms, a
distinction that has been central to
many traditional arms control treaties.
Range, payload, and other features are
commonly used in arms control agreements to restrict the capabilities of a
weapon system. A weapon platform
can be regulated by restricting where
it can be located. For example, agreements to restrict nuclear weapons will
specify the number of warheads and the
range of the missiles upon which they
are mounted, and even where the missiles can be stationed. With LAWS,
what is actually being banned?
˲ Arms control agreements often
focus on working out modes of verification and inspection regimes to determine whether adversaries are honoring the ban. The difference between
a lethal and non-lethal robotic system
may be little more than a few lines of
code or a switch, which would be difficult to detect and could be removed before or added after an inspection. Proposed verification regimes for LAWS6
would be extremely difficult and costly
to enforce. Military strategists do not want to restrict their own options when those of bad actors are unrestricted.
˲ LAWS differ in kind from the various weapon systems that have to date been banned without requiring an inspection regime. Consider, for example, the relatively recent bans on blinding lasers or anti-personnel weapons, which are often offered as a model for arms control for LAWS. These bans rely on representatives of civil society, non-governmental organizations such as the International Committee of the Red Cross, to monitor and stigmatize violations. So, too, will a ban on LAWS. However, blinding lasers and anti-personnel weapons were relatively easy to define, and after the fact the use of such weapons can be proven in a straightforward manner. Lethal autonomy, on the other hand, is not a weapon system. It is a feature set that can be added to many, if not all, weapon systems. Furthermore, the uses of autonomous killing features are likely to be masked.
˲ LAWS will be relatively easy to assemble using technologies developed for civilian applications. Thus their proliferation and availability to non-state actors cannot be effectively stopped.
In forging arms-control agreements, definitional distinctions have always been important. Contentions that definitional consensus cannot be reached for autonomy or meaningful human control, that LAWS depend upon advanced AI, and that such systems are merely a distant speculative possibility repeatedly arose during the April discussion at the U.N. in Geneva, and generally served to obfuscate, not clarify, the debate. A circular and particularly unhelpful debate has ensued over the meaning of autonomy, with proponents and opponents of a ban struggling to establish a definition that serves their cause. For example, the U.K. delegation insists that autonomy implies near-humanlike capabilities,e and anything short of this is merely an automated weapon. The Campaign to Stop Killer Robots favors a definition where autonomy is the ability to perform a task without immediate intervention from a human. Similarly, definitions for meaningful human control range from a military leader specifying a kill order in advance of deploying a weapon system to the real-time engagement of a human in the loop of selecting and killing a human target.

c Available at http://bit.ly/1V9bls5
d The full April 12, 2016, testimony, entitled Predictability and Lethal Autonomous Weapons Systems (LAWS), is available at http://bit.ly/2mjmuwH. An extended article accompanied this testimony. That article was circulated to all the CCW member states by the chair of the meeting, Ambassador Michael Biontino of Germany. It was also published in Robin Geiss, Ed., 2017, “Lethal Autonomous Weapons Systems: Technology, Definition, Ethics, Law & Security,” Federal Foreign Office, p. 295–312. The luncheon event on April 11, 2016, was sponsored by the United Nations Institute for Disarmament Research (UNIDIR).
e While the U.K. representatives did not use this language, it does succinctly capture the delegation’s statements that all computerized systems are merely automated until they display advanced capabilities.