Society | DOI: 10.1145/3077231 Sarah Underwood

Potential and Peril

The outlook for artificial intelligence-based autonomous weapons.

Participants in the first NGO Conference of the Campaign to Stop Killer Robots in London in 2013. (Photo by Sharron Ward for the Campaign to Stop Killer Robots.)

THE HISTORY OF battle knows no bounds, with weapons of destruction evolving from prehistoric clubs, axes, and spears to bombs, drones, missiles, landmines, and systems used in biological and nuclear warfare. More recently, lethal autonomous weapon systems (LAWS) powered by artificial intelligence (AI) have begun to surface, raising ethical issues about the use of AI and causing disagreement on whether such weapons should be banned in line with international humanitarian laws under the Geneva Convention.

Much of the disagreement around LAWS is based on where the line should be drawn between weapons with limited human control and autonomous weapons, and on differences of opinion over whether more or fewer people will lose their lives as a result of the implementation of LAWS. There are also contrary views on whether autonomous weapons are already in play on the battlefield.

Ronald Arkin, Regents' Professor and Director of the Mobile Robot Laboratory in the College of Computing at Georgia Institute of Technology, says limited autonomy is already present in weapon systems such as the U.S. Navy's Phalanx Close-In Weapons System, which is designed to identify and fire at incoming missiles or threatening aircraft, and Israel's Harpy system, a fire-and-forget weapon designed to detect, attack, and destroy radar emitters.

The Campaign to Stop Killer Robots, which was founded in 2013 by a group of regional, national, and international non-governmental organizations (NGOs), agrees that no fully autonomous weapons are yet in use, but says existing systems could soon be extended to become fully autonomous, and that the window to fulfill its ambition of achieving a preemptive ban on all such systems is closing.

The campaign's key tenet is that giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology, and that human control of any combat robot is therefore essential to ensuring humanitarian protection.

Mary Wareham, global coordinator of the Campaign to Stop Killer Robots at Human Rights Watch in Washington, D.C., explains the potential of precursor weapon systems that could be extended, citing armed drones as an example. Says Wareham, "Remotely piloted armed drones still have a human in the loop deciding on the selection of targets and force to be used, but new generations of arms could fly autonomously and complete missions with no human control. We don't want to see these systems in action."

From a robotics perspective, Arkin also defines autonomous machines as those that allow no opportunity for human intervention, but says one of the concerns around LAWS is that there is no substantive agreement on what they constitute. He does not advocate a total ban on LAWS, but suggests the need to look at specific instances of weapons and decide on a one-by-one basis whether they are viable or should be banned. He explains: "I am a proponent of a moratorium until there is more understanding of autonomous weapons. We all agree we don't want a scenario like The Terminator, but we do need to talk about what we do want."

Arkin argues that a better understanding of autonomous weapons could lead to the development of intelligent autonomous military systems that could be precise in hitting targets and, at the same time, reduce civilian casualties and property damage compared to the performance of human fighters, whose behavior in the theater of war can be inconsistent and waver between heroic and atrocious.

Says Arkin, "We need to assume more responsibility for non-combat-