systems in combat makes it difficult to apportion blame when something goes wrong. The question of whether the weapon designer, deployer, or indeed any other entity should take the blame is far from answered. Schwitzgebel suggests the diffusion of blame could be consistent with governments collaborating on undesirable uses of autonomous systems.
Lack of predictability could also become a more serious threat as AI systems become more complex. “Human soldiers in warfare can be unpredictable, but within limits, as military commanders have an understanding of what has happened in various conditions in the past,” says Schwitzgebel. “Autonomous systems could be more unpredictable than humans, which in warfare could lead to disastrous consequences. The ethics of autonomous weapons and issues of AI and philosophy are not as widely talked about as they should be.”
With many questions about the benefits and dangers of LAWS still up in the air, and no international agreements in place to provide answers, the Campaign to Stop Killer Robots and other research organizations keen to ensure a ban on their development, manufacture, and deployment are pressing governments to adopt and implement such a ban.
While the campaign is concerned that the window of time to reach agreement on a ban on LAWS is closing as autonomous weapons are being developed, progress in its favor is beginning to be made, and autonomous weapons are moving up the agenda following a December 2016 United Nations review of the Convention on Certain Conventional Weapons (CCW).
Making a small piece of history, the U.N. voted during the review to start a formal process that might lead to a ban on LAWS. Of course, there are no guarantees that the process will be successful, but as Walsh puts it: “If states hadn’t voted to start the process, there would have been no chance to finish.” Russia abstained from the vote.
Countries participating in the vote agreed to set up an open-ended Group of Governmental Experts that will discuss nations’ concerns about LAWS and the line between autonomous and non-autonomous weapons. The group will meet for two weeks in August this year, but the expectation is that it will take multiple years to reach consensus and add a protocol to the CCW that will ban autonomous weapons operating beyond the boundaries of international humanitarian law.
Further Reading

Losing Humanity: The Case against Killer Robots
November 2012, Human Rights Watch and International Human Rights Clinic at Harvard Law School
https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots

Views of the International Committee of the Red Cross (ICRC) on autonomous weapon systems
April 2016, ICRC
https://www.icrc.org/en/document/views-icrc-autonomous-weapon-system

Three in Ten Americans Support Using Autonomous Weapons
February 2017, Ipsos
http://www.ipsos-na.com/news-polls/pressrelease.aspx?id=7555

IEEE Ethically Aligned Design Document Elevates the Importance of Ethics in the Development of Artificial Intelligence (AI) and Autonomous Systems (AS)
December 2016, IEEE
http://standards.ieee.org/news/2016/ethically_aligned_design.html

Lethal Autonomous Systems and the Plight of the Non-combatant
July 2013, Ronald Arkin, Georgia Institute of Technology
http://www.cc.gatech.edu/ai/robot-lab/online-publications/aisbq-137.pdf

Campaign to Stop Killer Robots
https://www.stopkillerrobots.org/
Sarah Underwood is a technology writer based in Teddington, U.K.
© 2017 ACM 0001-0782/17/06 $15.00
ACM Member News

DETERMINING NORMS FOR CYBER WARFARE
Patrick McDaniel, a Distinguished Professor in the School of Electrical Engineering and Computer Science at Pennsylvania State University (Penn State), says that when he was 11 years old, his father brought home a TRS-80 portable computer from Radio Shack and handed him the manual to BASIC. “Within 10 minutes I was addicted, and I have never looked back. I have bachelor’s, master’s, and Ph.D. degrees in computer science, and it has never even been a thought to do anything else.”
McDaniel obtained his undergraduate degree at Ohio University in 1989, and his master’s degree at Ball State University in 1991. He then worked to develop some of the first IP networking hardware as a project manager at Primary Access Corp. in San Diego, which was acquired by 3Com in 1995.
He later earned his Ph.D. in computer science and engineering at the University of Michigan, Ann Arbor. McDaniel spent several years as a senior research staff member at AT&T Labs in New Jersey before joining the faculty at Penn State in 2004.
McDaniel is director of the Institute for Networking and Security Research at Penn State, and also university lead for the U.S. Army Cyber Security Research Alliance, a 10-year project to develop an understanding of how to make security-relevant decisions in cyberspace.
One area McDaniel is focused on concerns the norms for international cyber warfare. He works to help define and set standards for what is allowable; in effect, a Geneva Convention for cyber warfare. “Right now, because nothing is set up, it is really hard to go to the U.N. Security Council for sanctions when you haven’t set up any norms.”
—John Delaney