Computer Professionals for Social Responsibility
I THINK OFTEN of Ender’s Game these days. In this award-winning 1985 science-fiction novel by Orson Scott Card (based on a 1977 short story with the same title), Ender is being trained at Battle School, an institution designed to make young children into military commanders against an unspecified enemy (http://bit.ly/2hYQMDF). Ender’s team engages in a series of computer-simulated battles, eventually destroying the enemy’s planet, only to learn then that the battles were very real and a real planet has been destroyed.
I got involved in computing at age 16 because programming was fun. Later I discovered that developing algorithms was even more enjoyable. I found the combination of mathematical rigor and real-world applicability to be highly stimulating intellectually. The benefits of computing seemed intuitive to me then and now. I truly believe that computing yields tremendous societal benefits; for example, the life-saving potential of driverless cars is enormous!
Like Ender, however, I realized recently that computing is not a game—it is real—and it brings with it not only societal benefits, but also significant societal costs. Let me mention three examples.
I have written previously on automation’s adverse impact on working-class people—an impact that has already had profound political consequences—with further such impact expected as driving gets automated (http://bit.ly/2AdEv8A). It has also become clear that “frictionless sharing” on social media has given rise to the fake-news phenomenon. It is now widely accepted that this had serious impact on both the 2016 U.K. Brexit referendum and the 2016 U.S. Presidential election. Finally, a 2017 paper in Clinical Psychological Science attributes the recent rise in teen depression, suicide, and suicide attempts to the ascendance of the smartphone (http://bit.ly/2zianG5).
A dramatic drop in the public view of Tech, a term that I use to refer both to computing technology and the community that generates that technology, has accompanied the recent recognition of the adverse societal consequences of computing. This decline is well exemplified by Peggy Noonan, a Wall Street Journal columnist who wrote recently about trying to explain (dubiously, IMHO) why Americans own so many guns: “Because all of their personal and financial information got hacked in the latest breach, because our country’s real overlords are in Silicon Valley and appear to be moral Martians who operate on some weird new postmodern ethical wavelength. And they’ll be the ones programming the robots that’ll soon take all the jobs!”
The question I’d like to pose to us in Tech is as follows: We have created this technology; what is our social responsibility? Of course, not all of us sit in Silicon Valley, and not all of us make product-deployment decisions. But much of the technology developed by high-tech corporations is based on academic research, by students educated in academic institutions. Whether you like it or not, if you are a computing professional, you are part of Tech!
Computer Professionals for Social Responsibility (CPSR), founded in the early 1980s, was an organization promoting the responsible use of computer technology. The triggering event was the Strategic Defense Initiative (SDI), a proposed missile-defense system intended to protect the U.S. from attack by ballistic strategic nuclear weapons. CPSR argued that we lack the technology to develop software that would be reliable enough for the purpose of SDI. Later, CPSR expanded its scope to other tech-related issues. The organization was dissolved in 2013 (see Wikipedia, http://bit.ly/2zvZsZb). With the benefit of hindsight, the issues that CPSR pursued in the 1980s appear remarkably prescient today.
One could argue that CPSR is not needed anymore; there are now numerous organizations and movements that are focused on various aspects of the responsible use of technology. But our society is facing a plethora of new issues related to the societal impact of technology, and we, the people who are creating the technology, lack a coherent voice. ACM is involved in many of these organizations and movements, by itself or with others—for example, the ACM U.S. Public Policy Council, the ACM Europe Policy Committee, the ACM Code of Professional Ethics, the Partnership on AI, and more. Yet, these efforts are dispersed and lack coordination.
I believe ACM must be more active in addressing social responsibility issues raised by computing technology. An effort that serves as a central organizing and leadership force within ACM would bring coherence to ACM’s various activities in this sphere, and would establish ACM as a leading voice on this important topic. With great power comes great responsibility. Technology is now one of the most powerful forces shaping society, and we are responsible for it!
Follow me on Facebook, Google+, and Twitter.
Moshe Y. Vardi (firstname.lastname@example.org) is the Karen Ostrum George Distinguished Service Professor in Computational Engineering and Director of the Ken Kennedy Institute for Information Technology at Rice University, Houston, TX. He is the former Editor-in-Chief of Communications.
Copyright held by author.
DOI: 10.1145/3168007