data scientists and researchers who critique data science is a strawman. For
example, Cathy O’Neil, whose book
Weapons of Math Destruction
Barocas and boyd turned to for examples
of unethical algorithms, is not just a
researcher who happens to focus on
data ethics, as they described her,
but is herself a data scientist. Researchers like O’Neil would thus seem
to be part of the solution to the problem Barocas and boyd identified.
Barocas and boyd also did not
mention capitalism—the economic
force behind much of today’s data science—with its behavioral prediction
and manipulative advertising. What
if the answer to unethical algorithms
is to not create them in the first place?
If an algorithm yields biased results
based on, say, ethnicity, as Barocas
and boyd mentioned, then surely the
answer would be to not develop or use
it to begin with; that is, do no harm.
But such an approach also means the
algorithm writer cannot sell the algorithm, an option Barocas and boyd
ignored. They said data scientists try
“to make machines learn something
useful, valuable…” yet did not identify who might find it useful or what
kind of value might be derived. Selling
an algorithm that helps “wrongly incarcerate” people has financial value
for the data scientist who wrote it but
negative social value for, and does major financial harm to, those it might
help put in jail. Data scientists must
do what “maximizes the…models’
performance” but for whom and to
what end? Often the answer is simply
to earn a profit.
If one can learn data science, one
should be able to use it ethically.
More than once Barocas and boyd
mentioned how data scientists “
struggle” with their work but expressed no
empathy for those who have been hurt
by algorithms. They did say data scientists choose “an acceptable error rate,”
though for some scenarios there is no
acceptable error rate.
If data scientists do not use data
science ethically, as Barocas and boyd
wrote and as O’Neil has shown, they
are indeed doing it improperly. What
Barocas and boyd failed to suggest is
such data scientists should not be doing data science at all.
Nathaniel Poor, Cambridge, MA
Why Whole Foods for Amazon
Michael A. Cusumano’s Viewpoint
“Amazon and Whole Foods: Follow the
Strategy (and the Money)” (Oct. 2017)
looked to identify a financial strategy
for Amazon.com’s June 2017 acquisition of Whole Foods, which Amazon
must have seen as strategically advantageous because it paid far more than
Whole Foods’ market valuation at the
time. However, the column ignored the
crucial potential for cross-subsidization
to harm competition in the grocery
industry. The trade press has since reported sales of high-margin Amazon
products, including Echo devices, at
Whole Foods, along with deals expected to come later (such as Whole Foods
discounts for customers who also purchase Amazon Prime). Using revenue
from such products and services to lower the price for commodity, low-margin
products like groceries is a classic anti-competitive strategy with dubious legal
or ethical basis.
Andrew Oram, Arlington, MA
Address the Slacker
Before the Hacker
Esther Shein’s news story “
Hacker-Proof Coding” (Aug. 2017) deserves a
clarification and a warning about assumptions. Software developers should
recognize that the techniques Shein explored will locate code faults but need
not be solely manual, expensive, or tedious and thus error-prone. Contrary to
the sources Shein quoted, much of the
process of looking for faults can be automated. Automated diagnosis of errors
can be done for approximately 20% of
what it now costs in terms of programmer time and financial expenditure,
yielding immediate and significant ROI.
Regarding assumptions, software
developers should also be aware that
the threat to the world’s software systems is not just from hackers but also
from slackers often within their own
midst. Too many software developers
simply do not put enough effort into
understanding the context in which
their code will operate. For example,
consider what Zachary Tatlock of
the University of Washington said to
Shein, “… as software that verifies the beam power has not become too high …” because software verifies only that some sensor output value is not too high. Unfortunately, however, such verification software is not designed to verify first that the sensor is operating properly.

Moreover, software developers must learn to be more analytical about the specifications their software is being designed to obey. Making software conform to “some specification,” as Andrew Appel of Princeton University said to Shein, is foolish unless the specification is the right specification. In a large-system context, “right,” as Tatlock said, cannot be assured in advance. Developers must first create the code, then confirm it is consistent with all the other code with which it will eventually interoperate. Here, “confirm” means “consistent with the rules of logic, arithmetic, and semantics,” not just some code developer’s specification for only a piece of the ultimate system.

Jack Ring, Gilbert, AZ

Communications welcomes your opinion. To submit a Letter to the Editor, please limit yourself to 500 words or less, and send to letters@cacm.acm.org.

©2018 ACM 0001-0782/18/1
Coming Next Month in Communications

The Next Phase in the Digital Revolution: Platforms, Automation, Growth, Employment

Elements of the Theory of Dynamic Networks

Titus: Introducing Containers to the Netflix Cloud

Views from the Top

Plus the latest news about quantum encryption, the value of data, and the continuing education of software.