dress and similar future problems will be averted. Thus, public and legal attitudes against automated prediction seem to be justified.
Yet there is a substantial shortcoming in this human-driven alternative strategy, one that might eclipse all the advantages of human discretion and lead us back to automated prediction as the preferred option: the problem of hidden biases. In many instances, human errors are not merely arbitrary.
Rather, they result from systematic (though possibly subconscious) biases.
Recent studies indicate that subconscious biases against minorities still plague individuals and their decision-making processes. At times, decision makers (both in the back office and in the field) discriminate against minorities when making discretion-based decisions, even unintentionally. Given these subconscious patterns of behavior, a discretion-based process is structurally prone to such errors. Automation, on the other hand, introduces a surprising benefit. By limiting the role of human discretion and intuition and relying upon computer-driven decisions, this process protects minorities and other weaker groups. Therefore, a shift to automated predictive modeling might affect different segments of society in different, yet predictable, ways.
It should be noted that the "hidden bias problem" might migrate to the automated realm. Automated decision making relies upon existing datasets, which are often biased against minorities, given years of unbalanced data collection and other systematic flaws. Restricting the use of problematic factors and their proxies, along with other innovative solutions that systematically attempt to correct data outputs, can potentially mitigate these concerns.
Returning to our discussion of the interplay between public opinion and law, and with this novel intuition regarding the hidden benefits of automated prediction in mind, our analysis must examine how prediction might affect the opinions of different segments of the public, and what the implications of those differences are.
For members of the powerful majority, a shift toward an automated model
might prove to be a disadvantage. In a
discretion-based process, the chance
that a member of the powerful majority will be wrongfully singled out is low. Even if selected for further scrutiny, members of this group can try to appeal to the reason of another individual. Such an appeal reintroduces a human decision-making process, and with it subconscious biases. In these cases, there is a better chance the subconsciously biased decision maker will act in their favor. Therefore, the powerful majority's vocal discontent with automated prediction is rational, yet socially unacceptable.
However, things are quite different for members of a minority group. For these individuals, human discretion is not always a warm human touch; at times it is a cold, discriminating shoulder. The automated process increases the chances of blind justice. While society has installed various safeguards against these forms of governmental discrimination, such measures still fail to limit the impact of unintentional (and even subconscious) biases. Automated prediction might be the most powerful solution to the problem of such hidden discrimination, and the unpopularity of this measure should not be allowed to block it.
Conclusion
This analysis argues that the negative opinion flowing from at least part of the public, the powerful majority, regarding automated prediction should be ignored. One might go further and note that the broad anti-automated-prediction sentiment generated through the media is merely a manipulative ploy intended to maintain the existing social structure and ensure the "haves" continue to benefit from structural advantages. Even without accepting this final, radical view, we must beware of popular calls to limit the use of predictive automation and examine whether they reflect the interests and opinions of all.
Tal Z. Zarsky (tzarsky@law.haifa.ac.il) is a senior lecturer in the Faculty of Law at the University of Haifa, Israel.
The ideas presented in this column are further developed in Zarsky, T.Z., "Governmental Data Mining and Its Alternatives." Penn State Law Review 116 (2011), 285–330.
Copyright held by author.