It is unlikely the search companies would voluntarily adopt an equal-time rule or the warnings, Epstein says, but either or both could be built into the browser, acting automatically when search results contain the names of political candidates.

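A minimal sketch of how such a browser-side check might work, assuming a hypothetical candidate registry and a simple (title, snippet) result format, neither of which comes from Epstein’s actual proposal:

# Sketch: scan search results for candidate names and attach a warning
# before display. CANDIDATES and the result format are illustrative.

CANDIDATES = {"Jane Doe", "John Roe"}  # hypothetical candidate registry

WARNING = ("Note: these results mention a political candidate; "
           "their ranking may shape impressions of that candidate.")

def annotate(results):
    """Prepend a warning to any result whose text names a candidate."""
    annotated = []
    for title, snippet in results:
        text = f"{title} {snippet}"
        if any(name in text for name in CANDIDATES):
            annotated.append((title, f"{WARNING} {snippet}"))
        else:
            annotated.append((title, snippet))
    return annotated
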
The idea government might play a role in regulating search engines is not new, and it is strongly opposed by search companies and by many First Amendment watchdogs. Frank Pasquale, now a professor of law at the University of Maryland, in 2008 wrote a paper recommending the establishment of a Federal Search Commission. He argues free speech concerns do not apply to search because search engines act more like common carriers, which are subject to regulation, than like media outlets, which enjoy First Amendment protections.

As for why we need federal regulation, Pasquale says, “When a search engine specifically decides to intervene, for whatever reason, to enhance or reduce the visibility of a specific website or a group of websites ... [it] imposes its own preferences or the preferences of those who are powerful enough to induce it to act.”

Beyond Elections

Concerns about algorithms that search, select, and present information extend beyond search companies. “I’ve looked at the black box algorithms behind Google, Facebook, Twitter, and the others,” Pasquale says, “and I’m pretty troubled by the fact that it’s so hard to understand the agenda that might be behind them.” He supports the idea of a “trusted advisory committee” of technical, legal, and business experts to advise the Federal Trade Commission on the fairness of the algorithms of those companies.

Not only are the algorithms behind these major services complex and secret, but often users do not know that any selection logic or personalization occurs at all, says Karrie Karahalios, a professor of computer science at the University of Illinois and co-director of the Center for People and Infrastructures. In a study involving 40 of her students, more than half were “surprised and angered” to learn there was a “curation algorithm” behind the Facebook News Feed. She says such “invisible algorithms,” adopted in the interest of efficiency, can mislead people by acting as secret “gatekeepers” to information.

Karahalios recommends browsers offer graphical cues to show users how the algorithms work, so they know why they are seeing certain results. For example, when an item ranks high in search results because it has many links to other things, she suggests that might be signaled with a larger type font. She also says users should have some control over how the algorithms work. “I think it is important to have some levers that users can poke and prod to see changes in the algorithmic system,” she says.

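As an illustration only, the sketch below maps a made-up inbound-link count to a font size and exposes a user-adjustable “lever” controlling how strongly the signal shows; the scoring signal, scaling formula, and sample results are assumptions, not Karahalios’s design.

# Sketch: scale each result's font size by a (made-up) link-count signal,
# with a user-adjustable "lever" deciding how much the signal matters.

def font_size_px(link_count, max_links, lever=1.0, base=12, spread=10):
    """Map a link count to a font size in pixels; `lever` is the user's control."""
    weight = min((link_count / max_links) * lever, 1.0)
    return round(base + spread * weight)

results = [("Heavily linked result", 950), ("Lightly linked result", 120)]
max_links = max(count for _, count in results)
for title, count in results:
    print(f'<span style="font-size:{font_size_px(count, max_links)}px">{title}</span>')
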
In 2014, Karahalios and several colleagues presented five ideas by which algorithms, even secret ones, might be “audited” for bias by outside parties. In one, the Sock Puppet Audit, computer programs would impersonate actual users, generating test data and analyzing the results. Similarly, the testing and evaluation of algorithms could be crowd-sourced, by some mechanism such as Amazon’s Mechanical Turk. The advocates of audits agree these ideas present technical and legal difficulties, but they say some kind of external checking on the fairness of these ubiquitous services is needed. “Putting warnings on search results is not enough,” Karahalios says.

Luciano Floridi, a professor of philosophy and ethics of information at the University of Oxford, says the power and secrecy of Google are worrisome in part because of the company’s near-monopoly position. “Nothing wrong has happened so far, but that’s not a strategy,” he says; “that’s like keeping your fingers crossed.” He says recent revelations that Volkswagen AG manipulated engine software to fool regulators and consumers are not reassuring.

Floridi says the risk of mischief is compounded because Google’s users are not customers in the retail commercial sense. “They are not accountable because users are not paying for searches,” he says. “We don’t have customers’ rights with Google.”

Floridi advises Google on “the right to be forgotten” regulations by the European Union. He says he finds his contacts at the company to be “open-minded and sensible” about ideas for regulating search. “If it makes good sense socially speaking, and if it makes good sense business-wise, then there is a conversation on the table,” he says.

Further Reading

Bracha, O., and Pasquale, F.
Federal Search Commission? Access, fairness, and accountability in the law of search. Cornell Law Review, September 2008. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1002453

Epstein, R.
The search engine manipulation effect and its possible impact on the outcomes of elections. Proceedings of the National Academy of Sciences, Aug. 18, 2015. http://www.pnas.org/content/112/33/E4512.abstract

Pasquale, F.
The Black Box Society: The Secret Algorithms That Control Money and Information. Harvard University Press, 2015. http://www.hup.harvard.edu/catalog.php?isbn=9780674368279

Sandvig, C., Hamilton, K., Karahalios, K., and Langbort, C.
Auditing algorithms: research methods for detecting discrimination on Internet platforms. 64th Annual Meeting of the International Communication Association, May 22, 2014. http://acawiki.org/Auditing_Algorithms:_Research_Methods_for_Detecting_Discrimination_on_Internet_Platforms

Zittrain, J.
Engineering an election: digital gerrymandering poses a threat to democracy. Harvard Law Review Forum, June 20, 2014. http://harvardlawreview.org/2014/06/engineering-an-election/

Videos: How Google Works
https://www.youtube.com/watch?v=Md7K90FfJhg
https://www.youtube.com/watch?v=3tNpYpcU5s4

Gary Anthes is a technology writer and editor based in
Arlington, VA.
© 2016 ACM 0001-0782/16/04 $15.00