and how to set boundaries and goals collectively for our global data economy, we arrive at questions about both justice and intercultural understandings of it. We need not only to be able to articulate principles of justice and fairness, but also to have a productive discussion about them with nations that see things very differently.

Research on global data justice⁴ starts from this larger question of how to identify and articulate principles that people around the world seem to agree on; the next task is to work out how those principles can be turned into tools for governing data, and to create the institutions we need to do so where they do not exist. Researchers working on this problem (who now include philosophers, social scientists, lawyers, computer scientists, and informatics scholars doing research in Europe, the U.S., Africa, and Asia) have to try to capture at least three conflicting ideas about what data technologies do and what their value is.
These conflicting ideas yield three main principles. First, our visibility through data should work for us, not against us. We should be visible through our data when we need to be, in ways that are necessary for our well-being, but this visibility should be part of a reasonable social contract in which we are aware of it and can withdraw it to avoid exploitation. Second, we should have full autonomy with regard to our use of technology. We should be able to adopt technology that is beneficial for us, but using a smartphone or being connected should not be linked to our ability to exercise our citizenship. Someone who has to use social media to obtain a national identity document, or who has to provide biometrics through a private company in order to register for asylum, is not using data technologies so much as being used by them. Lastly, the duty of preventing data-related discrimination should be held by governments as well as individuals. It is not enough to demand transparency so that people can protect themselves from the negative effects of profiling: people should be proactively protected from discrimination by authorities who have the power to control and regulate the use of data.
These principles form a starting point for understanding how similar challenges play out in different places. The task of research is to identify where common responses to those challenges are emerging, to draw out lessons for governance, and to suggest ways to operationalize them.

Translating this vision to the global level is a huge challenge. To do this, we have to place different visions of data’s value and risks in relation to each other, and seek common principles that can inform governance. Framing what global data justice might mean involves law, human rights, the anthropology of data use and sharing, the political economy of the data market and of data governance more broadly, and international relations.
This global problem is also becoming part of the agenda of computer science and engineering. The agenda of justice in relation to digitization is still taking shape, and needs input from all the fields doing conceptual and applied work in relation to the digital. It is not a task any individual field can address on its own, because work on data technology has evolved beyond the point where those who conceptualize and develop systems can understand what effects they will have on the global level. What is fair or innocuous in one place may be unfair or harmful in another.

Data justice should provide a lens through which we can address questions about how to integrate values into technology, but it is a higher-level question that cannot be answered with guidelines or with toolkits for privacy or explainability (despite the importance of these approaches).
It is a conceptual question, though it leads to practical questions of governance: we wish to conceptualize how data should be governed to promote freedom and equality. This is not something academia can do on its own, but a long-term challenge to be addressed in collaboration with policymakers, and in consultation with everyone affected by the data economy.

Computer scientists are already part of this process. When they conceptualize and build systems, they make choices that determine how data gets constructed and used. Understanding how computer science research connects to the human and social world, and how it contributes to particular outcomes, is the first step. Making connections between that understanding and social scientific research is the next. This process is taking place at some computer science conferences (notably ACM FAT*, which is now integrating social science and law tracks), but is also visible in smaller workshops and interdisciplinary programs where social scientists and computer scientists come together to work on the social implications of data science and AI, to publish together, and to build a research agenda. This work will grow in scale and importance in the coming years, with the notion of global data justice as a benchmark for the inclusiveness and breadth of the debate.

References
1. Dencik, L., Hintz, A., and Cable, J. Towards data justice? The ambiguity of anti-surveillance resistance in political activism. Big Data & Society 3, 2 (Feb. 2016), 1–12; https://bit.ly/2VxoF0A
2. Heeks, R. and Renken, J. Data Justice for Development: What Would It Mean? Development Informatics Working Paper Series No. 63, Manchester, U.K., 2016; https://bit.ly/2UKVIRr
3. Lyon, D. Surveillance Studies: An Overview. Polity Press, Cambridge, 2007.
4. Taylor, L. What Is Data Justice? The Case for Connecting Digital Rights and Freedoms on the Global Level. Big Data & Society, 2017; https://bit.ly/2uZjxXb
5. United Nations. A World that Counts: Mobilising the Data Revolution for Sustainable Development. New York, 2014; https://bit.ly/1it3l8P
6. Wagner, B. Ethics as an escape from regulation: From ethics-washing to ethics-shopping? In M. Hildebrandt, Ed., Being Profiled: Cogitas Ergo Sum. Amsterdam University Press, Amsterdam, 2018, 84–90.

Linnet Taylor (l.e.m.taylor@tilburguniversity.edu) is an associate professor at Tilburg Law School, Tilburg University, The Netherlands.

This work is funded by Horizon 2020 ERC Starting Grant #757247 DATAJUSTICE.

Copyright held by author.