requested this (Salehi et al.9). Ethics demands we take worker requests seriously.
While crowdworkers are often located around the world, the minimum wage at the client's location is a defensible lower limit on payment. If workers are underpaid, for example due to underestimation of how long a task might take, correct the problem (for instance, on MTurk, with bonuses). On MTurk, if workers are mistakenly refused payment, reverse the rejections to prevent damage to the workers' approval ratings. Note that fair wages lead to higher-quality crowdsourced research.6
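Both fixes are scriptable on MTurk. The following is a minimal sketch, not the authors' code, using the AWS boto3 MTurk client; the assignment ID, worker ID, bonus amount, and messages are hypothetical placeholders you would take from your own records.

    # Minimal sketch: reversing a mistaken rejection and topping up pay
    # on MTurk via boto3. All IDs and amounts are hypothetical.
    import boto3

    mturk = boto3.client("mturk", region_name="us-east-1")

    # Reverse a mistaken rejection (MTurk permits this within 30 days),
    # repairing the damage to the worker's approval rating.
    mturk.approve_assignment(
        AssignmentId="ASSIGNMENT_ID",
        RequesterFeedback="Our mistake; the rejection has been reversed.",
        OverrideRejection=True,
    )

    # Correct underpayment with a bonus.
    mturk.send_bonus(
        WorkerId="WORKER_ID",
        AssignmentId="ASSIGNMENT_ID",
        BonusAmount="0.75",  # USD, passed as a string
        Reason="The task took longer than we estimated; topping up the pay.",
        UniqueRequestToken="ASSIGNMENT_ID-topup",  # guards against double payment
    )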
Remember you are interacting with
human beings, some of whom complete these tasks for a living. Treat them
at least as well as you would treat an in-person co-worker. As workers themselves have gone to great lengths to express to the public,4 crowdworkers are
not interchangeable parts of a vast
computing system, but rather human
beings who must pay rent, buy food,
and put children through school—and
who have, just like clients, career and
life goals and the desire to be acknowledged, valued, and treated with respect.
Respond quickly, clearly, concisely, and respectfully to worker questions and feedback via both email and worker forums (for example, turkernation.com, mturkcrowd.com). In addition to being a reasonable way to engage with human workers, this engagement may also improve the quality of the work you receive, since you may be informed of task design problems before a great deal of work has been done, and before you have incurred a responsibility to pay for that work, which was done in good faith.
Learn from workers. If workers tell
you about technical problems or unclear instructions, address them
promptly, developing workarounds as
needed for workers who have completed the problematic task. Especially if
you are new to crowdsourcing, you
may unknowingly be committing errors or behaving inappropriately due
to your study design or mode of engagement. Many workers have been
active for years and provide excellent
advice. Workers communicate with
one another and with clients in forums
(as described earlier); MTurk workers
in particular have articulated best
practices for ethical research in the Dynamo Guidelines for Academic Requesters (guidelines.wearedynamo.org; Salehi et al.9).
Currently, the design of major
crowdsourcing platforms makes it difficult to follow these guidelines. Consider a researcher who posts a task to
MTurk, and after the task is posted,
discovers that even expert workers
take twice as long as expected. This is
unsurprising; recent research shows
that task instructions are often unclear to workers. If this researcher
wishes to pay workers “after-the-fact”
bonuses to ensure they are paid the intended wage, this can only be done
one-by-one or with command-line
tools. The former is time-consuming
and tedious; the latter is usable by only a relative minority of clients. The platform's affordances (or non-affordances) are powerful determiners of
how clients (are able to) treat workers.
We suggest platform operators would
do workers, clients, and themselves a
service by making it easier for clients
to treat workers well in these cases.
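To make the affordance gap concrete, here is roughly what a client must script today to pay after-the-fact bonuses across a whole task. This is a minimal sketch under stated assumptions: the target wage, original payment, and HIT ID are illustrative placeholders, not values from this Viewpoint, and pagination is omitted.

    # Minimal sketch: batch "after-the-fact" bonuses with boto3. The wage,
    # payment, and HIT ID below are illustrative assumptions.
    import boto3

    TARGET_HOURLY_USD = 12.00       # the wage the researcher intended to pay
    PAID_PER_ASSIGNMENT_USD = 1.00  # what the task actually paid
    HIT_ID = "HIT_ID"

    mturk = boto3.client("mturk", region_name="us-east-1")

    # Fetch completed assignments (pagination omitted for brevity).
    assignments = mturk.list_assignments_for_hit(
        HITId=HIT_ID, AssignmentStatuses=["Approved"]
    )["Assignments"]

    for a in assignments:
        # AcceptTime and SubmitTime come back as datetimes, so the
        # difference approximates the worker's actual time on task.
        minutes = (a["SubmitTime"] - a["AcceptTime"]).total_seconds() / 60
        owed = TARGET_HOURLY_USD * minutes / 60
        top_up = owed - PAID_PER_ASSIGNMENT_USD
        if top_up > 0:
            mturk.send_bonus(
                WorkerId=a["WorkerId"],
                AssignmentId=a["AssignmentId"],
                BonusAmount=f"{top_up:.2f}",  # USD, passed as a string
                Reason="Topping up to the intended hourly wage; the task "
                       "took longer than we estimated.",
                UniqueRequestToken=a["AssignmentId"] + "-wage-topup",
            )

Even this small script presumes programming skill and AWS credentials, which is precisely the barrier a bonus-adjustment affordance in the requester interface would remove.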
Finally, we call on university Institutional Review Boards to turn their attention to the question of responsible
crowdsourced research. Crowdworkers
relate to their participation in crowdsourced research primarily as workers.
Thus the relation between researchers
and crowdworkers is markedly different
from researchers' relation to study participants from other "pools." While there may be some exceptions, we
believe researchers should generally pay
crowdworkers at least minimum wage.
We urge IRBs to consider this position.
These suggestions are a start, not a comprehensive checklist. To make crowdsourced research responsible, researchers and IRBs must develop ongoing, respectful dialogue with workers.

For detailed treatment of ethical issues in crowdwork, see Martin et al.7 For alternatives to MTurk, see Vakharia and Lease11 or type "mturk alternatives" into any search engine. Readers interested in the ethical design of labor platforms should seek out recent discussions of "platform cooperativism" (for example, platformcoop.net).

1. Bederson, B. and Quinn, A.J. Web workers unite! Addressing challenges of online laborers. In Proceedings of CHI '11 EA (2011), 97–106.
2. Berg, J. Income security in the on-demand economy: Findings and policy lessons from a survey of crowdworkers. Comparative Labor Law & Policy Journal 37, 3 (2016).
3. Farrell, D. and Greig, F. Paychecks, paydays, and the online platform economy: Big data on income volatility. JPMorgan Chase Institute, 2016.
4. Harris, M. Amazon's Mechanical Turk workers protest: 'I am a human being, not an algorithm.' The Guardian (Dec. 3, 2014); http://bit.ly/2EcZvMS.
5. Hitlin, P. Research in the crowdsourcing age, a case study. Pew Research Center, July 2016.
6. Litman, L., Robinson, J., and Rosenzweig, C. The relationship between motivation, monetary compensation, and data quality among U.S.- and India-based workers on Mechanical Turk. Behavior Research Methods 47, 2 (Feb. 2015), 519–528.
7. Martin, D. et al. Turking in a global labour market. Computer Supported Cooperative Work 25, 1 (2016).
8. Michelucci, P. and Dickinson, J.L. The power of crowds. Science 351, 6268 (2016), 32–33.
9. Salehi, N. et al., Eds. Guidelines for Academic Requesters—WeAreDynamo Wiki (2014); guidelines.wearedynamo.org.
10. Silberman, M. et al. Stop citing Ross et al. 2010, 'Who are the crowdworkers?'; http://bit.ly/2FkrObs.
11. Vakharia, D. and Lease, M. Beyond Mechanical Turk: An analysis of paid crowd work platforms. In Proceedings of iConference 2015 (2015).

M. Six Silberman (firstname.lastname@example.org) works in the Crowdsourcing Project at IG Metall, 60329 Frankfurt am Main, Germany.

Bill Tomlinson (email@example.com) is a Professor in the Department of Informatics at the University of California, Irvine, CA, USA, and a Professor in the School of Information Management, Victoria University of Wellington, New Zealand.

Rochelle LaPlante (firstname.lastname@example.org) is a professional crowdworker, Seattle, WA, USA.

Joel Ross (email@example.com) is a Senior Lecturer in the Information School, University of Washington, Seattle, WA, USA.

Lilly Irani (firstname.lastname@example.org) is an Assistant Professor in the Communication Department and Science Studies Program, University of California, San Diego, CA, USA.

Andrew Zaldivar (email@example.com) is a Researcher in the Department of Cognitive Sciences, University of California, Irvine, CA, USA (now at Google).

This material is based upon work supported in part by National Science Foundation Grant CCF-1442749. The authors thank Janine Berg and Valerio De Stefano for comments. This Viewpoint reflects the authors' views, not any official organizational position.

Copyright held by authors.