Everyday Ethics Across the Curriculum Initiative.
Teaching Ethics, 2009.
19. Rawls, J. A Theory of Justice. Belknap, Cambridge,
MA, USA, 1971.
20. Rumbold, B. and Wilson, J. Privacy rights and public
information. J. Political Philosophy, 2018.
21. Scanlon, T.M. The Difficulty of Tolerance: Essays in
Political Philosophy. Cambridge University Press,
Cambridge, U.K., 2006.
22. Skirpan, M., Beard, N., Bhaduri, S., Fiesler, C. and
Yeh, T. Ethics education in context: A case study
of novel ethics activities for the CS classroom. In
Proceedings of the 49th ACM Technical Symposium
on Computer Science Education (Baltimore,
MD, 2018). ACM Press, New York, NY, 940–945.
23. Sweeney, L. Uniqueness of simple demographics in the
U.S. population. LIDAP-WP4. Carnegie Mellon
University, Laboratory for International Data Privacy, 2000.
24. Tasioulas, J. The moral reality of human rights.
Freedom from Poverty as a Human Right: Who
Owes What to the Very Poor? T. Pogge, (ed.). Oxford
University Press, co-published with UNESCO, 2007.
25. Wasserman, D., Asch, A., Blustein, J. and Putnam,
D. Disability: Definitions, models, experience. The
Stanford Encyclopedia of Philosophy (Summer 2016
Edition). Retrieved July 19, 2018 from https://plato.
Jeff Behrends ( firstname.lastname@example.org) is a Lecturer
of Philosophy at Harvard University, Cambridge, MA, USA,
and Director of Ethics and Technology Initiatives at the
Edmond J. Safra Center for Ethics.
David Gray Grant ( email@example.com) is a
postdoctoral fellow in philosophy at Harvard University
where he co-leads the Embedded EthiCS teaching lab,
and a research fellow in digital ethics at the Jain Family Institute.
Barbara J. Grosz ( firstname.lastname@example.org) is Higgins
Professor of Natural Sciences on the Computer Science
faculty of the John A. Paulson School of Engineering
and Applied Sciences at Harvard University, Cambridge,
MA, USA, and a member of the External Faculty of Santa
Fe Institute, Santa Fe, NM, USA. She is co-founder and
co-director of the Embedded EthiCS initiative and also
created the Harvard course “Intelligent Systems: Design
and Ethical Challenges.”
Lily Hu ( email@example.com) is a Ph.D. candidate in
Applied Mathematics and a Fellow at the Berkman Klein
Center, both at Harvard University, Cambridge, MA, USA.
She has served as a teaching assistant for the course on
“Intelligent Systems: Design and Ethical Challenges.”
Alison Simmons ( firstname.lastname@example.org) is the
Samuel H. Wolcott Professor of Philosophy at Harvard
University, Cambridge, MA, USA. She is co-founder and
co-director of the Embedded EthiCS initiative.
Kate Vredenburgh ( email@example.com) was
an Embedded EthiCS teaching assistant during her time
as a Ph.D. candidate in philosophy at Harvard University.
She is currently a postdoctoral fellow at Stanford’s McCoy
Family Center for Ethics in Society and will join the faculty
of the London School of Economics as an Assistant
Professor in 2020.
Jim Waldo ( firstname.lastname@example.org) is a professor of
the practice of computer science at Harvard University,
Cambridge, MA, USA, where he is also the chief
technology officer for the School of Engineering, a position
he assumed after leaving Sun Microsystems Laboratories.
He teaches courses in privacy, technology ethics, and
Copyright held by authors/owners. Publication rights
licensed to ACM. $15.00.
technical work. Post-module surveys provide insight into the effectiveness of particular modules, but we want to measure the approach’s impact over the course of years, for instance, as students complete their degrees and even later in their careers. By design, Embedded EthiCS makes small interventions in individual courses, precluding the usefulness of short-term evaluations of impact at the individual course level (for example, pre-course/post-course surveys [22]). We thus need to find ways to measure the long-term effectiveness of the Embedded EthiCS approach and compare it to other approaches. As measuring the impact of teaching ethics within the CS curriculum is a challenge regardless of approach, we aim to identify broadly applicable methods.
The institutional challenges to mounting Embedded EthiCS derive from its cross-disciplinary nature. In particular, university support, both financial and administrative, is crucial. Funding is needed for teaching assistants and postdoctoral fellows, including senior-level postdoctoral fellows able to train and support the efforts of those developing modules for courses. Administrative support is needed for recruiting faculty and courses in computer science, for recruiting teaching assistants and postdoctoral fellows in philosophy, and for organizing and managing a repository of materials for the program, including modules and evaluation materials. Several of these challenges are made more complex because they cross university divisions.
Teaching computer scientists to identify and address ethical problems starting from the design phase is as important as enabling them to develop algorithms and programs that work efficiently. The strategy of integrating the teaching of ethical reasoning skills with the teaching of computational techniques into existing computer science coursework not only provides students valuable experience identifying, confronting, and working through ethical questions, but also communicates the need to identify, confront, and address ethical questions throughout their work in computer science. It provides them with ethical reasoning skills to take into their computing and information technology work after they graduate, preparing them to produce socially and ethically responsible computer technology, and to justify their ethically motivated design choices to their colleagues and employers. Computer scientists and technologists with these capabilities are important for the long-term well-being of our society.
We invite those at other institutions to join us by integrating ethics throughout their own computer science curricula and to help us expand the open repositories of resources we are developing for ethics modules, including in-class activities, case studies, assignments, and recommended readings. We also think it is important to share lessons learned, approaches to meeting the challenges of university support for these efforts, and ways to engage and train philosophers to participate in them.
1. Angwin, J., Larson, J., Mattu, S. and Kirchner, L.
Machine Bias. ProPublica (May 2016). Retrieved Oct. 13, 2018.
2. Barocas, S. and Nissenbaum, H. Big data’s end run
around procedural protections. Commun. ACM 57, 11
(Nov. 2014).
3. Barocas, S. and Selbst, A. Big data’s disparate impact.
California Law Review 104, 671 (2016).
4. Burton, E., Goldsmith, J. and Mattei, N. How to teach
computer ethics through science fiction. Commun.
ACM 61, 8 (Aug. 2018).
5. Califf, M.E. and Goodwin, M. Effective
incorporation of ethics into courses that focus on
programming. In Proceedings of the 36th SIGCSE
Technical Symposium on Computer Science Education
(St. Louis, MO, 2005). ACM Press, New York, NY,
347–351. DOI: 10.1145/1047344.1047464
6. Cech, E.A. Culture of disengagement in engineering
education? Science, Technology, & Human Values 39,
1 (2014), 42–72.
7. Dwork, C. Differential privacy. Automata, Languages
and Programming. M. Bugliesi, B. Preneel, V. Sassone,
and I. Wegener, (eds.). Lecture Notes in Computer
Science 4052 (2006). Springer, Berlin, Heidelberg.
8. The Economist. Do social media threaten democracy? (Nov.
2017). Retrieved Oct. 13, 2018; https://econ.st/2xf31GL.
9. Hellman, D. When is Discrimination Wrong? Harvard
University Press, Cambridge, MA, 2011.
10. Hollander, R. and Arenberg, C. R. (eds.). Ethics
Education and Scientific and Engineering Research:
What’s Been Learned? What Should Be Done?
Summary of a Workshop. National Academy
of Engineering. The National Academies Press,
Washington, D.C, 2009.
11. Kleinberg, J., Mullainathan, S. and Raghavan, M.
Inherent Trade-Offs in the Fair Determination of Risk
Scores (2016), arXiv:1609.05807
12. Knight, W. Biased algorithms are everywhere, and no
one seems to care. Technology Review, (July 2017).
Retrieved Oct. 13, 2018 from https://bit.ly/2tIh1EX.
13. Levin, S. Uber crash shows ‘catastrophic failure’ of
self-driving technology, experts say. The Guardian
(Mar. 2018); https://bit.ly/2pych2k.
14. Levy, K. and Barocas, S. Refractive surveillance:
Monitoring customers to manage workers. Intern. J.
Commun. 12 (2018), 1166–1188.
15. Lohr, S. Facial recognition is accurate if you’re a
white guy. New York Times (Feb. 2018); https://nyti.
16. Mill, J.S. On Liberty. Parker and Son, London, U.K., 1859.
17. New York Times Editorial Board. Facebook and the
digital virus called fake news. (Nov. 2016); https://nyti.
18. Pease, A. and Baker, R. Union College’s Rapaport