authentication and auditing. Some financial systems incorporate accountability mechanisms for purposes of
fraud deterrence and detection, and
some health records systems incorporate mechanisms to account for privacy
violations, but these mechanisms will
need to find homes in a much broader
range of systems with various privacy
policy enforcement needs. The ability
to support provenance for both data
and programs at scale is needed.
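A minimal sketch of what such a provenance mechanism might look like: each record links an action to its inputs and to the hash of the previous record, so tampering with the trail is detectable. The record format and field names here are illustrative assumptions, not a specific system from the column.

```python
import hashlib
import json

def record(entity, action, inputs, prev_hash=""):
    """Build one provenance record linking an action to its inputs.

    Chaining each record to the hash of the previous one makes
    after-the-fact tampering with the audit trail detectable.
    """
    body = {"entity": entity, "action": action,
            "inputs": inputs, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

# Illustrative trail: a program derives a report from a dataset.
r1 = record("patients.csv", "ingest", [])
r2 = record("report.pdf", "aggregate", ["patients.csv"], r1["hash"])
```

Verifying a trail then amounts to recomputing each hash and checking the chain, which is the kind of accountability check a fraud-deterrence or privacy-violation audit would rely on.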
Techniques and mechanisms for
tracking information flow. To me, the
fundamental nature of a privacy violation is an improper information flow. Mechanisms are needed to distinguish
proper and improper flows and to enable authorization of exceptions. Capabilities for tracking the flow of information within programs have matured
substantially in the past decade. Those
capabilities need to be extended to systems of programs.
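The core idea behind such tracking can be sketched with a simple taint-propagation scheme: values carry labels naming their sources, labels propagate through computation, and a flow is flagged as improper when labeled data reaches a sink its labels forbid. This is a toy illustration of the technique, not any particular system's API.

```python
class Tainted:
    """Wrap a value with a set of source labels; labels propagate
    through operations, so any derived value reveals its origins."""
    def __init__(self, value, labels):
        self.value = value
        self.labels = set(labels)

    def combine(self, other, op):
        # The result of combining two values carries both label sets.
        return Tainted(op(self.value, other.value),
                       self.labels | other.labels)

salary = Tainted(90000, {"hr_database"})
bonus = Tainted(5000, {"payroll"})
total = salary.combine(bonus, lambda a, b: a + b)

# A flow to a public sink is improper if any carried label is sensitive.
SENSITIVE = {"hr_database"}
improper = bool(total.labels & SENSITIVE)
```

Extending this from single programs to systems of programs is exactly where labels must survive serialization, network transfer, and storage, which is the open challenge the text identifies.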
Techniques for binding policies to
data and enabling distributed enforcement. If the data itself can carry the
policy to be enforced along with it, each
domain in which it appears can apply
appropriate enforcement. One might
imagine the data also collecting an audit trail as it moves from place to place.
Cryptographic approaches may help.
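A "sticky policy" of this kind can be sketched as an envelope that carries data, a machine-readable policy, and a growing audit trail; each domain that receives the envelope enforces the carried policy and logs its access. The envelope structure and purpose names below are illustrative assumptions (a real design would use cryptography to bind policy to data, as the text notes).

```python
def wrap(data, policy):
    """Attach a machine-readable policy and an empty audit trail to data."""
    return {"data": data, "policy": policy, "audit": []}

def access(envelope, domain, purpose):
    """Each domain enforces the carried policy and appends to the trail."""
    allowed = purpose in envelope["policy"]["permitted_purposes"]
    envelope["audit"].append(
        {"domain": domain, "purpose": purpose, "allowed": allowed})
    return envelope["data"] if allowed else None

env = wrap({"name": "A. Patient"},
           {"permitted_purposes": ["treatment"]})
granted = access(env, "hospital_a", "treatment")   # permitted
denied = access(env, "ad_network", "marketing")    # refused, but logged
```

Note that even the refused access is recorded, so the audit trail accumulates as the data moves from place to place, just as the text imagines.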
Techniques for identifying and
quantifying benefits of large-scale
data analysis and costs of privacy
harms. It is a tall order to model in advance the benefit one may gain from
analyzing some large dataset, since
one does not know what might be
learned. On the other hand, the analysis is usually undertaken with some
objective in mind, and it might be possible to quantify what is to be gained
if that objective is realized. Similarly,
some resources need to be devoted to
anticipating privacy harms and what
damages may occur if large datasets
are abused. These kinds of trade-offs
must be understood as well as possible at the time people are deciding
whether or not to initiate new projects
if there is to be any rigorous risk/benefit analysis in this sphere.
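The trade-off described above can be framed, very crudely, as an expected-value calculation. The formula and all numbers below are illustrative assumptions, not a method endorsed by the text; its point is precisely that estimating these quantities is hard.

```python
def expected_net_benefit(p_success, benefit, p_abuse, harm):
    """Crude expected-value framing: gain if the analysis objective
    is realized, minus expected damages if the dataset is abused."""
    return p_success * benefit - p_abuse * harm

# Illustrative numbers only: a project judged 60% likely to meet its
# objective with a $1M payoff, against a 5% chance of a $4M privacy harm.
net = expected_net_benefit(0.6, 1_000_000, 0.05, 4_000_000)
```

Even this toy model makes the text's point concrete: the decision hinges on quantities (probabilities of success and abuse, magnitudes of benefit and harm) that must be estimated before the project begins.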
Privacy may be difficult to define and culturally dependent, but it nevertheless seems to be universally valued. Future computing systems must incorporate mechanisms for preserving whatever privacy policies people and societies decide to embrace, and research is needed to identify those mechanisms and how they may best be realized.
Carl Landwehr (firstname.lastname@example.org) is Lead Research Scientist at the Cyber Security Policy and Research Institute (CSPRI) at George Washington University in Washington, D.C., and Visiting McDevitt Professor of Computer Science at Le Moyne College in Syracuse, NY.
Copyright held by author.