transparency perspective is closely related to the issue
of maintaining provenance for scientific data [4, 11].
Alan Westin published his landmark study Privacy
and Freedom in 1967 [12]. Still in the age of mainframe computers, it set the stage for thinking about
privacy over the next three decades. Westin presented what has become a classic definition of privacy, emphasizing the individual’s right to control
how personal information “is communicated to others.” An information-accountability perspective on
privacy would reframe this definition, shifting toward
the use of any information. Following Westin, we
would say that privacy is the claim of individuals,
groups, and institutions to determine for themselves
when, how, and to what extent information about
them is used lawfully and appropriately by others.
Westin’s work is essential today for identifying the
role of privacy in a free society. However, advances in
communications and information technology and the
ease of data searching and aggregation have rendered
his definition incomplete as a framework for information policy and information architectures that are
intended to be policy aware.
Will the new tools and laws we’ve described here
put an end to all privacy invasion, unfair misuse of personal information, copyright infringement, and identity theft? Of course not. Perfect compliance is not the
proper standard by which to judge laws or systems that
help enforce them. Rather we should ask how to build
systems that encourage compliance and maximize the
possibility of accountability for violations. We should
see clearly that our information-policy goals cannot be
achieved by restricting the flow of information alone.
While the accountability approach is a departure from
contemporary computer and network policy techniques, it is far more consistent with the way legal rules
traditionally work in democratic societies.
Contemporary information systems depart from
the norm of social systems in the way they seek to
enforce rules up front by precluding the possibility of
violation, generally through the application of strong
cryptographic techniques. In contrast, we follow rules
because we are aware of what they are and because we
know there will be consequences, after the fact, if we
violate them. Technology will better support freedom
by relying on these social compacts than by seeking to
supplant them.
1. Barth, A., Mitchell, J., and Rosenstein, J. Conflict and combination in privacy policy languages. In Proceedings of the ACM Workshop on Privacy in the Electronic Society (Washington, D.C., Oct. 28). ACM Press, New York, 2004, 45–46.
2. Dempsey, J. and Flint, L. Commercial data and national security. The
George Washington Law Review 72, 6 (Aug. 2004).
3. Fair Credit Reporting Act, 15 U.S.C. § 1681; www.law.cornell.edu/uscode/15/usc_sup_01_15_10_41_20_III.html.
4. Golbeck, J. and Hendler, J. A semantic Web approach to the provenance challenge. Concurrency and Computation: Practice and Experience.
5. Jobs, S. Thoughts on Music (Feb. 6, 2007);
6. Kagal, L., Hanson, C., and Weitzner, D. Integrated policy explanations via dependency tracking. In Proceedings of the IEEE Workshop on Policies for Distributed Systems and Networks (June 2–4, 2008).
7. Lunt, T. Protecting Privacy in Terrorist-Tracking Applications.
Presentation to the Department of Defense Technology and Privacy Advisory
Committee (Washington, D.C., Sept. 29, 2003).
8. Samarati, P. Protecting respondents' identities in microdata release. IEEE Transactions on Knowledge and Data Engineering 13, 6 (Nov./Dec. 2001).
9. Solove, D. The Digital Person. New York University Press, New York, 2004.
10. Sweeney, L. k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10, 5 (2002), 557–570.
11. Szomszor, M. and Moreau, L. Recording and reasoning over data provenance in Web and grid services. In Proceedings of the International Conference on Ontologies, Databases, and Applications of Semantics (Catania, Sicily, Italy, 2003). LNCS 2888, 603–620.
12. Westin, A. Privacy and Freedom. Atheneum Press, New York, 1967.
DANIEL J. WEITZNER (email@example.com) is Director of the Massachusetts Institute of Technology Decentralized Information Group, a principal research scientist in the MIT Computer Science and Artificial Intelligence Laboratory, Cambridge, MA, and Technology and Society Policy Director of the World Wide Web Consortium.
HAROLD ABELSON (firstname.lastname@example.org) is the Class of 1922 Professor of
Computer Science and Engineering at the Massachusetts Institute of
Technology, Cambridge, MA.
TIM BERNERS-LEE (email@example.com) is Director of the World Wide Web Consortium, holds the 3Com Founders Chair, and is a senior research scientist in the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology, Cambridge, MA.
JOAN FEIGENBAUM (firstname.lastname@example.org) is the Grace
Murray Hopper Professor of Computer Science at Yale University,
New Haven, CT.
JAMES HENDLER (email@example.com) is the Tetherless World
Professor of Computer and Cognitive Science at Rensselaer
Polytechnic Institute, Troy, NY.
GERALD JAY SUSSMAN (firstname.lastname@example.org) is the Panasonic Professor of Electrical Engineering at the Massachusetts Institute of Technology, Cambridge, MA.
The authors would like to thank Randy Davis and Butler Lampson for their insightful comments on accountability, copyright, and privacy.
The work reported here was conducted at MIT, RPI, and Yale with support from the
National Science Foundation Cybertrust Grant (award #04281) and IARPA (award
© 2008 ACM 0001-0782/08/0600 $5.00