ized as private to them and republish it
to their immediate friends. The discipline of Web Science covered recently
in these pages3 is an attempt to harness transdisciplinary endeavor to try
to understand the complex feedback
cycles between the Web and society.
If ownership and regulation are problematic, what to do? We have two proposals, one modest, one a little deeper.
As things stand, privacy is a game
for the rich and well informed, creating
a digital divide to which one response
is to redress the balance by exploring
ways in which people can perceive advantage from protecting their privacy.
In particular, if we can shift the emphasis from concealment to transparency—from the concealment of data from
potential users, to transparency of how
data is being used—we can begin to
provide answers to questions like “who
is looking at you?” and “what is being
said about you?” Data will continue to
be gathered, aggregated and graphed,
but its use should be clear and traceable. We are of course gesturing toward
the work of Daniel Weitzner and colleagues on information accountability,
reported in this magazine. 11
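The traceability the paragraph calls for can be made concrete with a toy sketch. This is our own illustration, not Weitzner et al.'s actual system: an append-only usage log (all class and field names here are invented) that lets a data subject ask "who is looking at me?" rather than trying to conceal the data itself.

```python
# Toy sketch of transparent, traceable data use (an invented illustration,
# not the information-accountability architecture of Weitzner et al.).
from collections import defaultdict
from datetime import datetime, timezone

class UsageLog:
    """Append-only record of every access to a subject's data."""
    def __init__(self):
        self._entries = defaultdict(list)

    def record(self, subject, accessor, purpose):
        # Every use is logged with who accessed the data, why, and when.
        self._entries[subject].append({
            "accessor": accessor,
            "purpose": purpose,
            "time": datetime.now(timezone.utc).isoformat(),
        })

    def who_is_looking_at(self, subject):
        """The transparency query: every party that has used this subject's data."""
        return sorted({e["accessor"] for e in self._entries[subject]})

log = UsageLog()
log.record("alice", "ad-broker.example", "behavioural targeting")
log.record("alice", "insurer.example", "premium pricing")
print(log.who_is_looking_at("alice"))  # ['ad-broker.example', 'insurer.example']
```

The point of the design is that nothing is concealed or deleted: accountability comes from the completeness of the trail, not from restricting collection.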
With a proper infrastructure in
place, it should be possible to construct legal/technical/economic models where people can be recompensed
for the use of their data—you could be
paid for your clickthroughs. Or perhaps
you would require a donation to a cause
of your choice in return for your clickthroughs. If others are making money
from observing your activity, it doesn’t
seem outrageous that you or your nominees should be compensated.
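The economics of such recompense can be sketched in a few lines. The rates and figures below are invented for illustration only; the article proposes no particular revenue split.

```python
# Hypothetical illustration: if observers monetize a user's clickthroughs,
# a fixed share could flow back to the user, or to a charity they nominate.
# All numbers and the share parameter are invented, not from the article.
def settle_clickthroughs(clicks, revenue_per_click, user_share, to_charity):
    """Split clickthrough revenue between the platform and the data subject."""
    total = clicks * revenue_per_click
    user_cut = total * user_share
    return {
        "platform": round(total - user_cut, 2),
        ("charity" if to_charity else "user"): round(user_cut, 2),
    }

# e.g. 1,000 clicks at $0.05 each, with 20% donated to a nominated charity:
print(settle_clickthroughs(1000, 0.05, 0.20, to_charity=True))
# {'platform': 40.0, 'charity': 10.0}
```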
It may be that the commercial thirst
for consumer data is about to wane as
the global financial crisis undermines
advertising, and therefore the business
models of many Internet companies.
But this idea is just one instance of a
more general principle of reciprocity between technology developers and
information subjects. If a technology
makes public service more efficient,
or a business process more profitable,
then it should also be used reciprocally
to aid the citizen or consumer.
If government officials have better
access to data as a result of technology,
then citizens should too—improved
data for government implying more
freedom of information. Indeed, this
is the thrust of the Making Public Data
Public project on which Nigel Shadbolt
and Tim Berners-Lee are advising the
U.K. government. 2 Although the project is focusing on non-personal public
sector information, the premise is that
more data increases transparency and
can drive public sector improvement
and reform. In the context of personal
information, a consumer should be
able to get improvements in data protection, for example by being able to
use technology to enforce access to
information in the many jurisdictions
where such enforcement is currently lacking.
As our rights as citizens and as consumers seem to be coming together,
markets could be redefined to change
the incentives to protect one’s own privacy and respect that of others, for example, as with principles such as ‘the polluter pays.’ The analogy with pollution
is suggestive for our more fundamental
idea—an invasion of privacy has things
in common with pollution, in particular
that the individual benefits and costs do
not capture the full social costs.
In many jurisdictions, particularly
common law ones, the complexities
of privacy are dealt with by exploiting
collective wisdom, referring individual
cases to a reasonable expectation of
privacy. In other words, if one behaves
in such a way that one could not reasonably expect to be private, then others are not liable for invading one’s privacy. Reasonable expectations change
through time and space, making law
sensitive to context.
Online, reasonable expectations are
diminishing all the time, as our clicks
are logged and people generously give
information about themselves and others away to their social networks. Surveillance is becoming the norm, with
the complicity of many data subjects.
But might this be a social harm?
Privacy is essential for the proper
functioning of a liberal, democratic society. Some benefits may accrue to the
individual (who gains autonomy, a space
of intimacy, freedom of speech, and so
forth). But equally benefits accrue to
society—a free, liberal polity of autonomous individuals is a public good, in the
same way that clean air is. Everyone benefits, even if not everyone contributes.
If privacy is a public, not a private,
good, then talking exclusively of rights
is not the right way to go. Perhaps we
should be talking of the responsibilities
of privacy too. This would involve something of a culture change, especially in
our voyeuristic society. 1 But this would
not be unprecedented: it was privacy
activists, not the law, who pressured Web sites in the 1990s to respect privacy rather than promiscuously gather and sell consumers’ data. 4
Perhaps it is our duty to ensure that
reasonable expectations of privacy are maintained.
1. Anderson, D. The failure of American privacy law. In B.S. Markesinis, Ed., Protecting Privacy. Oxford University Press, Oxford, 1999.
2. Berners-Lee, T. and Shadbolt, N. Put your postcode in, out comes the data. The Times (Nov. 18, 2009).
3. Hendler, J. et al. Web science: An interdisciplinary approach to understanding the Web. Commun. ACM 51, 7 (July 2008), 60–69.
4. Hetcher, S.A. Norms in a Wired World. Cambridge University Press, Cambridge, 2004.
5. Meiss, M.R., Menczer, F., and Vespignani, A. Structural analysis of behavioural networks from the Internet. Journal of Physics A: Mathematical and Theoretical 41, 22 (June 2008); doi: 10.1088/1751-
6. O’Hara, K. and Shadbolt, N. The Spy in the Coffee Machine: The End of Privacy As We Know It. Oneworld,
7. Pitt-Payne, T. Access to electronic information. In C. Reed and J. Angel, Eds., Computer Law: The Law and Regulation of Information Technology, 6th ed. Oxford University Press, Oxford, 2007.
8. Poullet, Y. and Dinant, J.M. The Internet and private life in Europe: Risks and aspirations. In A.T. Kenyon and M. Richardson, Eds., New Dimensions in Privacy Law. Cambridge University Press, Cambridge, 2006.
9. Shadbolt, N., Hall, W., and Berners-Lee, T. The Semantic Web revisited. IEEE Intelligent Systems 21, 3 (May/June 2006), 96–101.
10. Walden, I. Privacy and data protection. In C. Reed and J. Angel, Eds., Computer Law: The Law and Regulation of Information Technology, 6th ed. Oxford University Press, Oxford, 2007.
11. Weitzner, D. et al. Information accountability. Commun. ACM 51, 6 (June 2008), 82–87.
Kieron O’Hara (email@example.com) is a senior research fellow in Electronics and Computer Science at the University of Southampton, and a research fellow of the Web Science Trust at the University of Southampton, U.K.
Nigel Shadbolt (firstname.lastname@example.org) is Professor of Artificial Intelligence and Deputy Head (Research) of the School of Electronics and Computer Science at the University of Southampton, and information advisor to the