includes my home as a starting point;
my request is too easily disaggregated
from the bundle.
Bear in mind that system designers need not completely eliminate the transfer of location information; it would be sufficient to reduce its precision to the point where the resulting preference mapping gives the attacker or marketer little with which to work. Given the decreasing cost of memory and bandwidth, it is both effective and inexpensive to simply blur the location estimate provided with a request for mapping functionality. An LBS user may,
for example, submit a request to the
Doppio Detector that includes his or
her location as “somewhere in downtown Ithaca,” rather than a specific
address. The server will respond with
a map that indicates the locations of
all the espresso shops in downtown
Ithaca. The user’s handset can then
use its more precise knowledge of his
or her location to determine the nearest espresso shop and generate directions accordingly.
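The coarse-request pattern just described can be sketched as follows; the grid size, search radius, and shop data are illustrative assumptions, not part of the original service:

```python
import math

def blur_location(lat, lon, cell_deg=0.01):
    """Snap precise coordinates to a coarse grid cell (roughly 1 km
    per 0.01 degree of latitude), hiding the exact position."""
    return (round(lat / cell_deg) * cell_deg,
            round(lon / cell_deg) * cell_deg)

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def shops_near(coarse_location, catalog, radius_km=2.0):
    """Server side: sees only the blurred location and returns every
    shop near it, not just the single nearest one."""
    return [s for s in catalog if distance_km(coarse_location, s[1]) <= radius_km]

def nearest(precise_location, shops):
    """Client side: the handset picks the closest shop using the precise
    position, which never leaves the device."""
    return min(shops, key=lambda s: distance_km(precise_location, s[1]))
```

Note the design constraint this implies: the server's search radius must comfortably exceed the blurring cell size, or shops near the user but far from the cell center could be missed.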
Anonymity can also be preserved by limiting the length m of each location trace. This limitation is accomplished by preventing the LBS from determining which requests, if any, come from a given user. As described in Wicker,27 public-key infrastructure and encrypted authorization messages can be used to authenticate users of a service without revealing their actual identities. Random tags can then be used to route responses back to anonymous users. Anonymity for frequent users of an LBS may thus be protected by associating each request with a different random tag, and all users of the LBS enjoy a form of k-anonymity. Coupled with coarse location estimates or random location offsets, this approach shows great promise for preserving user anonymity while allowing users to enjoy the benefits of location-based services. (Privacy-preserving data-mining techniques, such as those developed by Evfimievski et al.,11 may also provide solutions.)

The increasing precision of cellular-location estimates has reached a critical threshold; using access-point and cell-site location information, service providers can obtain location estimates with address-level precision. Compilation of these estimates
creates a serious privacy problem, as
it can be highly revealing of user behavior, preferences, and beliefs. The
subsequent danger to user safety and
autonomy is substantial.
To determine the extent to which
location data can be anonymized, this
article has explored the Shannon-theoretic concept of unicity distance to reveal the dynamics of correlation attacks
through which existing data records are
used to attribute individual identities
to allegedly anonymous information.
With this model in mind, it has also laid out rules of thumb for designing anonymous location-based services. Critical to these rules are the maintenance of a coarse level of granularity for any location estimate made available to service providers and the disassociation of repeated requests for location-based services, which prevents the construction of long-term location traces.

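The unicity-distance model invoked above can be made concrete with a toy calculation. Shannon's unicity distance is the key entropy divided by the redundancy per observed symbol; in the correlation-attack setting the "key" is the user's identity and each location sample leaks some redundancy. The specific entropy and leakage figures below are assumptions chosen purely for illustration:

```python
import math

def unicity_distance(key_entropy_bits, redundancy_bits_per_obs):
    """Shannon's unicity distance H(K)/D: the expected number of
    observations after which only one key (here, one identity)
    remains consistent with the observed data."""
    return math.ceil(key_entropy_bits / redundancy_bits_per_obs)

# Assumed illustration: distinguishing one user among ~1M candidates
# (20 bits of identity entropy) from location samples that each leak
# about 2 bits of redundancy requires only on the order of 10 samples.
```

The practical lesson matches the rules of thumb above: shrinking the redundancy of each sample (coarser estimates) and capping the number of linkable samples (shorter traces) both push the attacker below the unicity threshold.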
This work is funded in part by the National Science Foundation TRUST Science and Technology Center and the
NSF Trustworthy Computing Program.
I gratefully acknowledge the technical and editorial assistance of Sarah
Wicker, Jeff Pool, Nathan Karst, Bhaskar Krishnamachari, Kaveri Chaudhry,
and Surbhi Chaudhry.
1. Agnew, J.A. Place and Politics: The Geographical Mediation of State and Society. Unwin Hyman, London, 1987.
2. Agre, P.E. Computation and Human Experience. Cambridge University Press, Cambridge, U.K., 1997.
3. Apple. Q&A on Location Data; http://www.apple.com/
4. Bilton, N. 3G Apple iOS devices are storing users' location data. The New York Times (Apr. 20, 2011).
5. Blumenthal, J., Reichenbach, F., and Timmermann, D. Position estimation in ad hoc wireless sensor networks with low complexity. In Proceedings of the Second Joint Workshop on Positioning, Navigation, and Communication and First Ultra-Wideband Expert Talk (Hannover, Germany, Mar. 2005), 41–49.
6. Clarke, R.A. Information technology and dataveillance. Commun. ACM 31, 5 (May 1988), 498–512.
7. Cresswell, T. Place: A Short Introduction. Wiley-Blackwell, Malden, MA, 2004.
8. Deleuze, G. Postscript on the societies of control. October 59 (Winter 1992), 3–7.
9. Djuknic, G.M. and Richton, R.E. Geolocation and assisted GPS. Computer 34 (Feb. 2001), 123–125.
10. Durrell, L. Balthazar. Faber & Faber, London, 1960.
11. Evfimievski, A., Srikant, R., Agrawal, R., and Gehrke, J. Privacy-preserving mining of association rules. In Proceedings of the Eighth ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (Edmonton, July 23–26). ACM Press, New York, 2002.
12. Federal Communications Commission. Notice of Proposed Rulemaking Docket 94-102. Washington, D.C., 1994.
13. Ghinita, G., Kalnis, P., Khoshgozaran, A., Shahabi, C., and Tan, K.-L. Private queries in location-based services: Anonymizers are not necessary. In Proceedings of the ACM SIGMOD International Conference on Management of Data (Vancouver, B.C., June 9–12). ACM Press, New York, 2008, 121–132.
14. Gruteser, M. and Grunwald, D. Anonymous usage of location-based services through spatial and temporal cloaking. In Proceedings of the First International Conference on Mobile Systems, Applications, and Services (San Francisco, May 5–8). ACM Press, New York, 2003, 31–42.
15. Hansell, S. AOL removes search data on vast group of Web users. The New York Times (Aug. 8, 2006).
16. Kaplan, E.D. Understanding GPS: Principles and Applications. Artech House Publishers, Boston, 1996.
17. Khoshgozaran, A. and Shahabi, C. Blind evaluation of nearest-neighbor queries using space transformation to preserve location privacy. In Proceedings of the 10th International Symposium on Spatial and Temporal Databases (Boston, July 16–18). Springer-Verlag, Berlin, 2007, 239–257.
18. Kifer, D. and Machanavajjhala, A. No free lunch in data privacy. In Proceedings of the SIGMOD 2011 International Conference on Management of Data (Athens, June 12–16). ACM Press, New York, 2011.
19. Malpas, J. Place and Experience: A Philosophical Topography. Cambridge University Press, Cambridge, 1999.
20. Morrissey, S. iOS Forensic Analysis for iPhone, iPad, and iPod Touch. Apress, New York, 2010.
21. Narayanan, A. and Shmatikov, V. Robust de-anonymization of large sparse datasets. In Proceedings of the 2008 IEEE Symposium on Security and Privacy (Oakland, CA, May 18–21). IEEE Computer Society Press, Washington, D.C., 2008.
22. Netflix Prize Rules; http://www.netflixprize.com//rules
23. Relph, E. Place and Placelessness. Routledge & Kegan Paul, London, 1976.
24. Shannon, C. Communication theory of secrecy systems. Bell System Technical Journal 28, 4 (Oct. 1949), 656–715.
25. Sweeney, L. k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 10, 5 (Oct. 2002), 557–570.
26. Tribble, G.B. Testimony of Dr. Guy "Bud" Tribble, Vice President for Software Technology, Apple Inc.; http://
27. Wicker, S.B. Cellular telephony and the question of privacy. Commun. ACM 54, 7 (July 2011), 88–98.
28. Williamson, J. Decoding Advertisements: Ideology and Meaning in Advertising. Marion Boyars Publishers Ltd., London, 1978.
29. Yoshida, J. Enhanced 911 service spurs integration of GPS into cell phones. EE Times (Aug. 16, 1999).
30. Zang, H. and Bolot, J.C. Anonymization of location data does not work: A large-scale measurement study. In Proceedings of the 17th Annual International Conference on Mobile Computing and Networking (Las Vegas, Sept. 19–23). ACM Press, New York, 2011.
Stephen B. Wicker (firstname.lastname@example.org) is a professor in the School of Electrical and Computer Engineering at Cornell University, Ithaca, NY, and a member of the graduate fields of Information Science and Computer Science.