A TraceAudit is another idea that
helps users connect with their data over
time. The TraceAudit builds on the idea
of an Internet traceroute and relies on
careful logging procedures. An interface that allows users access to logs can
let them trace how their data is used by
an application, where the data has been
shared, and who has had access to it.
For example, a TraceAudit of data use
in PEIR can show participants exactly
how their raw location traces become
calculations of impact and exposure,
and how data is shared during that
process. A log could show users that
their PDV sent PEIR raw data on weekdays between the hours of 7 a.m. and 8
p.m. PEIR performs activity classification based on this raw data and sends
a summary of the activities and the ZIP
codes in which they occurred to the
California Air Resources Board. PEIR
receives back PM2.5 (fine particle) pollution exposure and CO2 emission values to correspond with these activities
and ZIP codes. PEIR then saves and displays these total calculations for users.
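The data flow just described could be captured as an append-only audit log, which a TraceAudit interface would render for the user. The following is a minimal sketch, not PEIR's actual implementation; the actor names, data kinds, and timestamps are hypothetical illustrations of the flow above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass(frozen=True)
class AuditEntry:
    """One hop in the data's journey: who did what with which data."""
    timestamp: datetime
    actor: str        # e.g., "PDV", "PEIR"
    action: str       # e.g., "sent", "classified", "received"
    data_kind: str    # e.g., "raw location trace"
    recipient: str    # who the data went to ("" if internal)

@dataclass
class TraceAudit:
    """Append-only log that users can query to trace their data."""
    entries: List[AuditEntry] = field(default_factory=list)

    def record(self, entry: AuditEntry) -> None:
        self.entries.append(entry)

    def trace(self) -> List[str]:
        """Render the data's path as human-readable lines."""
        return [
            f"{e.timestamp:%Y-%m-%d %H:%M} {e.actor} {e.action} "
            f"{e.data_kind}" + (f" -> {e.recipient}" if e.recipient else "")
            for e in self.entries
        ]

# The PEIR flow described above, as hypothetical log entries:
log = TraceAudit()
log.record(AuditEntry(datetime(2009, 6, 1, 7, 30), "PDV",
                      "sent", "raw location trace", "PEIR"))
log.record(AuditEntry(datetime(2009, 6, 1, 8, 0), "PEIR",
                      "classified", "activities from raw data", ""))
log.record(AuditEntry(datetime(2009, 6, 1, 8, 5), "PEIR",
                      "sent", "activity summary + ZIP codes",
                      "California Air Resources Board"))
log.record(AuditEntry(datetime(2009, 6, 1, 8, 10), "PEIR",
                      "received", "PM2.5 exposure and CO2 values", ""))
for line in log.trace():
    print(line)
```

Because entries are only ever appended, the log preserves the full history of use and sharing, which is what makes the trace trustworthy to the participant.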
The TraceAudit provides transparency and accountability, helping individuals to see how PEIR has used and shared their data.

Challenges Beyond Technology
System design that pays attention to
participant primacy, longitudinal engagement, and data legibility will help
users make data-sharing decisions and
protect their privacy in participatory
sensing. Technical decisions, however,
won’t be enough to ensure privacy for
sensing participants. Participant engagement in privacy decision making
must also be fortified by supporting social structures.
Participatory sensing opens the
door to entirely new forms of granular and pervasive data collection. The
risks of this sort of data collection are
not always self-evident. Even if we give
people options for managing their
data, they may not understand the benefits of doing so. Data literacy must be
acquired over time through many avenues. Public discussion and debate
about participatory sensing will be critical to educating participants about the
risks and possibilities of sensing data.
Discussion forums and blogs play an
important role, as do traditional media
and even community groups.
Further, partakers in participatory sensing will need to understand what happens with their data
once it leaves their personal vault and
is used by third-party applications.
Diverse and plentiful applications for participatory sensing data can help to realize the potential of participatory sensing, but will also make it
difficult for participants to understand
which applications are trustworthy and
abide by acceptable data practices. Participants need to know what they are
signing up for—and cryptic, fine-print
EULAs (end-user license agreements)
aren’t the answer.
Users should know how long an
application will retain their data and
whether it will pass the data on to other
parties. A voluntary labeling system,
much like “Fair Trade” labels on food,
could help consumers distinguish applications that abide by a minimum set
of responsible data practices. These
might include logging data use and
keeping audit trails, and discarding
location data after a specified period
of time. Such measures could help to increase the transparency of participatory sensing applications.
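A retention rule like "discard location data after a specified period" is straightforward to enforce mechanically. Below is a hedged sketch of how an application might check records against such a policy; the data kinds, retention windows, and record format are assumptions for illustration, not a prescribed standard.

```python
from datetime import datetime, timedelta
from typing import Dict, List, Tuple

# Hypothetical minimum-practice policy: maximum retention per data kind.
RETENTION: Dict[str, timedelta] = {
    "location": timedelta(days=30),          # discard location data after 30 days
    "activity_summary": timedelta(days=365), # summaries may be kept longer
}

def expired(records: List[Tuple[str, datetime]],
            now: datetime) -> List[Tuple[str, datetime]]:
    """Return the records that have outlived their retention window."""
    return [(kind, ts) for kind, ts in records
            if now - ts > RETENTION.get(kind, timedelta(0))]

now = datetime(2009, 6, 1)
records = [("location", datetime(2009, 4, 1)),          # 61 days old: expired
           ("location", datetime(2009, 5, 20)),         # 12 days old: keep
           ("activity_summary", datetime(2008, 1, 1))]  # over a year: expired
print(expired(records, now))
```

An application claiming a "responsible data practices" label could run such a check on a schedule and log each deletion to its audit trail, making the promise verifiable rather than merely stated.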
Finally, enhanced legal protections
for unshared vault data can encourage
participation in participatory sensing.
Ongoing work is investigating the possibility of a legal privilege for personal-sensing data. Such a privilege could be
enabled by statute and modeled on attorney-client or doctor-patient privilege.
While lawyers and social scientists
work on structural changes to help
ensure privacy in participatory sensing, many of the initial and critically
important steps toward privacy protection will be up to application developers. By innovating to put participants first, we can create systems that
respect individuals’ needs to control
sensitive data. We can also augment
people’s ability to make sense of such
granular data, and engage participants in making decisions about that
data over the long term. Through attention to such principles, developers
will help to ensure that four billion
little brothers are not watching us. Instead, participatory sensing can have a
future of secure, willing, and engaged participants.

Many thanks to collaborators Jeffrey
Burke, Deborah Estrin, and Mark Hansen, whose ideas and contributions
have shaped this material.
This article is based upon work supported by the National Science Foundation under Grant No. 0832873.
Katie Shilton is a doctoral student in information studies at UCLA. Her research explores privacy and ethical challenges raised by ubiquitous sensing technologies, and she coordinates a research project at the Center for Embedded Networked Sensing focused on these issues.