data.12,13 These codes are still considered a gold standard for privacy protection.14 But the principles, designed for corporations or governments rather than many distributed data collectors, are no longer enough. Data gathered during participatory sensing is more granular than traditional personal data (name, Social Security number, among others). It reveals much more information about an individual's habits and routines. Furthermore, data is no longer gathered solely by large organizations or governments with established data practices. Individuals or community groups might create participatory sensing applications and begin collecting personal data.15
Enabling Participation in Privacy
This is where the responsibility of developers comes into the equation. How
can developers help individuals or
small groups launching participatory
sensing projects implement appropriate data-protection standards? To create workable standards with data so
granular and personal, systems must
actively engage individuals in their
own privacy decision making. At CENS,
we call this participatory privacy regulation—the idea that systems can help
users to negotiate disclosure decisions
depending on context (who is asking,
what is being asked for, and so on). We
need to build systems that improve users’ ability to make sense of, and thereby regulate, their privacy.
Building such systems is a major, unmet challenge.6 As the first steps
toward meeting this challenge, we
propose three new principles for developers to consider and apply when
building mobile data-gathering applications. These principles are purposefully broad, because “acceptable” data
practices might vary across applications
(a medical project might be justified in
collecting much more personal data,
with stringent protections, than a community documentation project). These
principles are thinking tools to help
developers adapt privacy protections to
participatory sensing applications.
Participant primacy. The goal of participatory privacy regulation is to give
participants as much control over their
location data as possible. GPS traces
or the secondary traces created by
geotagged media should belong to individuals. Participants should be able to make and revoke decisions to share subsets of the data with third-party applications. Framed this way, participants are not just subjects of data collection, but take the role of investigators (when they collect data to participate in self-analytic applications) or co-investigators (when they contribute their data to larger research initiatives). As such, they should have input into how data is collected, processed, stored, and discarded.

"Privacy is a vital part of your identity and self-presentation. Deciding what to reveal to whom is part of deciding who you are."
Developers can enable participants
to own and manage their data by tailoring access-control and data-management tools for use by individual
participants. Users collecting revealing
sensing data are going to need secure
storage and intuitive interfaces to manage access and sharing. As an example,
CENS researchers are developing a
personal data vault (PDV) to give individuals private and robust storage for
their sensing data. The PDV provides
services such as authentication and
access control, allowing participants
not only to collect all of their sensing
data in one place, but also to specify
which individuals and groups in their
social network can see which datasets.
Similar tools are in development in research labs at Stanford8 and AT&T,1 and are not unlike commercial applications such as Google Health5 and Microsoft's HealthVault.9
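The workflow described here (private storage for sensing data, plus owner-managed, revocable, per-dataset access grants) can be sketched in a few lines. This is an illustrative sketch only; the class and method names are hypothetical and do not reflect the actual CENS PDV code:

```python
# Minimal sketch of a personal-data-vault pattern: the participant owns
# the store and decides, dataset by dataset, who in their social network
# may read it. All names here are hypothetical illustrations.

class PersonalDataVault:
    def __init__(self, owner):
        self.owner = owner
        self.datasets = {}   # dataset name -> list of sensing records
        self.acl = {}        # dataset name -> set of authorized users

    def store(self, dataset, record):
        """Collect a sensing record into the owner's private store."""
        self.datasets.setdefault(dataset, []).append(record)

    def grant(self, dataset, user):
        """The owner specifies who can see which dataset."""
        self.acl.setdefault(dataset, set()).add(user)

    def revoke(self, dataset, user):
        """Sharing decisions can be revoked, not just made."""
        self.acl.get(dataset, set()).discard(user)

    def read(self, dataset, requester):
        """Only the owner or explicitly authorized users get the data."""
        if requester == self.owner or requester in self.acl.get(dataset, set()):
            return list(self.datasets.get(dataset, []))
        raise PermissionError(f"{requester} may not read {dataset}")
```

In use, a participant would store GPS traces under one dataset name, grant a research group access to it, and later revoke that grant; the vault enforces the decision at every read.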
As developers build data-management tools to put personal data control back in the hands of individuals,
they will need to think about which
controls users will need to make privacy and sharing decisions. At a very
basic level, sharing decisions should
take into account identity (who’s asking?), time (send data only between 9
a.m. and 5 p.m.), location (send data
only when I’m on campus), and data
type (share only geotagged photos).
More advanced techniques for developers to consider include access controls based upon activity (share only
driving routes) or routine (don’t share
anomalous routes).
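As a rough illustration, the basic dimensions above (identity, time, location, and data type) could combine into a single sharing predicate. The function and the `on_campus` geofence helper are hypothetical, offered only to make the idea concrete:

```python
# Sketch of a context-aware sharing rule: data leaves the vault only
# when every dimension of the user's policy is satisfied. All names
# and policy values here are hypothetical examples.

from datetime import time

def on_campus(location):
    # Placeholder geofence check; a real application would test GPS
    # coordinates against actual campus boundaries.
    return location == "campus"

def may_share(record, requester, now, location):
    """Return True only when identity, time, location, and data type all pass."""
    allowed_identity = requester in {"research_team"}       # who's asking?
    allowed_time = time(9, 0) <= now <= time(17, 0)         # 9 a.m. to 5 p.m.
    allowed_place = on_campus(location)                     # only on campus
    allowed_type = record.get("type") == "geotagged_photo"  # data type
    return allowed_identity and allowed_time and allowed_place and allowed_type
```

The more advanced controls mentioned above, such as rules keyed to activity or routine, would slot in as additional conjuncts once the application can classify routes.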
Application developers can further
protect participant primacy by limiting the amount of raw data a participant is required to share outside of the
vault. When privacy is at stake, more
data is not always better. For example,
participants in Biketastic may collect