access permissions can be set per app
to limit access to that data [4]. And why
would a person consider whether it’s OK
for an app to access some of their data
if they have no awareness this data is
being accessed in the first place? Indeed,
the situation is not much different on
the Web. Those of us in some areas of computing may take it for granted
that unless we use services like ad
blockers or virtual private networks, we
are being tracked across interlocking
webs via mechanisms like fingerprinting
and cookies. For instance, every time we
put a URL into a social media feed and
it is shortened by that service, that URL
reflects its path through the network—
who has used it, who has looked at it,
where they’ve gone after visiting it, and
so on [5]. Our social networks and beliefs are effectively exposed.
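To make the mechanism concrete, here is a minimal sketch, in Python with the Flask microframework, of what such a shortener might look like; the slug, destination, and logged fields are invented for illustration, but the structure is the point: Every click routes through the service, which is therefore positioned to record it.

```python
# A hypothetical link shortener (invented slug, destination, and log
# fields), sketched to show the tracking position such a service holds:
# every click must pass through it before reaching the destination.
from datetime import datetime, timezone

from flask import Flask, redirect, request

app = Flask(__name__)

LINKS = {"abc123": "https://example.org/article"}  # slug -> destination
CLICK_LOG = []  # a real service would persist this, per link and per user

@app.route("/<slug>")
def follow(slug):
    # Record who clicked, from where, and with what, then pass them on.
    CLICK_LOG.append({
        "slug": slug,
        "when": datetime.now(timezone.utc).isoformat(),
        "ip": request.remote_addr,
        "referrer": request.headers.get("Referer"),
        "user_agent": request.headers.get("User-Agent"),
    })
    return redirect(LINKS[slug], code=302)

if __name__ == "__main__":
    app.run()
```

Multiply such a log across millions of shortened links and the path a URL traces through the network becomes a commodity. New research-based services like TrackMeNot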
(https://cs.nyu.edu/trackmenot/)
run randomized Web searches from our browsers in order to confuse the profile that can be constructed from our footsteps through these pathways of the Net.
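The obfuscation idea itself is simple enough to sketch. What follows is not TrackMeNot's actual code, only an illustration of the principle, with a placeholder word list and search URL:

```python
# Not TrackMeNot's actual implementation: a bare sketch of the
# decoy-query idea, issuing plausible searches at human-irregular
# intervals so that logged queries no longer form a coherent profile.
import random
import time

import requests  # third-party HTTP library

DECOY_QUERIES = ["weather radar", "pasta recipes", "used bicycles",
                 "tide tables", "chess openings", "garden soil ph"]

def run_decoys(n=5):
    """Issue n plausible decoy searches at irregular intervals."""
    for _ in range(n):
        requests.get("https://search.example.com/search",
                     params={"q": random.choice(DECOY_QUERIES)},
                     timeout=10)
        time.sleep(random.uniform(5, 60))  # irregular pacing, like a person

if __name__ == "__main__":
    run_decoys()
```

The decoys cost the searcher little but degrade the value of the logged record. Problematically, however, the few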
studies that have looked at how tracking
is perceived show that only a small
number of people in the general public
are aware of the degree to which they
are being followed online, or that their
Internet traffic is being shared among
various, mostly commercial entities.
The data suggests that when people do
learn of this tracking, they characterize
it as creepy [6] and want to find ways to control it [7].
We see this awareness effect in
other data-related transactions: Once
people are aware of what is happening
to their data without their consent,
they demand better conditions. What
of the privacy paradox, then? This is the observation that people say they're concerned about privacy, yet if you put a form in front of them and request personal data, they readily hand it over. As
more researchers have now shown, this
response is not a contradiction. We
are a sense-making people: If we are
asked for something—especially tied to
something we want—we assume there
must be a rationale for it. We assume
the best. When responses are probed,
however, many people who provide very
personal data to a service do so without
a clear model of how that data may be
used by the service itself; how that data
may be used by other people accessing
that service; what of that data is actually
necessary for the service to function;
and the risks associated with sharing
that data. We are busy: It is easier to
trust there is a good reason for this data
request, it seems, than to stop and check
if we’re being scammed. Indeed, we
need only consider the outrage when
those who do stop and look raise a red
flag about terms and conditions. Doing
so, however, has required the work of
what we might call social interpreters
to translate the language of the revised
terms and conditions, moving it out of
the abstract and into concrete terms
that are meaningful to people. These
changes otherwise remain opaque,
again making our consent socially
meaningless.
TOWARD APPARENCY
AND SEMANTIC/PRAGMATIC
TRANSPARENCY
Just from the above scenarios, we can see
numerous opportunities for interaction
research and UX design to change
the status quo around data consent.
Fundamental to any change, however,
is to see a need for it. This is what we’ve
been calling apparency. One may have
very well-defined terms and conditions,
but if people don’t even know that their
contacts are being accessed by a puzzle
game they downloaded, if this use is not
apparent in the first place, transparency
about the terms of an unperceived
process is at best meaningless.
As designers, we can help to develop
the means to make such data processes
apparent in order for the terms to
be meaningfully transparent. In the
context of ubiquitous computing,
Matthew Chalmers [8] framed making
the properties of a system apparent
as “seamfulness,” as opposed to
seamlessness or, more particularly,
sameness. For instance, rather than
hiding which cellphone tower a phone
may be using, it might be better to make
this information available. Some people
might find it useful and empowering:
Being able to look under the hood of
a system at various levels of detail,
specifically in order to engage with it
and change it, is a valuable property.
In data-driven services—like most
of those on the Internet—one can point
to the terms and conditions and label
them as either transparent or opaque,
based on the language used and the
specificity of descriptions. But such transparency often amounts to no more than an acknowledgement that data is being
collected and that it may be used to
“improve the quality of the service”—as
cookie notices on websites in the EU
constantly assert without explanation
of what or how, exactly. Apparency
would seek to make those connections
clear and traceable toward meaningful
transparency. For example, there are
no cues to the user of a downloaded
game that make it apparent that there
are personal-data settings associated
with this app and that changing them
(or not) will have an effect on risks
of burglary (GPS access), identity
theft (contacts access), workplace
harassment (enabling anyone online
to see pictures from social occasions),
job-selection discrimination (social
media commentary being available), or
preferential or discriminatory pricing
[9].
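One could imagine surfacing exactly these connections at the moment of consent. The toy Python sketch below makes the idea concrete; the permission names and risk wordings are invented here, mapped from the examples just given, and belong to no real platform's API:

```python
# A toy sketch of apparency at the moment of consent: translate an
# app's permission requests into the concrete risks they carry.
# Permission names and risk wordings are illustrative only.
RISKS = {
    "gps": "location history can show when your home is empty (burglary)",
    "contacts": "your address book can be harvested for identity theft",
    "photos": "pictures from social occasions can surface at work",
    "social_media": "public commentary can feed job-selection screening",
    "purchases": "profiles can enable preferential or discriminatory pricing",
}

def apparency_notice(app_name, requested):
    """Translate a permission list into plain-language consequences."""
    lines = [f"{app_name} is asking for access that could mean:"]
    for perm in requested:
        lines.append(f"  - {RISKS.get(perm, 'an undisclosed use of ' + perm)}")
    return "\n".join(lines)

print(apparency_notice("PuzzleFun", ["gps", "contacts"]))
```

Nor is it readily apparent that shared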
data is churned into use for targeting
advertisements, not only on the site
where the data originates but also from
that site to other sites, and through a
network of brokers and advertisers, as a
person surfs the Web [5]. The simple act
of touching these sites is of course itself
valuable data that is both unapparent
and untransparent.
Indeed, we might frame a progressive scale from apparency
to transparency, in which we have
apparency, semantic transparency,
and pragmatic transparency. Let’s
call it apparency to s/p transparency.
Apparency reflects how an activity—in
this case a data activity—is signaled.
Semantic transparency addresses
whether we know what the terms of
the apparent activity are and mean;
pragmatic transparency reflects the
degree to which we know what these
data actions actually do or entail.
There are already lovely examples
of apparency to s/p transparency
design online. One elegant, motivating
example is the very simple HTTPS
protocol. That S makes a transparent
process unobtrusively apparent: that the
connection between you and a website
is secure and encrypted, that the data
is not out in the clear for anyone to see.
Increasingly the S is backed up by a
padlock icon in the browser’s address
bar to indicate a secure channel. If
one is unfamiliar with the padlock,
clicking on it usually displays text to
make more of the semantics of the
process apparent: that data is being
transmitted over an encrypted channel.
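That same inspection can be reproduced outside the browser. As a sketch, Python's standard ssl library will open the same kind of verified connection and print the details a padlock click reveals (example.org stands in for any site):

```python
# A sketch of what the padlock makes inspectable, using only Python's
# standard library: open a verified TLS connection and print the
# certificate details a browser would show.
import socket
import ssl

host = "example.org"
ctx = ssl.create_default_context()  # verifies the certificate chain

with socket.create_connection((host, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=host) as tls:
        cert = tls.getpeercert()
        print("protocol :", tls.version())        # e.g., TLSv1.3
        print("cipher   :", tls.cipher()[0])      # negotiated cipher suite
        print("issued to:", dict(p[0] for p in cert["subject"]))
        print("issued by:", dict(p[0] for p in cert["issuer"]))
        print("expires  :", cert["notAfter"])
```

Each printed field is a piece of semantic transparency: not just that encryption is on, but under what terms.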
For pragmatic transparency, these