Given the current (some might say
insulting) approach to consent, this
status quo is not surprising. Regulation,
unfortunately, has to date created only
a veneer of consent, a legal illusion of
choice and control, while design has not
delivered interactions that support the
genuinely informed choices that regulators
assume data-harvesting services should respect.
The stats are very clear about how
broken terms-and-conditions pages are:
If we were to read these “agreements,” it
would be nearly a full-time job [1]. More
disturbingly, these terms are generally
written to require a level of reading
comprehension well beyond the norm
for the population using these
services [2]. So even if we did read these
T&Cs, most of us wouldn’t understand
them. When we give our consent under
such conditions, it isn’t meaningful.
We are also asked to consider these
terms at exactly the moment that HCI
work on task interruption has shown to be
the wrong time [3]: when the request
gets in the way of our primary task.
We click the “agree” button because
clicking it gets rid of the screen so that
we can get on with posting our cat video
or uploading a draft of our paper to a
co-editing site or synchronizing our
calendar with a cloud service.
Many of us became acclimatized to
this meaningless box clicking when
installing software: Yes, yes, we don’t
own it; uh huh, we’re just leasing it; and
no, we won’t make copies of it. Sure. In
those days, however, it was rare for
software to call home to the mothership
to locate our particular copies. Now,
it is commonplace for software to be
deployed as a service that knows exactly
where it is and how many copies have
been authorized. But that service,
especially when deployed on phones
or other mobile devices, gathers far more
information than registration confirmation
requires, and for far more amorphous reasons.
More troubling, as studies of app
installation on phones have shown, few
people are even aware that an app is
(re)setting permissions to access personal
data not needed for its operation [4], such
as our contacts and text messages. There is
also the belief that this personal-data
capture is a trade with the developer,
and that paying for the app closes
that trade. Not so. Some apps take even
more liberties in the paid version. In this
context, consent is not merely
meaningless; it is nonexistent.
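To make that mechanism concrete, here is a minimal, purely illustrative sketch, not taken from the study cited above, of how an Android app written in Kotlin can ask for access to contacts and text messages through the platform’s standard runtime-permission API; the activity name and the choice of permissions are hypothetical. The system dialog the user finally sees names only a permission group, never why the data is wanted or where it will flow.

// Assumes the matching <uses-permission> entries are declared in the app's
// manifest, which users never see during a modern install.
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical activity in an app whose primary task is sharing a video.
class ShareActivity : AppCompatActivity() {
    private val permissionRequestCode = 42

    // Ask for data the app does not need for its primary task.
    fun requestSideData() {
        val wanted = arrayOf(
            Manifest.permission.READ_CONTACTS, // not needed to post a video
            Manifest.permission.READ_SMS       // nor is this
        )
        // Find the permissions not yet granted.
        val missing = wanted.filter {
            ContextCompat.checkSelfPermission(this, it) !=
                PackageManager.PERMISSION_GRANTED
        }
        if (missing.isNotEmpty()) {
            // One system dialog per permission group; tapping "Allow" is the
            // entire consent interaction the user experiences.
            ActivityCompat.requestPermissions(
                this, missing.toTypedArray(), permissionRequestCode
            )
        }
    }
}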
The Internet of Things, we are told,
is about to achieve epic scale, with
Cisco and Ericsson (Dave Evans and
Hans Vestberg, respectively) having
predicted that there will be 50 billion
devices connected to the Internet by
2020 (though they have since revised
their estimates down to 30 billion and
28 billion, respectively). Their ecosystem is wildly
heterogeneous. Many devices will be
capturing the same identification data
over and over; others will be part of
networks sharing data, such as cars
moving through various jurisdictions
and associated infrastructures, each
with its own terms for data sharing. One
of the key concerns of the IoT and its
high-speed cousin, the Internet of
Vehicles, is just how that data may be
captured and shared, not only within
one fixed environment like a home but
also across environments, from the wired
High Street (like that shopping scene
in Minority Report, mapping ads to
eyeballs to associated customer profiles)
to our movements from one location to another.
In other words, as the objects of our
environment become more connected
to the Net, do we simply become
another thing on the Net, reducing our
privacy and our civic values in the
name of everything from convenience
to counterterrorism? Likewise, given
the vast scale, speed, and heterogeneity
of this new ecosystem, are we creating
new risks to our personal and national
security, both as citizens and as
societies, not even from willful
hacking but simply because the scale of our
IoT reach will exceed our grasp of all the
necessary protections that we assume
are ours in a civil society?
In response to the current data status
quo and in anticipation of this changing
ecosystem, new rules for data sharing
have been established. For example, the
General Data Protection Regulation
(GDPR) approved by the EU Parliament
on April 14, 2016, will be enforced
across the Union (and enshrined in
law by the U.K.) starting May 25,
2018. These regulations have been put
in place well in advance of the technical
means to sustain their implementation.
Consequently, there is a key moment
for HCI and UX research and design to
influence society for good: not just to
design wonderful devices for the IoT but
also to take on the wicked problems of
making the mechanisms, and the personal-
data-driven assumptions, that enable the
IoT apparent to developers, designers,
businesses, policymakers, and citizens.
By this deliberate engagement we
can help surface the implications of data
sharing in order to develop models of
understanding, social expectations for
weighing trade-offs, and means for
developers to know their designs comply
with these expectations and for citizens
and policymakers to be able to trust
that they do. Because of our expertise
in understanding both technology and,
especially, human-centered approaches
to design, we have a key role to play in
informing the shape of these exchanges
and in creating an ecosystem that supports
social, technical, meaningful consent at
IoT scale.
To better see these opportunities for
the future, let’s consider the status quo
of data-sharing consent in the current
digital economy.
CITIZEN CONSENT IN
DATA SHARING: THE
MARGARINE OF CONSENT
The Internet has made liars of us all.
No one has used a browser, a social
media site, or a smartphone app without
encountering a box that asks us to
confirm, before continuing, that A) we
have read the terms and conditions of
a service and B) we agree to them. We click “agree”
when we haven’t a clue if we do or
don’t. In other words, in a world where
our personal data is largely the oil
that greases the wheels that keep the
Internet running, we have very little
meaningful say in what data is collected,
in how it is used, and in whether we
limit it or, for that matter, give
it away in buckets.