ables allow users to make sense of past
events—what activities they have done
and what effect they are likely to have
on their well-being—to prompt positive behavior change, as discussed by
Fritz et al.8 The next stage of development might be for health wearables
to predict health crises; examples include alerting a hospital of early signs
of a heart attack or warning users of
how likely it is they will develop, say,
breast cancer.
The scenario involving predictive
emergency medical intervention raises
the question of who ought to have access to personal health data. While it would be helpful to link one’s health data directly to the closest hospital, setting the long chain of care in motion as early as possible in an emergency, consumer pushback around the access various parties might want to personal health data would be entirely sensible; functional uncertainty in this arena would likely not be tolerated. Alternatively, if a wearable device alerted a user to hurry to a
hospital at the start of a possible medi-
cal crisis, how certain does the device
have to be? Should gadgets err on the
side of caution, possibly provoking a
false alarm? While not alerting a user
due to insufficient certainty may lead
to preventable deaths, so might caus-
ing alarm when alarm is not absolutely
necessary, thus leading users to ignore
or even reject subsequent alerts, with
the gadget turning the user into “the
boy who cried wolf.”
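The alert-threshold dilemma can be framed in simple decision-theoretic terms: alert when the expected cost of staying silent exceeds the expected cost of a false alarm, with the effective false-alarm cost growing as user trust erodes. The sketch below is purely illustrative; the cost figures, the trust variable, and the decay model are assumptions invented for this example, not properties of any real device.

```python
# Illustrative sketch of the alert trade-off. All costs and the
# trust-decay factor are hypothetical values chosen only to make
# the "cried wolf" dynamic concrete.

def should_alert(p_crisis: float, trust: float,
                 cost_missed: float = 1000.0,
                 cost_false_alarm: float = 1.0) -> bool:
    """Alert when the expected cost of silence (a missed crisis)
    exceeds the expected cost of alerting (a possible false alarm),
    weighted by how much the user still trusts the device (0..1)."""
    # Expected cost of silence: the crisis happens and is missed.
    silence = p_crisis * cost_missed
    # Expected cost of alerting: a false alarm, made effectively
    # worse when trust is low, since the user may ignore future alerts.
    alerting = (1 - p_crisis) * cost_false_alarm / max(trust, 1e-6)
    return silence > alerting

def erode_trust(trust: float, was_false_alarm: bool,
                decay: float = 0.7) -> float:
    """Each false alarm multiplies trust by a decay factor: the
    'boy who cried wolf' effect. Correct alerts slowly rebuild it."""
    return trust * decay if was_false_alarm else min(1.0, trust + 0.05)
```

With these hypothetical numbers, a fully trusted device alerts even at quite small crisis probabilities, while a device whose trust has collapsed after repeated false alarms demands far stronger evidence before alerting again, which is exactly the failure mode described above.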
The very notion of a health wear-
able alerting a user to an otherwise
imperceptible impending crisis dem-
onstrates the insufficiency of solutions
for addressing uncertainty that rely on
manual data correction by the user, as
suggested by Consolvo et al.6 and Packer et al.20 Explaining the data collected and the ways the algorithm processes it may be more appropriate for helping a user determine whether the device output is certain enough to warrant seeking medical attention. At
the same time, this information must
be delivered in ways that can be evalu-
ated rationally by a person who just re-
ceived an anxiety-provoking output (see
the sidebar “Communicating Uncer-
tainty”). Both parts of this solution are
non-trivial and require further research.
Life coaching. Tracking data points through one’s personal history is of limited value for individuals seeking improvements in and maintenance of their well-being, in contrast to information about dependencies and correlations among multiple variables5 (such as the effect of certain foods on an individual’s blood sugars). Given that users are often not rational data scientists22 yet consistently ask for greater analytical capabilities than their devices can provide, it seems inevitable that device manufacturers will introduce systems that purport to provide more definitive answers for users. The danger would be doing so without properly attending to the uncertainties highlighted earlier.

Users’ inability to appreciate uncertainty is made especially clear in the case of wearables that claim to identify correlations between mood and activities (such as ZENTA, https://www.indiegogo.com/projects/zenta-stress-emotion-management-on-your-wrist). It is conceivable that wearable life coaches may soon draw from other pervasive technologies to provide indications of, say, toxic relationships between the user and other individuals, and to encourage users to cut unhealthy social ties. While such revelations could have benefits, the implications of inaccuracies in one’s data, or in the data drawn from other sources to determine correlations, would begin to extend beyond the individual user, affecting others in the user’s social circle who did not necessarily consent to such analysis. Additionally, the consequences of deciding to cut a person out of one’s life are not necessarily knowable to a system (such as how cutting ties might introduce undue financial instability). How certain would one have to be of the toxicity of a relationship to be willing to end it? It might indeed be the case that people would more readily accept diagnoses of their problems in the form of a scapegoat than accept that their unhappiness results from behaviors of their own that they find difficult to change. This is all the more reason why tools that claim deep insight into users’ lives must be very clear about the uncertainties they are juggling in their algorithms.

For advanced diagnostic tracking in the form of life coaching, new techniques are needed to identify potential triggers from relevant contextual information; and to the extent that doing so entails drawing data from other pervasive devices, such a filter might introduce further uncertainties that need to be reflected in overall measures of uncertainty. Additional research is needed to understand how best to communicate these uncertainties to users. In particular, tools are needed for capturing users’ cognitive and affective responses to these uncertainties (as covered in the sidebar “Communicating Uncertainty”).

Communicating Uncertainty

Experimental psychology studies, including those undertaken by Susan Joslyn and her colleagues at the University of Washington in Seattle (http://depts.washington.edu/forecast/), have shown that providing information about uncertainty can lead to greater trust in system models and better decision making. These benefits are far from assured, however, as studies have also shown that non-expert end users have great difficulty interpreting information about uncertainty.12

One presumed reason for this difficulty is that uncertainty increases the cognitive load individuals must manage while making any kind of decision. It requires that individuals engage in slow and methodical thinking, as opposed to quicker, more heuristic thinking. Verbal and numerical expressions of uncertainty create potential complications for decision makers: verbal expressions are open to more subjective interpretive variability, while numerical expressions are often more difficult to decipher.9 Both forms are potentially subject to framing effects that influence people’s processing of the information. Research has also uncovered “deterministic construal errors,”9 or the tendency to interpret uncertain information as deterministic; for example, people frequently interpret the “cone of uncertainty” in hurricane forecasts, incorrectly, as the extent of the wind field, when in fact it represents the extent of all possible hurricane trajectories. All such factors require careful consideration when designing representations of uncertainty information.