with expertise in behavior change)
who define requirements for an
intervention. In their person-based
approach, Yardley et al. [7] go beyond
this to focus attention on the intended
users of a digital health intervention
and what might motivate them to
engage with (and get value from) that
intervention, but the starting point
remains a clinical view on desirable
health outcomes. Summarized in
Figure 2, the person-based approach
has four stages:
• Planning is similar to
requirements gathering. It includes an
explicit focus on reviewing existing
evidence (e.g., about user needs and
existing interventions) as well as
conducting qualitative research with
the intended users of the intervention.
• Design covers requirements
synthesis and possible features
to address those requirements
(sometimes referred to as
“mechanisms of action”).
• Development and evaluation
emphasize the iterative nature of
digital intervention development but
make no reference to prototyping or
the creative aspects of interaction
design.
• Implementation and trialing
focus on larger-scale testing and
deployment.
There are well-established HCI
development lifecycles, such as the
human-centered design process in the
ISO 9241-210 standard; these typically
start with user needs and focus on
the iterative development and testing
of design concepts. The end user is
typically regarded as the expert who,
explicitly or tacitly, understands their
activities and may be able to identify
design opportunities. An example
lifecycle that intentionally mirrors the
person-based model but focuses on
HCI activities is presented in Figure 3.
According to this view:
• The first stage is identifying the
problem to be addressed and the
intended users, and understanding the contexts
in which the digital intervention
might be used. The output is likely
to be design representations such as
personas and scenarios that inform
later stages of development.
• Conceptual design involves
creating design representations that
address the identified user needs in
ways that can be implemented within
an interactive digital system.
• Detailed design involves iterative
prototyping and testing, where
prototypes increase in fidelity
through successive cycles, or where
new features are added through
successive cycles in an agile approach.
• Implementation and testing involve
deploying in naturalistic settings and
iterating as necessary.
These lifecycles have a lot in
common. Probably the most important
differences are the emphases on prior
evidence at the early stage and on
effectiveness (i.e., outcomes) at the late
stage in the person-based approach,
and the emphases on creative design
representations and interaction-focused evaluation through conceptual
and detailed design in the HCI lifecycle.
Clearly both are important, and a
richer design lifecycle for digital
health technologies will incorporate
all these features.
SUMMARY
Anyone who has been a patient, or has
visited one, will have experienced
poorly designed health technologies;
clinicians make do with some terrible
interfaces in their work, as do patients
with complex chronic conditions.
There are many reasons for this,
including ignorance of computer
science and HCI; the complexity of
healthcare systems and of people’s
lives into which new technology
is being introduced; population-,
outcomes-, and RCT-oriented
thinking that undervalues individual
qualities and experiences; and the
sheer magnitude of the challenge of
getting all the necessary disciplines
coordinated. Even the idea that the
process is all one way (as implied
in Figures 1–3 for simplicity of
presentation) limits our imagination
of what is needed and what
is possible. HCI research and practice
have an essential role to play in the
development of health technologies
that are safe, creative, engaging, and
truly fit for purpose. Conversely,
health presents fascinating challenges
for HCI research, and there are
many things that HCI researchers
can usefully learn—for example, about rigor, thinking about outcomes as well as experiences, and scaling solutions to populations. To make a real difference, we need to bring our complementary perspectives together, working with and learning from the expertise and resources available to different disciplines.
ACKNOWLEDGMENTS
These reflections are based on extensive work with health researchers and students—many of whom have had to find a transdisciplinary research identity—and HCI colleagues who work in similar spaces. I have learned from you all, but any remaining errors or unclarities are my own.
Endnotes
1. Grudin, J. From Tool to Partner: The Evolution of Human-Computer Interaction. Synthesis Lectures on Human-Centered Informatics 10, 1 (2017), i–183.
2. Wachter, R. The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine's Computer Age. McGraw-Hill, 2015.
3. Blandford, A., Gibbs, J., Newhouse, N., Perski, O., Singh, A., and Murray, E. Seven lessons for interdisciplinary research on interactive digital health interventions. Digital Health 4 (2018). DOI: 10.1177/2055207618770325.
4. Pham, Q., Wiljer, D., and Cafazzo, J.A. Beyond the randomized controlled trial: A review of alternatives in mHealth clinical trial methods. JMIR mHealth and uHealth 4, 3 (2016).
5. Collins, L.M., Murphy, S.A., Nair, V.N., and Strecher, V.J. A strategy for optimizing and evaluating behavioral interventions. Annals of Behavioral Medicine 30, 1 (2005), 65–73.
6. Craig, P., Dieppe, P., Macintyre, S., Michie, S., Nazareth, I., and Petticrew, M. Developing and evaluating complex interventions: The new Medical Research Council guidance. BMJ 337 (2008), a1655.
7. Yardley, L., Morrison, L., Bradbury, K., and Muller, I. The person-based approach to intervention development: Application to digital health-related behavior change interventions. Journal of Medical Internet Research 17, 1 (2015).
Ann Blandford is professor of human-computer interaction in the Department of Computer Science at University College London and director of the UCL Institute of Digital Health.
→ a.blandford@ucl.ac.uk
DOI: 10.1145/3292023 COPYRIGHT HELD BY AUTHOR. PUBLICATION RIGHTS LICENSED TO ACM. $15.00