Frequent users should benefit from
the system’s self-reconfiguration (see
Figure 1). Because of the required immediacy, we claim: Create
shortcuts for interventions that combine
a sequence of interactions.
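As a minimal sketch of such a shortcut, the following Python snippet (device names and actions are hypothetical) bundles a multi-step intervention into a single command:

# Minimal sketch: a shortcut that bundles a sequence of interaction
# steps into one command. Device names and actions are hypothetical.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class InterventionShortcut:
    """Bundles a sequence of interaction steps into a single call."""
    name: str
    steps: List[Callable[[], None]] = field(default_factory=list)

    def trigger(self) -> None:
        # One user action executes all steps, instead of a walk
        # through several dialogs.
        for step in self.steps:
            step()

# Example: pause the heating automation, open a vent, and mute
# related notifications -- one tap instead of three interactions.
air_out = InterventionShortcut(
    name="air out the room",
    steps=[
        lambda: print("heating: paused for 15 minutes"),
        lambda: print("vent: opened"),
        lambda: print("notifications: muted"),
    ],
)
air_out.trigger()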
Offer informative feedback.
In the original rule, feedback accompanies every (major) action; with a whole infrastructure of automated systems, it is drastically reduced.
Designing the balance of
automation and intervention has
to address the feedback dilemma:
Calm environments and minimal required attention are key aspects, yet feedback directs attention to unexpected behavior. Feedback about the impacts of automated behavior and the temporarily available options for intervention must be offered.
This is complemented by: Provide feedback on whether or not intervention is occurring, and: Offer feedback (implicit/explicit) on the impact of interventions.
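As an illustration, a minimal Python sketch of such a feedback policy (the action names and the set of expected actions are assumptions for the example) stays silent for routine automated behavior and raises attention, together with the currently available interventions, only for unexpected behavior:

# Minimal sketch of the feedback dilemma: stay calm for expected
# automated actions, direct attention only to unexpected ones.
from typing import List, Optional

EXPECTED_ACTIONS = {"dim_lights_at_sunset", "nightly_backup"}

def feedback_for(action: str, interventions: List[str]) -> Optional[str]:
    """Return a message only when attention is warranted."""
    if action in EXPECTED_ACTIONS:
        return None  # calm environment: no feedback for routine behavior
    options = ", ".join(interventions) or "none"
    return (f"Unexpected automated action: {action}. "
            f"Interventions available now: {options}")

print(feedback_for("dim_lights_at_sunset", ["undo"]))         # -> None
print(feedback_for("unlock_front_door", ["cancel", "undo"]))  # -> alert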
Design dialogue to yield closure.
Users can understand how their
activities contribute to a cycle of
task performance and how close it is to completion. With automated
systems, this is mainly about starting
and terminating a process, while
intervention means to switch back to manual control.
Therefore, we propose: Design
the start and control of intervention
in conjunction with clear and simple
options for completing and terminating
it. Interventions should be designed
to have a limited temporal impact.
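A minimal Python sketch of this proposal (the default time budget is an illustrative assumption) gives an intervention an explicit start, a simple way to complete it, and an automatic return to automated behavior once its time budget expires:

# Minimal sketch: an intervention with an explicit start, simple
# termination, and a bounded temporal impact.
import time

class AutomatedSystem:
    def __init__(self) -> None:
        self.mode = "automatic"
        self._intervention_ends_at = 0.0

    def intervene(self, duration_s: float = 15 * 60) -> None:
        """Start a manual intervention that expires on its own."""
        self.mode = "manual"
        self._intervention_ends_at = time.monotonic() + duration_s

    def finish_intervention(self) -> None:
        """Clear and simple option for completing the intervention."""
        self.mode = "automatic"

    def tick(self) -> None:
        # Called periodically: once the time budget is used up, the
        # system reverts to automated behavior by itself.
        if self.mode == "manual" and time.monotonic() >= self._intervention_ends_at:
            self.mode = "automatic"

system = AutomatedSystem()
system.intervene(duration_s=0.1)  # tiny budget, just for the demo
time.sleep(0.2)
system.tick()
print(system.mode)  # -> "automatic": the intervention's impact was limited in time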
Offer simple error handling. The
rule conventionally covers human
errors. With automated systems, the
challenge is whether the user lets the system continue when errors occur or are to be expected.
Thus the context of automation
requires: Allow for immediate
intervention to avoid the occurrence
or repetition of unsolicited automated actions.
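One way to realize such immediate intervention, sketched here in Python with hypothetical action names, is a veto pattern: a pending automated action can be canceled before it takes effect, and a vetoed action is additionally suppressed on repetition:

# Minimal sketch: a pending automated action is announced first;
# the user can veto it before execution and block its repetition.
from typing import Set

class VetoableAction:
    def __init__(self, name: str) -> None:
        self.name = name
        self.vetoed = False

    def veto(self) -> None:
        """Immediate intervention: cancel before execution."""
        self.vetoed = True

blocked: Set[str] = set()  # actions the user does not want repeated

def run(action: VetoableAction) -> None:
    if action.name in blocked:
        return  # suppressed: this action was vetoed before
    # ... a real system would wait briefly here so the user can veto ...
    if action.vetoed:
        blocked.add(action.name)  # avoid repetition of unsolicited behavior
        return
    print(f"executing {action.name}")

a = VetoableAction("reorder_groceries")
a.veto()  # the user intervenes immediately
run(a)    # nothing is executed
run(VetoableAction("reorder_groceries"))  # the repetition is suppressed, too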
Permit easy reversal of actions.
This feature encourages users to try more efficient approaches and unfamiliar features.
This remains an important rule, but
literal reversal of human action may
not be feasible; reversing its impact,
however, may be.
Thus we state: Combine the means
for reversing the impact of automated
actions with intervention interfaces
and: Allow for simple means for
reversing the impact of interventions.
This kind of reversibility is crucial to encourage users to explore the possibilities of intervention.
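A minimal Python sketch of this idea (the thermostat example is hypothetical) records a compensating action alongside each automated action, so that the impact can be reversed even when the action itself cannot literally be undone:

# Minimal sketch: reverse the impact of an automated action via a
# recorded compensating action, rather than literal undo.
from typing import Callable, List

compensations: List[Callable[[], None]] = []

def automated_action(do: Callable[[], None],
                     compensate: Callable[[], None]) -> None:
    """Run an automated action and remember how to reverse its impact."""
    do()
    compensations.append(compensate)

def reverse_last_impact() -> None:
    """One simple control, suitable for an intervention interface."""
    if compensations:
        compensations.pop()()

automated_action(
    do=lambda: print("thermostat: raised to 24 C"),
    compensate=lambda: print("thermostat: restored to 21 C"),
)
reverse_last_impact()  # -> "thermostat: restored to 21 C"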
Support internal locus of control.
The user should feel like they are the
one who controls the system. With
automated systems and implicit
interaction, the relationship between
controlling action and system reaction has to be rebalanced.
It has to be clearly communicated:
• How the impacts of interventions
are related to the goals being pursued by
the automated processes
• How control is distributed between
the automated system and the user.
Possibilities of intervention offer flexibility for control and enable ad hoc decisions.
Reduce short-term memory load.
The amount of information to be kept
in mind to efficiently interact with
a system has to be reduced. Little
information should be needed as long
as active control is not necessary.
We propose: Do not require the user
to remember a previous system status.
And between the interventions:
Minimize required attention and design
for default behavior.
The system should be designed
in such a way that the user could
intervene and return to automated behavior without having to remember or note the current status for future use.
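As a minimal Python sketch of this behavior (the status fields are illustrative assumptions), the system snapshots its own status when an intervention starts and restores it when the intervention ends, so nothing has to be kept in the user's memory:

# Minimal sketch: the system, not the user, remembers the status
# that an intervention interrupts.
from copy import deepcopy

class StatusKeepingSystem:
    def __init__(self) -> None:
        self.status = {"mode": "automatic", "target_temp": 21}
        self._saved = None

    def intervene(self, **overrides) -> None:
        # Snapshot the current status before the user changes anything.
        self._saved = deepcopy(self.status)
        self.status.update(overrides, mode="manual")

    def end_intervention(self) -> None:
        # Restore the pre-intervention status automatically.
        if self._saved is not None:
            self.status = self._saved
            self._saved = None

s = StatusKeepingSystem()
s.intervene(target_temp=24)
s.end_intervention()
print(s.status)  # -> {'mode': 'automatic', 'target_temp': 21}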
SIX DESIGN PRINCIPLES
Based on the discussion, we propose
the following design principles:
• Expectability and predictability.
Ensure that users are not surprised
by automated behavior and that they
understand how it develops.
• Communicate options for interventions. Make options for interventions, which may be context-aware, visible and understandable to users in an unobtrusive way.
• Exploration of interventions. Allow
the safe and enjoyable exploration
of interventions and their potential
impacts, e.g., by simulation or
previews of future system statuses.
• Easy reversal of automated and
intervention actions. Offer a simple
means to reverse the impact of the
system’s automated behavior or of the
results of interventions.
• Minimize required attention. Minimize the user attention required to operate the system, e.g., by implicit interaction.
• Communicate how control is
shared. Clearly communicate the
distribution of responsibilities, as
well as the actual control between the
human and the machine.
Automation and autonomous
systems are inevitably coming to many domains. We hope our
proposed principles help to develop
usable intervention interfaces and
kick-start a fundamental discussion
on interaction with autonomous systems.
Albrecht Schmidt is a professor of
human-computer interaction and cognitive
systems at the University of Stuttgart. His
research interests are at the intersection of
ubiquitous computing and human-computer
interaction, including large-display systems,
mobile and embedded interaction, and tools to
augment the human mind. He has a Ph.D. from Lancaster University.
Thomas Herrmann is a professor of
information and technology management at
Ruhr University Bochum, Germany. Current
research interests include design methods
for sociotechnical systems in various areas
such as healthcare, computer-supported
collaboration, knowledge management,
process management, and Industry 4.0. He has
a Ph.D. from the Technical University of Berlin.
DOI: 10.1145/3121357