sibility support, and many people
presumed touchscreens could not be
made usable for blind people. Slide
Rule developed a set of gestures and
the first finger-driven screen-reading
techniques to enable blind people
to access and control smartphone
touchscreens.
We learned through a personal
communication in 2010 that Slide Rule
inspired aspects of Apple’s VoiceOver
screen reader for iOS. Indeed, Slide
Rule’s finger-driven screen reading,
swipe gestures, and second-finger tap
can all be found in VoiceOver today.
Slide Rule exhibited the first three
principles of ability-based design; it
also exhibited the fourth and sixth principles, as its screen reader could adapt
to the speed of users’ movements, tailoring its performance to theirs. The
underlying principles demonstrated
in Slide Rule have survived into today’s
touchscreen systems.
Walking user interfaces. Today’s
smartphones are portable but not
truly mobile because they support
interaction only poorly while the user
is moving; for example, walking divides
attention,24 reduces accuracy,17 slows
reading speed,26 and impairs obstacle
avoidance.32 We conducted multiple
projects to improve interaction while
walking, focusing on people’s abilities
while on the go.
In our early exploration of walking
user interfaces,15 we studied level-of-detail
(LoD) adaptations: the interface shown
while a user was standing had high detail,
and the interface shown while a user was
walking had low detail, with larger fonts
and bigger targets. When a user moved from
standing to walking or vice versa, the
interface changed accordingly. We compared
this adaptive interface to its component
static interfaces for both walking and
standing, finding that walking increased
task time for the static interfaces by 18%,
but with our adaptive interface, walking
did not increase task time. We also found
that the adaptive interface performed like
its component static interfaces; that is,
there was no penalty for the LoD adaptation.
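A minimal sketch of such an LoD switch, assuming a simple accelerometer-variance heuristic for detecting walking; the threshold, names, and font and target sizes here are invented for illustration, not the study’s actual detection method or values:

```python
# Hypothetical level-of-detail (LoD) adaptation: detect walking from
# accelerometer variance, then swap between a high-detail (standing)
# and a low-detail (walking) interface configuration.
from dataclasses import dataclass
from statistics import pvariance

@dataclass
class UiConfig:
    font_pt: int       # font size in points
    target_px: int     # touch-target size in pixels
    detail: str        # "high" (standing) or "low" (walking)

STANDING_UI = UiConfig(font_pt=12, target_px=32, detail="high")
WALKING_UI = UiConfig(font_pt=18, target_px=56, detail="low")   # larger fonts, bigger targets

def is_walking(accel_magnitudes, threshold=0.3):
    """Classify motion state from a window of accelerometer magnitudes (in g).

    Walking produces rhythmic, high-variance acceleration; standing is
    comparatively still. A real system would use a trained classifier.
    """
    return pvariance(accel_magnitudes) > threshold

def select_ui(accel_magnitudes):
    """Return the low-detail UI while walking, the high-detail UI otherwise."""
    return WALKING_UI if is_walking(accel_magnitudes) else STANDING_UI

still = [1.0, 1.01, 0.99, 1.0, 1.02]      # near-constant ~1 g at rest
walking = [0.2, 1.8, 0.4, 1.6, 0.3, 1.7]  # rhythmic swings from footfalls
print(select_ui(still).detail)    # "high"
print(select_ui(walking).detail)  # "low"
```

In practice the switch would be debounced over a sliding window so the interface does not flicker between states at each step.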
In our subsequent project, called
WalkType,12 we made mobile touch-based
keyboards almost 50% more accurate
and 12% faster while walking.
Touch-based features like finger
location, duration, and travel were
combined with accelerometer features
like signal amplitude and phase to train
decision trees that reclassified wayward
key-presses. WalkType effectively remedied
a systematic inward rotation of the thumbs
caused by whichever foot was moving forward
as the user walked.

Performing input tasks is only one
challenge while walking. Consuming
output is another. In SwitchBack,19
an attention-aware system for smartphones,
the phone’s front-facing camera was used
to track eye-gaze position on the screen
to aid task resumption. For example, when
a user was reading and looked away,
SwitchBack remembered the last-read line
of text; when the user’s gaze returned to
the screen, that same line was highlighted
to draw the user’s attention, making
resumption easy.

These three walking user interfaces
exhibited all seven principles of
ability-based design to varying degrees.

Global Public Inclusive Infrastructure

Ability-based design has been applied
mostly at the level of individual systems
and applications, but for greater impact,
a new infrastructure that extends beyond
the user’s own device is needed. Although
the Global Public Inclusive Infrastructure
(GPII),34,35 with its cloud-based
auto-personalization of information and
communication technologies, was formulated
independently of ability-based design, its
objectives are the same: to enable
interfaces to be ideally configured to
match each user’s situated abilities.

The GPII is built on three technological
pillars.35 The second, “auto-personalization,”
is the one of interest here.d Its long-term
goal is to ensure that any digital interface
a person encounters instantly changes to a
form that can be understood and used by that
person. The GPII’s auto-personalization
capability uses a person’s needs and
preferences, which are stored in the cloud
or on a token, to automatically configure
the interface of each device for that
individual.34,36 Its “one size fits one”
approach is designed to help each person
have the “best fit” interface possible.
Since interface flexibility on current
devices and software is limited, GPII
auto-personalization uses both built-in
features and assistive technologies
(AT) (on the device and in the cloud)

d The two other pillars make it easy for
people to determine what they need or
prefer, ensuring solutions exist for everyone.
Table 2. Seven principles of ability-based design, updated and revised from previous versions.37,38

Designer Stance (required)
  Ability: Designers focus on users’ abilities, not disabilities.
  Accountability: Designers respond to poor usability by changing systems, not users, leaving users as they are.
  Availability: Designers use affordable and available software, hardware, or other components acquirable through accessible means.

Adaptive or Adaptable Interface (optional)
  Adaptability: Interfaces might be adaptive or adaptable to provide the best possible match to users’ abilities.
  Transparency: Interfaces might give users awareness of adaptive behaviors and what governs them, and the means to inspect, override, discard, revert, store, retrieve, preview, alter, or test those behaviors.

Sensing and Modeling (optional)
  Performance: Systems might sense, monitor, measure, model, display, predict, or otherwise utilize users’ performance to provide the best possible match between systems and users’ abilities.
  Context: Systems might sense, monitor, measure, model, display, predict, or otherwise utilize users’ situation, context, or environment to anticipate and accommodate effects on users’ abilities.
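The GPII’s “one size fits one” flow, as described in this section, amounts to fetching a person’s stored needs and preferences and applying whatever each device can configure, with the remainder met by assistive technology from the cloud. A rough sketch follows; every preference key, capability name, and function here is a hypothetical illustration, not the GPII’s actual schema or API:

```python
# Hypothetical sketch of cloud/token-based auto-personalization in the
# spirit of the GPII's "one size fits one" approach.

# A person's needs-and-preferences set, as it might be stored in the
# cloud or on a token they carry (keys are invented for illustration).
USER_PREFS = {
    "font_scale": 2.0,        # enlarge text
    "high_contrast": True,    # prefer high-contrast colors
    "screen_reader": True,    # speak on-screen content
}

# What this particular device can configure via built-in features and
# locally installed assistive technologies (AT).
DEVICE_CAPABILITIES = {"font_scale", "high_contrast"}

def auto_personalize(prefs, capabilities):
    """Apply every preference the device supports; report the rest.

    In a GPII-like infrastructure, the unmet preferences would be
    satisfied by fetching assistive technology from the cloud.
    """
    applied = {k: v for k, v in prefs.items() if k in capabilities}
    unmet = sorted(set(prefs) - capabilities)
    return applied, unmet

applied, unmet = auto_personalize(USER_PREFS, DEVICE_CAPABILITIES)
print(applied)  # settings this device configures directly
print(unmet)    # needs left for cloud-based AT, e.g. ['screen_reader']
```

The same preference set travels with the person, so each new device they encounter runs this matching step against its own capabilities, yielding a different “best fit” configuration per device.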