• Figure 2. The Walking UI for a music player interface—an example of an adaptation to a temporary, situationally induced impairment. Larger buttons address the difficulty of pointing accurately while walking; larger fonts for song titles help with impaired reading speed; and differences
in font size between titles and additional song information help direct fragmented attention.
(Screenshots courtesy of Shaun Kane.)
The Walking UI prototype (Figure 2) provides an example of adapting to the changing abilities of mobile users. It offers different UIs for stationary and in-motion use [8]. Both versions use a
similar design to ensure that users
do not have to learn two separate
UIs, but the walking variant has
larger interactors to compensate
for mobile users’ impaired dexterity and larger fonts for song titles
to accommodate reduced reading ability. It also manipulates
the visual saliency of important
information. The walking variant
illustrates two differences between
the design requirements for mobile
users and users with permanent
impairments. First, because mobile users' abilities change frequently, adaptations must support the
transfer of user skills between
interfaces. Second, manually
designed adaptations for the most
common mobile situations may be
feasible because most users will
experience similar impairments.
The latter point makes mobile situational impairments a productive
domain in which to initially explore
the design space of user interface
adaptations for dexterity, visual
acuity, and attention impairments.
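The mechanics of such an adaptation can be sketched in a few lines of code. The Python below is a hypothetical illustration, not Kane et al.'s implementation: the variance-threshold motion classifier, the size constants, and the function names are all assumptions made for the example.

```python
import statistics

# Hypothetical sketch of a Walking UI-style adaptation: classify motion
# from recent accelerometer magnitude samples, then swap between two
# structurally identical parameter sets so users never have to learn a
# second interface. All constants below are invented for illustration.

WALK_VARIANCE_THRESHOLD = 0.5  # assumed tuning constant

STATIONARY_UI = {"button_px": 44, "title_font_pt": 14, "detail_font_pt": 12}
WALKING_UI = {"button_px": 72, "title_font_pt": 20, "detail_font_pt": 12}

def is_walking(accel_magnitudes: list[float]) -> bool:
    """Crude motion classifier: high variance in accelerometer magnitude
    suggests the user is walking."""
    return statistics.variance(accel_magnitudes) > WALK_VARIANCE_THRESHOLD

def select_ui(accel_magnitudes: list[float]) -> dict:
    # Larger buttons compensate for impaired pointing accuracy; a larger
    # title font aids reduced reading speed; keeping the detail font small
    # preserves the saliency difference that directs fragmented attention.
    return WALKING_UI if is_walking(accel_magnitudes) else STATIONARY_UI
```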
Supple, for example, automatically generates user interfaces adapted to an individual's motor abilities, choosing the design predicted to be fastest to use for that user. Despite
a large space of possible solutions,
Supple finds the optimal design in
seconds; the resulting interfaces
have improved the performance
and satisfaction of users with
motor impairments. Automatic
user interface generation is a scalable approach and one that enables
highly personalized and dynamic
solutions. One of its limitations,
however, is that the resulting interfaces are unlikely to capture all
the nuances of the applications’
semantics, and the approach is
fundamentally limited by what
types of abilities can be modeled.
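The optimization at the heart of this approach can be illustrated with a toy example. The Python sketch below is not Supple's actual algorithm or cost model; the candidate widgets, their costs, and the single-coefficient user model are invented for the example. It shows only the core idea: search the space of candidate designs under a size constraint and keep the one with the lowest predicted use time for this particular user.

```python
import itertools

# Toy version of ability-based UI optimization (not Supple's actual
# algorithm): enumerate candidate renderings of each element, discard
# designs that exceed the screen budget, and keep the design with the
# lowest predicted use time for this particular user.

# Hypothetical candidates: (widget, screen cost, base ms, pointing-demand ms).
CANDIDATES = {
    "volume":   [("slider", 40, 300, 600), ("spinner", 15, 900, 150)],
    "playlist": [("list", 120, 200, 500), ("combo_box", 30, 600, 250)],
}

def predicted_time(base_ms, demand_ms, difficulty):
    # Assumed user model: one coefficient captures how much pointing
    # demands slow this user down (higher = more severe motor impairment).
    return base_ms + demand_ms * difficulty

def best_design(screen_budget, difficulty):
    best, best_time = None, float("inf")
    for design in itertools.product(*CANDIDATES.values()):
        if sum(w[1] for w in design) > screen_budget:
            continue  # violates the device's size constraint
        total = sum(predicted_time(w[2], w[3], difficulty) for w in design)
        if total < best_time:
            best, best_time = design, total
    return best

# A user with little pointing difficulty gets fast, pointing-heavy widgets;
# a user with a severe motor impairment gets low-demand alternatives.
print(best_design(screen_budget=200, difficulty=1.0))  # slider + list
print(best_design(screen_budget=200, difficulty=4.0))  # spinner + combo_box
```

Because the user model changes the predicted cost of every candidate, the same search produces different winning designs for different users, which is what makes the generated interfaces personalized.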
Empowering user communities
to design and share specialized
interfaces. Allowing users to redesign the interfaces they use most
often is an alternative to relying
on automation. Challenges to this
solution include enabling deep
end-user redesign of existing user
interfaces and providing tools for
communities with similar abilities
and needs to share and collaborate
on the redesigns. Collaborative
user interface design is already
available for some gaming platforms, and dedicated gamers have
developed interfaces optimized
for particular strategies or game
tasks. In the non-gaming world,
the AdaptableGIMP project has
developed infrastructure for community-based UI innovations [7].
In this project, users can develop
variants of the GIMP toolbox specialized for different tasks and easily post their designs on a shared
wiki so others can reuse these
adaptations. We see user-driven
design and sharing of specialized
interfaces as an important component in our vision, and something that is largely unexplored.
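One way to picture the needed infrastructure is as a small, serializable description of a specialized interface that a wiki can host and other users can import. The sketch below is hypothetical and is not AdaptableGIMP's actual schema; every field and tool name is invented for illustration.

```python
import json

# Hypothetical sketch of a shareable UI specialization, loosely modeled
# on community-customized toolboxes: a small, serializable description of
# a toolbox variant that a community wiki could host for others to reuse.

task_set = {
    "name": "red-eye-removal",
    "author": "example_user",
    "description": "Minimal toolbox for quick red-eye touch-ups",
    "tools": ["ellipse_select", "zoom", "desaturate", "clone"],
    "ui_overrides": {"icon_size_px": 48, "show_labels": True},
}

def export_task_set(ts: dict) -> str:
    """Serialize a task set so it can be posted to a shared wiki."""
    return json.dumps(ts, indent=2)

def import_task_set(payload: str) -> dict:
    """Re-create a task set from a wiki page; a real importer would
    validate tool names against the installed application."""
    return json.loads(payload)

print(import_task_set(export_task_set(task_set))["tools"])
```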
Moving Forward
At CHI 2011 we organized a workshop on Personalized Dynamic
Accessibility (though at the time
we called it Dynamic Accessibility).
Our goal was to provide an opportunity for researchers with diverse
backgrounds but a shared interest
in this topic to develop a common
understanding and prioritize the
research agenda.
One broad consensus among
participants was that automated
approaches to adaptation will
likely be part of the solution landscape for Personalized Dynamic