news
Technology | DOI: 10.1145/1859204.1859211
Gregory Goth
The Eyes Have It
Eye-tracking control for mobile phones might lead
to a new era of context-aware user interfaces.
Human-computer interfaces (HCIs) controlled by the eyes are not a novel concept. Research and development of systems that enable people
incapable of operating keyboard- or
mouse-based interfaces to use their
eyes to control devices goes back at
least to the 1970s. However, mass adoption of such interfaces has thus far been neither necessary nor pursued by system designers with any particular ardor.
“I’m a little bit of a naysayer in that I
think it will be very, very hard to design
a general-purpose input that’s superior to the traditional keyboard,” says
Michael Holmes, associate director for
insight and research at the Center for
Media Design at Ball State University.
“When it comes to text, it’s hard to beat
the speed of a keyboard.”
However, one class of user interfaces (UIs) in particular has proven problematic for the creation of comfortably sized keyboards: the interfaces on mobile phones, which are becoming increasingly capable computational platforms as well as communications devices. It might stand to reason
that eye-based UIs on mobile phones
could provide users with more options for controlling their phones’ applications. In fact, Dartmouth College
researchers led by computer science professor Andrew Campbell recently
demonstrated with their EyePhone
project that they could modify existing
general-purpose HCI algorithms to operate a Nokia N810 smartphone using
only the device’s front-facing camera
and computational resources. The new
algorithms’ accuracy rates, however,
also demonstrated that the science behind eye-based mobile control needs more refinement before it is ready for mass consumption.

"People tend not to point with their eyes," notes Roel Vertegaal, associate professor of human-computer interaction at Queen's University, who studies eye communication.
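The EyePhone approach relies entirely on frames captured by the handset's front-facing camera. As a rough illustration of what camera-based eye detection involves (a sketch under stated assumptions, not the Dartmouth team's actual algorithm), the following Python snippet uses OpenCV's stock Haar cascades to locate eye regions in a single frame; the cascade files, detection parameters, and camera index are placeholders.

import cv2

# Illustrative sketch only: find eye regions in one camera frame using
# OpenCV's bundled Haar cascades. This is not the EyePhone algorithm;
# the cascade files, thresholds, and camera index are assumptions.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_eyes(frame):
    # Convert to grayscale, detect faces, then search each face for eyes.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eyes.append((fx + ex, fy + ey, ew, eh))  # frame coordinates
    return eyes

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # assumed front-facing camera
    ok, frame = cap.read()
    cap.release()
    print(locate_eyes(frame) if ok else "no frame captured")

In an eye-controlled interface, a detected eye region would then have to be tracked frame to frame and mapped reliably to on-screen targets, which is where much of the accuracy work described above still lies.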
Eyes as an Input Device
Perhaps the primary scientific barrier
to reaching a consensus approach to
eye-controlled mobile interfaces is the
idea that trying to design such an interface flies against the purpose of the
eye, according to Roel Vertegaal, asso-