“Screens are great in that they can render a multitude of interfaces, but they require us to look at them,” says Harrison. “You cannot touch-type on your iPhone.” The idea with this interface
technology, which Harrison calls a
shape-shifting display, is to offer some
of the flexibility of touch screens while
retaining some of the beneficial tactile
properties of physical interfaces.
Another interface strategy designed
to offer new advantages while retaining some of the benefits of older technology is interpolating force-sensitive
resistance (IFSR). Developed by two researchers at New York University, IFSR
sensors are based on a method called
force-sensitive resistance, which has
been used for three decades to create
force-sensing buttons for many kinds
of devices. However, until Ilya Rosenberg and Ken Perlin collaborated on
the IFSR project, it was both difficult
and expensive to capture the accurate
position of multiple touches on a surface using traditional FSR technology
alone. “What we created to address this
limitation took its inspiration from human skin, where the areas of sensitivity
of touch receptors overlap, thereby allowing for an accurate triangulation of
the position of a touch,” says Perlin, a
professor of computer science at NYU’s
Media Research Lab.
In IFSR sensors, each sensor element detects pressure in an area that overlaps with its neighboring elements. By sampling the values from the touch array and comparing the output of neighboring elements in software, Rosenberg and Perlin found they could track touch points with an accuracy approaching 150 dots per inch, more than 25 times greater than the density of the array itself. “In designing a new kind of multitouch sensor, we realized from the outset how much more powerful a signal is when properly sampled,” says Rosenberg, a graduate student in NYU’s Media Research Lab. “So we aimed to build an input device that would be inherently anti-aliasing, down to the level of the hardware.”
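The interpolation idea behind IFSR can be sketched in a few lines. The following Python is a simplified illustration only, not Touchco's actual algorithm: it assumes a hypothetical one-dimensional row of force sensors whose triangular response regions overlap at their boundaries, and recovers the touch position as the force-weighted centroid of neighboring readings, yielding far finer resolution than the sensor spacing itself.

```python
# Simplified sketch of overlapping-sensor interpolation (assumption:
# each sensor has a triangular response one sensor-pitch wide, so a
# touch between two sensors activates both in proportion to proximity).

def sensor_readings(touch_pos, n_sensors=8, width=1.0):
    """Simulated force at each sensor: triangular falloff with distance."""
    return [max(0.0, 1.0 - abs(i - touch_pos) / width)
            for i in range(n_sensors)]

def interpolate_position(readings):
    """Estimate touch position as the force-weighted centroid."""
    total = sum(readings)
    if total == 0:
        return None  # no touch detected anywhere on the array
    return sum(i * f for i, f in enumerate(readings)) / total

# A touch landing between sensors 3 and 4 is localized to a fraction
# of the sensor pitch by comparing the two neighboring outputs.
estimate = interpolate_position(sensor_readings(3.37))
print(round(estimate, 2))  # → 3.37
```

The same weighted-centroid comparison, extended to two dimensions and many simultaneous touches, is what lets a coarse sensor grid report positions at an effective resolution many times its element density.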
Recognizing the increased interest in flexible displays, electronic paper, and other technologies naturally suited to their core technology, Rosenberg and Perlin spun off their sensor technology into a startup called Touchco, and now are working with other companies to integrate IFSR into large-touch screens and flexible electronic displays. In addition, the team is looking into uses as diverse as musical instruments, sports shoes, self-monitoring building structures, and hospital beds.

“There is a single true computation platform for the masses today,” says Patrick Baudisch, and it “is the mobile phone—by orders of magnitude. This is the exciting and promising reality we need to design for.”
“It seems that many of the hurdles are largely ones of cultural and economic inertia,” says Perlin. “When a fundamentally improved way of doing things appears, there can be significant time before its impact is fully felt.”
As for the future of these and other novel input technologies, users themselves no doubt will have the final word in determining their utility. Still, researchers say that as input technologies evolve, the recognizable mechanisms for interfacing with computers will likely vanish altogether and be incorporated directly into our environment and perhaps even into our own bodies. “Just as we don’t think of, say, the result of LASIK surgery as an interface, the ultimate descendants of computer interfaces will be completely invisible,” predicts Perlin. “They will be incorporated in our eyes as built-in displays, implanted in our ears as speakers that properly reconstruct 3D spatial sound, and in our fingertips as touch- or haptic-sensing enhancers and simulators.”
On the way toward such seamlessly integrated technology, it’s likely that new interface paradigms will continue to proliferate, allowing for computer interactions far more sophisticated than the traditional mouse and keyboard. CMU’s Harrison predicts that eventually we will be able to walk up to a computer, wave our hands, speak to it, stare at it, frown, laugh, and poke its buttons, all as a way to communicate with the device. In Harrison’s vision of this multimodal interfacing, computers will be able to recognize nuanced human communication, including voice tone, inflection, and volume, and will be able to interpret a complex range of gestures, eye movement, touch, and other cues.
Further Reading

Baudisch, P. and Chu, G.
Back-of-device interaction allows creating very small touch devices. Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA, April 2009.

Csikszentmihalyi, M.
Flow: The Psychology of Optimal Experience. HarperCollins, New York, 1990.

Erickson, T. and McDonald, D.W. (eds.)
HCI Remixed: Reflections on Works That Have Influenced the HCI Community. MIT Press, Cambridge, MA, 2008.

Harrison, C. and Hudson, S.E.
Scratch input: creating large, inexpensive, unpowered, and mobile finger input surfaces. Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology, Monterey, CA, October 2008.

Rosenberg, I. and Perlin, K.
The UnMousePad: an interpolating multi-touch force-sensing input pad. ACM Transactions on Graphics 28, 3, August 2009.
Based in Los Angeles, Kirk L. Kroeker is a freelance editor and writer specializing in science and technology.