munication (such as a handshake). Tangible UIs are
more logical, or manipulation-oriented, whereas
organic UIs are more emotional, or communication-oriented, though more real-world experience is
needed for a rigorous comparison.
Other modalities for interaction. In organic UIs,
hands are still the primary body parts for interaction.
But we should be able to use other parts, as we do in
our natural communications. Eye gaze is one possibility. Another is blowing, which is useful for manipulation because it is controllable while also conveying emotion during interaction; a technique developed in [6] determines the direction of a blow through acoustic analysis. The BYU-BYU-View system [9] adds the sensation of air movement to the interaction between a user and a virtual environment, delivering information directly to the skin to add realism to telecommunication.
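To make the acoustic idea concrete, here is a minimal sketch in Python of one plausible approach: measure broadband energy at several microphones around the display edge and take the energy-weighted centroid as the blow position. The function names and the centroid heuristic are illustrative assumptions, not the algorithm actually used in [6]:

```python
import math

def band_energy(samples):
    """Root-mean-square energy of one microphone frame; a blow shows up
    as a sustained rise in this broadband energy."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def estimate_blow_position(mic_positions, mic_energies):
    """Locate a blow as the energy-weighted centroid of the microphone
    positions (illustrative heuristic, not the method from [6]).
    Returns None when no energy is detected."""
    total = sum(mic_energies)
    if total == 0:
        return None
    x = sum(p[0] * e for p, e in zip(mic_positions, mic_energies)) / total
    y = sum(p[1] * e for p, e in zip(mic_positions, mic_energies)) / total
    return (x, y)

# A blow near the right microphone of a two-mic display edge is
# localized toward that side.
pos = estimate_blow_position([(0.0, 0.0), (1.0, 0.0)], [1.0, 3.0])
```

A real implementation would also need to distinguish blowing from speech or ambient noise, for instance by looking at the spectral shape of the signal rather than raw energy alone.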
INTERACTION BETWEEN REAL WORLD AND COMPUTERS
In the context of traditional human-computer
interaction, the term “interaction” generally means
information exchange between a human and a computer. In the near future, interaction will also
involve more physical experience (such as illumination, air, temperature, humidity, and energy). The
interaction concept is thus no longer limited to
interaction between humans and computers but
can be expanded to cover interaction between the
physical world and computers. For example, future
interactive wall systems will react to human gesture,
be aware of the air in the room, and be able to stabilize conditions (such as temperature and humidity) in the same way a cell membrane maintains the
stability of a cell environment. Interactive walls may
also be able to control sound energy to dynamically
create silent spaces. Even ceilings may someday
function as an information display. In this way,
future interactive systems may more seamlessly
interact with and control our physical environments.
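The cell-membrane analogy can be read as a simple feedback loop: sense a condition, compare it with a setpoint, and actuate toward the target each cycle. A minimal sketch in Python, where `membrane_step` and the proportional gain of 0.2 are hypothetical illustrations rather than anything specified in the article:

```python
def membrane_step(current, target, gain=0.2):
    """One proportional-control update: nudge the sensed value a fixed
    fraction of the way toward its setpoint, the way a membrane keeps
    a cell's internal conditions stable."""
    return current + gain * (target - current)

# Simulate a wall regulating room temperature toward 22 degrees C.
temp = 30.0
for _ in range(20):
    temp = membrane_step(temp, 22.0)
# temp converges toward the 22-degree setpoint.
```

A deployed wall would of course layer sensing (temperature, humidity, sound level) and slower actuators on top of such a loop, but the homeostatic principle is the same.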
June 2008/Vol. 51, No. 6 COMMUNICATIONS OF THE ACM
REFERENCES
1. Dietz, P. and Leigh, D. DiamondTouch: A multi-user touch technology. In Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology (Orlando, FL, Nov. 11–14). ACM Press, New York, 2001, 219–226.
2. Han, J. Low-cost multi-touch sensing through frustrated total internal reflection. In Proceedings of the 18th Annual ACM Symposium on User Interface Software and Technology (Seattle, Oct. 23–26). ACM Press, New York, 2005, 115–118.
3. Hinckley, K. and Sinclair, M. Touch-sensing input devices. In Proceedings of the ACM Conference on Computer-Human Interaction (Pittsburgh, PA, May 15–20). ACM Press, New York, 1999, 223–230.
4. Igarashi, T., Moscovich, T., and Hughes, J. As-rigid-as-possible shape manipulation. In Proceedings of the SIGGRAPH Conference (Los Angeles, July 31–Aug. 4). ACM Press, New York, 2005, 1134–1141.
5. Matsushita, N. and Rekimoto, J. HoloWall: Designing a finger, hand, body, and object-sensitive wall. In Proceedings of the ACM Symposium on User Interface Software and Technology (Banff, Alberta, Canada, Oct. 15–17). ACM Press, New York, 1997, 209–210.
6. Patel, S. and Abowd, G. BLUI: Low-cost localized blowable user interfaces. In Proceedings of the ACM Symposium on User Interface Software and Technology (Newport, RI, Oct. 7–10). ACM Press, New York, 2007.
7. Rekimoto, J. SmartSkin: An infrastructure for freehand manipulation on interactive surfaces. In Proceedings of the ACM Conference on Computer-Human Interaction (Minneapolis, MN, Apr. 20–25). ACM Press, New York, 2002, 113–120.
8. Rekimoto, J., Ishizawa, T., Schwesig, C., and Oba, H. PreSense: Interaction techniques for finger-sensing input devices. In Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology (Vancouver, BC, Canada, Nov. 2–5). ACM Press, New York, 2003.
9. Sawada, E., Ida, S., Awaji, T., Morishita, K., Aruga, T., Takeichi, R., Fujii, T., Kimura, H., Nakamura, T., Furukawa, M., Shimizu, N., Tokiwa, T., Nii, H., Sugimoto, M., and Inami, M. BYU-BYU-View: A wind communication interface. In the Emerging Technologies Exhibition at SIGGRAPH (San Diego, Aug. 5–9, 2007).
JUN REKIMOTO (firstname.lastname@example.org) is a professor in the Interfaculty Initiative in Information Studies at The University of Tokyo and a director of the Interaction Laboratory at Sony Computer Science Laboratories, Inc. in Tokyo.
© 2008 ACM 0001-0782/08/0600 $5.00