variety of computational simulations based on physical sand (see Figure 3). Users view these simulations
as they are projected onto the surface of a sand model
representing the terrain. They choose from a variety
of simulations highlighting the height, slope, contours, shadows, drainage, or aspect of the landscape.
Users alter the form of the landscape model by
manipulating sand with their hands, seeing the resultant effects of computational analysis generated and
projected onto the surface of the sand in real time.
The project demonstrates how TUIs take advantage
of our natural ability to understand and manipulate
physical forms while harnessing the power of computational simulation to help us understand model representations. SandScape, which uses optical
techniques to capture the geometry of a landscape
model, is less accurate than its predecessor, Illuminating Clay, which used laser range finders to capture the
geometry of a physical clay model [6].
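The analyses these systems project back onto the model (slope, aspect, and so on) are standard computations over a heightmap grid. A minimal sketch in Python, assuming the captured geometry is available as a NumPy array of heights (the function name and sample grid are illustrative, not the systems' actual code):

```python
import numpy as np

def slope_aspect(height, cell_size=1.0):
    """Per-cell slope (degrees) and aspect (radians) from a heightmap,
    via central-difference gradients -- the kind of analysis SandScape
    projects back onto the sand in real time (illustrative sketch)."""
    dz_dy, dz_dx = np.gradient(height, cell_size)
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.arctan2(-dz_dx, dz_dy)  # downslope direction
    return slope, aspect

# A plane rising one unit per cell in x has a uniform 45-degree slope.
plane = np.fromfunction(lambda y, x: x, (8, 8))
slope, aspect = slope_aspect(plane)
```

Because each cell is computed independently from local differences, the whole analysis can be re-run on every captured frame, which is what makes the real-time feedback loop feasible.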
SandScape and Illuminating Clay both demonstrate the potential advantage of combining physical
and digital representations for landscape modeling
and analysis. The physical clay and sand models convey spatial relationships that are intuitively and
directly manipulated by hand. Users can also place
found physical objects directly under the camera.
This approach allows them to quickly create and
understand highly complex topographies that would
be difficult and time-consuming to produce through
conventional computer-aided design tools. This “
continuous and organic TUI” approach makes better use
of our natural ability to discover solutions through
direct manipulation of physical objects and materials.
TUIs give physical form to digital information and
computation, facilitating the direct manipulation of
bits. The goal is to empower collaboration, learning,
and decision making through digital technology
while taking advantage of our human ability to
grasp and manipulate physical objects and materials.
Here, I’ve introduced the genesis and evolution of
TUIs over the past 10 years, from rigid, discrete
interfaces toward organic and malleable materials
that enable dynamic sculpting and computational
analysis using digitally augmented continuous physical materials. This new type of TUI delivers rapid
form giving in combination with real-time computational feedback.
In addition to rapid form giving, actuation technology plays a critical role in making the interface
more organic and dynamic. The Tangible Media
Group is exploring a new genre of TUIs that incorporate actuation mechanisms to realize kinetic memory for educational toys like Curlybot [1] and Topobo
[7]. It is also designing a new generation of tabletop
TUIs that utilize actuation to make tangible objects
behave more actively, dynamically representing the
internal computational state; examples include the
Actuated Workbench [4] and physical intervention in
computational optimization, or PICO [5].
I hope the TUI evolution I’ve explored here will
contribute to the future discussion of malleable,
dynamic, organic interfaces that seamlessly integrate
sensing and display into soft and hard digital/physical materials.
1. Frei, P., Su, V., Mikhak, B., and Ishii, H. Curlybot: Designing a new
class of computational toys. In Proceedings of the SIGCHI Conference on
Human Factors in Computing Systems (The Hague, The Netherlands,
Apr. 1–6). ACM Press, New York, 2000, 129–136.
2. Ishii, H., Ratti, C., Piper, B., Wang, Y., Biderman, A., and Ben-Joseph,
E. Bringing clay and sand into digital design: Continuous tangible user
interfaces. BT Technology Journal 22, 4 (2004), 287–299.
3. Ishii, H. and Ullmer, B. Tangible bits: Towards seamless interfaces
between people, bits, and atoms. In Proceedings of the Conference on
Human Factors in Computing Systems (Atlanta, Mar.). ACM Press, New
York, 1997, 234–241.
4. Pangaro, G., Maynes-Aminzade, D., and Ishii, H. The Actuated Workbench: Computer-controlled actuation in tabletop tangible interfaces. In
Proceedings of the 15th Annual ACM Symposium on User Interface Software and Technology (Paris, Oct. 27–30). ACM Press, New York, 2002.
5. Patten, J. and Ishii, H. Mechanical constraints as computational constraints in tabletop tangible interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (San Jose, CA, Apr.
28–May 3). ACM Press, New York, 2007, 809–818.
6. Piper, B., Ratti, C., and Ishii, H. Illuminating Clay: A 3D tangible interface for landscape analysis. In Proceedings of the SIGCHI Conference on
Human Factors in Computing Systems: Changing Our World, Changing
Ourselves (Minneapolis, Apr. 20–25). ACM Press, New York, 2002.
7. Raffle, H., Parkes, A., and Ishii, H. Topobo: A constructive assembly
system with kinetic memory. In Proceedings of the SIGCHI Conference on
Human Factors in Computing Systems (Vienna, Apr. 24–29). ACM Press,
New York, 2004, 647–654.
8. Underkoffler, J. and Ishii, H. Urp: A luminous-tangible workbench for
urban planning and design. In Proceedings of the SIGCHI Conference on
Human Factors in Computing Systems: The CHI Is the Limit (May
15–20). ACM Press, New York, 1999, 386–393.
9. Weiser, M. The computer for the 21st century. Scientific American 265,
3 (Sept. 1991), 94–104.
HIROSHI ISHII (firstname.lastname@example.org) is the Muriel R. Cooper
Professor of Media Arts and Sciences at the MIT Media Lab,
Cambridge, MA, where he heads the Tangible Media Group and is
co-director of the Things That Think consortium.