DEMO HOUR

2. HORN—Ultrasound Airborne Volumetric Haptic Display

Interaction with mid-air floating virtual objects expands human-computer interface possibilities. Here, we propose a system that superimposes haptic volumetric sensations on mid-air floating images by using acoustic potential distribution. Our surrounding phased-array system freely produces 3D spatial patterns of ultrasonic standing waves, which create various feelings of elastic and textured surfaces. The ultrasound does not affect the optical images and can be controlled quickly in this interactive system. The combination of 3D volumetric vision and this haptic technology flexibly displays the presence of 3D objects that can be pinched, handled, and manipulated.

http://www.hapis.k.u-tokyo.ac.jp/?portfolio=english-horn-hapt-optic-reconstruction&lang=en
https://www.youtube.com/watch?v=7Ibdv0rtiDE

Inoue, S., Kobayashi, K., Monnai, Y., Hasegawa, K., Makino, Y., and Shinoda, H. HORN: The hapt-optic reconstruction. Proc. of SIGGRAPH 2014, Emerging Technologies. ACM, New York, 2014, Article 11.

Seki Inoue, The University of Tokyo
→ seki_inoue@ipc.i.u-tokyo.ac.jp
Keisuke Hasegawa, The University of Tokyo
→ keisuke_hasegawa@ipc.i.u-tokyo.ac.jp
Yasuaki Monnai, The University of Tokyo
→ yasuaki_monnai@ipc.i.u-tokyo.ac.jp
Yasutoshi Makino, The University of Tokyo
→ yasutoshi_makino@k.u-tokyo.ac.jp
Hiroyuki Shinoda, The University of Tokyo
→ hiroyuki_shinoda@k.u-tokyo.ac.jp

3. MaD: Mapping by Demonstration for Continuous Sonification

MaD allows for simple and intuitive design of continuous sonic gestural interaction. When movement and sound examples are jointly recorded, the system automatically learns the motion-sound mapping. Our applications focus on using vocal sounds—recorded while performing actions—as the primary material for interaction design. The system integrates probabilistic models with hybrid sound synthesis. Importantly, the system operates independently
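The HORN demo depends on a surrounding ultrasound phased array that concentrates acoustic pressure at chosen points in mid-air. As an illustrative sketch only — the 40 kHz frequency, linear array geometry, and function name below are textbook assumptions, not details taken from the HORN system — the basic focusing step can be written as choosing each transducer's emission phase so that all waves arrive at the focal point in phase:

```python
import math

# Assumed parameters: 40 kHz airborne ultrasound (common in mid-air
# haptics research) and a speed of sound of ~343 m/s at room temperature.
SPEED_OF_SOUND = 343.0                   # m/s
FREQUENCY = 40_000.0                     # Hz
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY  # ~8.6 mm

def focus_phases(transducers, focal_point):
    """For each transducer position (x, y, z) in meters, return the phase
    offset (radians) that makes all emitted waves arrive at focal_point
    in phase, producing a pressure focus there."""
    k = 2.0 * math.pi / WAVELENGTH       # wavenumber
    phases = []
    for tx, ty, tz in transducers:
        d = math.dist((tx, ty, tz), focal_point)   # propagation distance
        # Advance the emission by the travel phase k*d, wrapped to [0, 2π)
        phases.append((k * d) % (2.0 * math.pi))
    return phases

# Example: a 4-element linear array (1 cm pitch) focusing 20 cm above it
array = [(x * 0.01, 0.0, 0.0) for x in range(-2, 2)]
print(focus_phases(array, (0.0, 0.0, 0.2)))
```

A real system such as the one described would go further — superposing and rapidly updating many such foci to shape the 3D acoustic potential distribution into standing-wave patterns — but the per-transducer phase alignment above is the common starting point for any focusing phased array.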