space to be shown at their exact
position. Additional interactions
on the device and with the device
are possible—for instance, moving
closer to the wall or performing a
gesture for zooming or selecting
data. (See Lisa Cowan’s sidebar
here for examples of how people
use projectors in the wild.)
This flashlight or spotlight [3]
form of projection has similarities
with the well-known augmented
reality (AR) applications, in which
real objects are overlaid with additional information. Traditional AR typically requires a head-mounted display or a handheld device serving as a “magic lens.” These AR
setups limit the view to a single
user equipped with either of these
devices. In contrast, mobile projections are visible to everybody at
the same time and have the potential to truly augment objects in our
environment without hampering
collaboration.
We can go beyond using a wall
as a backdrop for illuminating an
otherwise static information space.
Just imagine the content displayed on the wall moving or being animated. Characters in a game would, for example, try to escape your projected display frame. To follow them, you would need to
move your handheld projector in
their direction. Tying together movement-sensor input and projector output within a single handheld device is the intriguing idea behind the motion beam metaphor. (See Karl D.D. Willis’s sidebar on collaborative interactions.)
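To make the motion beam idea concrete, here is a minimal sketch in Python of the sense-and-project loop: the projected frame follows the device’s orientation, and the system checks whether a fleeing game character is still inside the beam. The geometry constants, the simulated sensor, and the function names are illustrative assumptions, not details of any published system.

import math

# Minimal motion-beam loop (a sketch; constants and the simulated
# "sensor" values are assumptions for illustration).
WALL_DISTANCE_M = 2.0                 # projector-to-wall distance
THROW_HALF_ANGLE = math.radians(12)   # half of the projector's throw angle

def frame_center_on_wall(yaw, pitch):
    """Where the beam's center lands on the wall plane (meters)."""
    return (WALL_DISTANCE_M * math.tan(yaw),
            WALL_DISTANCE_M * math.tan(pitch))

def character_visible(char_pos, yaw, pitch):
    """True if the character lies inside the projected frame."""
    cx, cy = frame_center_on_wall(yaw, pitch)
    half = WALL_DISTANCE_M * math.tan(THROW_HALF_ANGLE)
    return abs(char_pos[0] - cx) <= half and abs(char_pos[1] - cy) <= half

# Simulated play: the character drifts right; the player must pan to follow.
char, yaw, pitch = [0.0, 0.0], 0.0, 0.0
for step in range(5):
    char[0] += 0.3                    # the character tries to escape
    if not character_visible(char, yaw, pitch):
        yaw = math.atan2(char[0], WALL_DISTANCE_M)   # player pans the device
    print(step, round(char[0], 1), character_visible(char, yaw, pitch))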
Given the dynamic and mobile
nature of projections, we can
envision ways of collaborative
interaction, too. The work of Cao, Forlines, and Balakrishnan [4] and follow-up research have investigated multiple users interacting with each other using “display torches.”
PRESENT AND FUTURE USES
Lisa Cowan
We have begun to document how people use projectors in the wild. Our initial observations reveal that, in addition to the expected passive uses, many active practices have emerged. For example, one participant in our study projected
fire onto a colleague’s back at work to mischievously imply that she was burning,
and “swam” with projected sharks in his bedroom to convey an impression of
immersion while telling a story to a friend. Thus, even the basic projector phone
platform supports novel interaction experiences.
Turning from basic platforms to future forms, we are designing new projection-specific interaction techniques to explore a broader array of possible uses. We
created ShadowPuppets, a technique that allows users to cast hand shadows
as input to mobile projector phones, precluding the need for visual attention
to the handset and additionally supporting collocated input [1]. For example, if
users are looking at photos or maps together, they can zoom in or out by casting
shadows of their hands opening or closing; they can pan by moving their hand
shadows in the desired direction; and they can select particular elements by
pointing at them with their shadows (as shown in the photo here). The results
of our user studies suggest that shadows can provide a natural, intuitive way of
interacting with projected interfaces and can support collocated interaction.
• Casting shadows to interact with a map.
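As a rough sketch of how such a shadow-gesture vocabulary might be wired to a map view: the gesture names, their parameters, and the MapView class below are hypothetical, and the shadow recognizer itself (which in ShadowPuppets analyzes the camera’s view of the projection) is not shown.

# Hypothetical dispatch from recognized hand-shadow gestures to map actions.
class MapView:
    def __init__(self):
        self.zoom, self.cx, self.cy = 1.0, 0.0, 0.0
    def zoom_by(self, factor):
        self.zoom *= factor
    def pan_by(self, dx, dy):
        self.cx += dx / self.zoom
        self.cy += dy / self.zoom
    def select_at(self, x, y):
        print(f"selected element near ({x:.2f}, {y:.2f})")

def on_shadow_gesture(view, gesture, **p):
    if gesture == "hands_opening":      # shadows spread apart -> zoom in
        view.zoom_by(1.0 + p["spread"])
    elif gesture == "hands_closing":    # shadows move together -> zoom out
        view.zoom_by(1.0 / (1.0 + p["spread"]))
    elif gesture == "shadow_move":      # whole shadow translates -> pan
        view.pan_by(p["dx"], p["dy"])
    elif gesture == "point":            # fingertip shadow -> select
        view.select_at(p["x"], p["y"])

view = MapView()
on_shadow_gesture(view, "hands_opening", spread=0.25)  # zoom in
on_shadow_gesture(view, "shadow_move", dx=10, dy=-5)   # pan
on_shadow_gesture(view, "point", x=0.4, y=0.6)         # select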
ENDNOTE:
1. Cowan, L.G. and Li, K.A. ShadowPuppets: Supporting collocated interaction with mobile projector phones using hand shadows. Proc. of the 2011 Annual Conference on Human Factors in Computing Systems (CHI ’11). ACM, New York, 2011, 2707–2716.
Lisa Cowan recently completed her Ph.D. in computer science at the University of California, San
Diego (UCSD).
Again, tracking is required to display the appropriate parts of the
information space with regard to
the current position and orientation of the users’ projections.
Combining a rich vocabulary of on-the-device, with-the-device, and across-devices interaction opens up fascinating possibilities for playful user experiences.
Multiple game players could, for
example, project dynamically
changing content, such as building bricks, onto a wall. Players
would need to combine their bricks by constantly moving their projector phones, while using multi-touch gestures on the phones to adjust the bricks’ sizes to match one another.
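A sketch of the core check in such a game, with the brick representation, pinch handling, and match tolerance all assumed for illustration:

from dataclasses import dataclass

@dataclass
class Brick:
    x: float      # wall position, driven by the projector's motion
    y: float
    size: float   # edge length, driven by multi-touch pinch gestures

def on_pinch(brick, scale):
    brick.size *= scale               # pinch out > 1.0, pinch in < 1.0

def can_combine(a, b, tol=0.05):
    overlapping = (abs(a.x - b.x) < (a.size + b.size) / 2 and
                   abs(a.y - b.y) < (a.size + b.size) / 2)
    return overlapping and abs(a.size - b.size) <= tol

p1, p2 = Brick(0.0, 0.0, 0.30), Brick(0.2, 0.0, 0.42)
on_pinch(p2, 0.75)                    # player 2 pinches in: 0.42 -> 0.315
print(can_combine(p1, p2))            # True: overlapping, sizes match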