There are more interaction possibilities than just changing a drone's position. Drones can be extended with additional sensors and actuators, opening up a whole new design space of input and output modalities for interacting with drones.
HOW ARE DRONES USED IN HCI?
Using drones for research projects
has become more popular in recent
years. For example, drones have been
used for carrying cameras, carrying
screens and projectors, providing
a tactile interface, and for their
bare presence in 3D space. In the
following, I list a few examples from
the HCI research community.
Flying cameras. Drones can be
used for carrying a camera to any
position and orientation in 3D space.
This feature was used in the project DroneLand Art (https://www.youtube.) to create art installations outdoors. The installations were not visible to an observer on the ground; the art appeared only when a drone filmed the environment from a specific aerial angle.
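Pointing a drone-mounted camera at a target from a given aerial position reduces to a small geometry problem: compute the yaw and pitch that aim the camera along the line from drone to target. A minimal sketch, assuming an east-north-up coordinate frame in metres; the `aim_camera` helper is hypothetical and not part of the DroneLand Art project:

```python
import math

def aim_camera(drone, target):
    """Return (yaw, pitch) in degrees that point a camera mounted on a
    drone at a target. Positions are (x, y, z) tuples: x east, y north,
    z up. Yaw is measured clockwise from north; negative pitch looks down."""
    dx = target[0] - drone[0]
    dy = target[1] - drone[1]
    dz = target[2] - drone[2]
    yaw = math.degrees(math.atan2(dx, dy))          # heading toward target
    ground_dist = math.hypot(dx, dy)                # horizontal distance
    pitch = math.degrees(math.atan2(dz, ground_dist))
    return yaw, pitch
```

For instance, a drone hovering 10 m above the origin that should film a point 10 m to its east on the ground would yaw 90 degrees (due east) and pitch down 45 degrees.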
Flying screens and projectors.
Screens and projectors are the
traditional output media to display
content from a computer to a
user. Research in human-drone interaction has investigated screens attached to drones. For example, Schneegass et al. [2] envisioned that pervasive flying screens, so-called midair displays, could show evacuation instructions in emergency scenarios or guide the way at sporting events and tourist sites. Gomes et al. [3] used their BitDrones to carry displays for Skype conversations on a flying screen, allowing callers to position themselves freely in the physical environment of the person they called.
Further, Knierim et al. [4] attached a projector to a drone to provide mobile, in situ projected navigation instructions. With directions projected into their physical environment, users no longer need to look at their smartphones and can instead concentrate on enjoying their surroundings.
Image caption: Drones can now handle a payload to carry sensors and actuators.