[Photo: The surgeon requests scissors from Gestonurse using a hand signal recognized by the camera above the surgical bed.]
[Photo: Gestonurse safely hands off a pair of surgical scissors to the surgeon requesting it.]
quested instrument to the surgeon. A significant advantage of gesture-based communication is that it requires no special training by the surgeon. Gesturing comes naturally to surgeons, since their hands are already their main tools; moreover, hand signs are the standard method for requesting surgical instruments,8,19 and gestures are not affected by ambient noise in the OR. A multimodal solution combining voice and gesture provides the redundancy needed to assure proper instrument delivery.
Gestures for robotic control have been the focus of much research since the early 1980s. Early work was done with Richard A. Bolt's Put-That-There interface,2 followed by others using magnetic sensors or gloves to encode hand signs.20,21 Since then, gestures have been used in health care, military, and entertainment applications, as well as in the communication industry; see Wachs et al.26 for a review of the state of the art.
System architecture
Figure 2 outlines the Gestonurse system architecture. The streaming depth maps captured through the Kinect sensor are processed by the gesture-recognition module, while a microphone concurrently captures voice commands interpreted by the speech-recognition module. Following recognition, a command is transmitted to the robot through an application that controls a Fanuc LR Mate 200iC robotic arm over the network via a Telnet interface. Gestonurse then delivers the required surgical instrument to the surgeon and awaits the next command.
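The article does not include the control code itself; the following Python sketch of the command-dispatch step is illustrative only. The controller address, port, and DELIVER command syntax are assumptions (the Fanuc controller's Telnet protocol is not specified here), and a fixed command list stands in for the output of the gesture- and speech-recognition modules.

import socket

ROBOT_HOST = "192.168.0.10"   # hypothetical address of the robot controller
ROBOT_PORT = 23               # standard Telnet port

INSTRUMENTS = {"scissors", "scalpel", "forceps", "retractor"}

def deliver(sock, instrument):
    # Request pick-up and hand-off of one instrument (hypothetical command syntax).
    sock.sendall(f"DELIVER {instrument.upper()}\r\n".encode("ascii"))
    reply = sock.recv(1024)   # wait for an acknowledgment from the controller
    if b"DONE" not in reply:
        raise RuntimeError(f"no confirmation for {instrument}")

def dispatch(commands, sock):
    # Forward each recognized command (from either modality) to the robot in order.
    for command in commands:
        if command in INSTRUMENTS:
            deliver(sock, command)

if __name__ == "__main__":
    # In the running system these commands would come from the recognition modules;
    # a fixed list stands in for them here.
    with socket.create_connection((ROBOT_HOST, ROBOT_PORT), timeout=10) as sock:
        dispatch(["scissors", "forceps"], sock)

Treating each delivery as a blocking request keeps the loop simple: the system completes one hand-off, then waits for the next command, matching the one-instrument-at-a-time workflow described above.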
We also designed an instrument-retrieval-and-disposal system. A network camera monitors a specific region of the operating area; when a surgical instrument is recognized there, the robot picks it up and delivers it to the surgeon. Meanwhile, the surgeon's hands are tracked so the robot and surgeon do not collide, enabling safe human-robot collaboration.
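The collision check itself is not described in detail here; one plausible approach, sketched below in Python with illustrative names and an arbitrary 30 cm threshold, is to pause the arm whenever any tracked hand comes within a safety radius of the end effector.

import math

SAFETY_RADIUS_M = 0.30   # illustrative threshold, not a value from the system

def distance(p, q):
    # Euclidean distance between two 3D points given as (x, y, z) tuples in meters.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def too_close(hand_positions, end_effector):
    # True if any tracked hand lies inside the safety radius of the end effector.
    return any(distance(hand, end_effector) < SAFETY_RADIUS_M for hand in hand_positions)

# Example with made-up coordinates for two tracked hands and the tool tip.
hands = [(0.42, 0.10, 0.95), (0.05, -0.20, 1.10)]
tool_tip = (0.40, 0.12, 0.90)

if too_close(hands, tool_tip):
    print("pause robot motion")   # a real system would halt or re-plan the trajectory
else:
    print("continue delivery")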
Gesture recognition. To evoke a
command, a member of the surgical
staff places a hand on the patient’s
torso and gestures. The moment the
hand is in the field of view, the gesture is captured by the Kinect sensor
and segmented from the background
through a depth-segmentation algo-