Technology | DOI: 10.1145/2160718.2160724
Robots Like Us
Thanks to new research initiatives, autonomous
humanoid robots are inching closer to reality.
The science fiction robots of our youth always looked reassuringly familiar. From the Iron Giant to C-3PO, they almost invariably sported
two arms, a torso, and some type of metallic head.
Real-world robots, alas, have largely
failed to live up to those expectations.
Roombas, factory assembly arms, and
even planetary rovers all seem a far cry
from the Asimovian androids we had
imagined.
That may be about to change, however, thanks to a series of engineering
advances and manufacturing economies of scale that could eventually
bring autonomous humanoid robots
into our everyday lives. Before Rosie
the Robot starts clearing away the dinner dishes, however, roboticists still
need to overcome several major technical and conceptual hurdles.
In June 2011, U.S. President Obama
announced the National Robotics Initiative, a $70 million effort to fund the
development of robots “that work beside, or cooperatively with people.” The
government sees a wide range of potential applications for human-friendly robots, including manufacturing, space
exploration, and scientific research.
While roboticists have already made
major strides in mechanical engineering—designing machines capable of
walking on two legs, picking up objects,
and navigating unfamiliar terrain—
humanoid robots still lack the sophistication to work independently in unfamiliar conditions. With the aim of making
robots that can function more autonomously, some researchers are honing
their strategies to help robots sense and
respond to changing environments.
Honda’s well-known ASIMO robot
employs a sophisticated array of sensors to detect and respond to external
conditions. When ASIMO tries to open
a thermos, for example, it uses its sensors to gauge the shape of the object,
then chooses a sequence of actions
appropriate to that category of object.
ASIMO can also sense the contours of
the surface under its feet and adjust its
gait accordingly.
Negotiating relationships with the
physical world may seem difficult
enough, but those challenges pale in
comparison to the problem of interacting
with some of nature’s most unpredictable variables: namely, human beings.
“People and robots don’t seem to
mix,” says Tony Belpaeme, reader in
Intelligent Systems at the University of
Plymouth, whose team is exploring new
models for human-robot interaction.
The problem seems to cut both
ways. On the one hand, robots still
have trouble assessing and responding to human beings' often unpredictable behavior; on the other hand, human beings often feel uncomfortable around machines that look and act human.
Roboticists call this latter phenomenon the “uncanny valley”—the
sense of deep unease that often sets
in when human beings try to negotiate relationships with human-looking
machines. The more realistic the machine, it seems, the more uncomfortable we become.
“Robots occupy this strange category where they are clearly machines,
but people relate to them differently,”
says Bill Smart, associate professor of
computer science and engineering at
Washington University in St. Louis.
“They don’t have free will or anima,
but they seem to.”
Perhaps no robot has ever looked so
uncannily human as the Geminoids,
originally designed by Osaka University
professor Hiroshi Ishiguro in collabo-
ration with the Japanese firm Kokoro.
To date the team has developed three
generations of successively more life-
like Geminoids, each modeled after an
actual human being.
The latest Geminoid is a hyper-realistic facsimile of associate professor
Henrik Scharfe of Aalborg University, a
recent collaborator on the project. With
its Madame Tussaud-like attention
to facial detail, the machine is a dead
ringer for Scharfe. But the Geminoid's movement and behavior, while meticulously calibrated, remain unmistakably, well, robotic.
In an effort to create more fluid, human-like interactions, some researchers are starting to look for new perspectives beyond the familiar domains of
computer science and mechanical engineering, to incorporate learning from
psychology, sociology, and the arts.
Belpaeme’s CONCEPT project aspires to create robots capable of making
realistic facial expressions by embracing what he calls “embodied cognition.” For robots to interact effectively
with us, he believes, they must learn
to mimic the ways that human intelligence is intimately connected with the
shape and movements of our bodies.
To that end, Belpaeme’s team is developing machine learning strategies
modeled on the formative experiences
of children. “Our humanoid robots can
be seen as young children that learn, explore, and that are tutored, trained, and
taught,” he says. The robots learn
language by interacting directly with
people, just as a child would.
In order to make those learning exchanges as smooth as possible, the