Technology | DOI: 10.1145/2742486 Gregory Mone
A Human Touch
Empowering smart machines with tactile feedback
could lead to tremendous new applications.
IN A NORTHEASTERN University lab in Boston, MA, a red and roughly humanoid robot named Baxter spots a USB plug dangling from a cord. Baxter, made by Boston’s Rethink Robotics, reaches out and grasps the plug between two fingers, then slowly guides it downward. On the desk below, a USB socket is plugged into a power strip. A camera mounted on Baxter’s arm helps the robot locate the socket; then the hand guides the USB plug into place, wriggling it briefly before completing the connection.

A job this delicate is difficult enough for people, let alone robots. Machines have long been able to use vision to identify the plug and socket, but to complete this task autonomously, Baxter relied on another type of feedback: a sense of touch. The robot was equipped with a new high-resolution GelSight tactile sensor. GelSight, developed by Massachusetts Institute of Technology (MIT) engineer Edward Adelson and put to work on Baxter with the help of Northeastern University roboticist Robert Platt, does not just tell the robot whether it is holding something in its hand; the sensor provides high-resolution feedback that allows the robot to identify the object from its shape and imprinted logo, then determine its orientation in space. As a result, Baxter can figure out whether it is holding the plug the right way and inserting it into the socket correctly.

Humans do the same thing when handling small objects; we rely on feel. In fact, the human hand is evidence that equipping robots with a sense of touch could be incredibly powerful, leading to new applications in surgery, manufacturing, and beyond. “We have in our own hands a proof that tactile sensing is very important,” says Bill Townsend, CEO of robotics manufacturer Barrett Technology. “The question is how do we get from where we are with tactile sensors now to what we know can be possible based on what our own hands can do.”

Researchers have been working to develop powerful tactile sensors for decades, but few of these technologies have successfully transitioned out of the academic lab. “There have been lots and lots of designs, but nobody has ever come up with a really reliable tactile sensor,” says Ken Goldberg, director of the Center for Automation and Learning for Medical Robotics at the University of California, Berkeley. “It’s easy to build a camera and get high resolution at a low cost, but a sense of touch is extremely difficult. It’s just a very hard problem.”

Platt hopes GelSight could be one possible solution. The sensor consists of a square of synthetic rubber coated on one side with metallic paint. The other side of the rubber is attached to a transparent piece of plastic equipped with light-emitting diodes (LEDs) and a small camera. The camera captures the LED light as it bounces off the layer of metallic paint, and a computer processes this data.

When the robot grips that USB plug, the layer of rubber and paint deforms as if the device had been pressed into a piece of clay. This change in the surface also changes the way the LED light is reflected. The computer interprets these alterations, generating a fine-grained picture of what the robot has in its hand. “It’s able to feel the geometry of the surface it’s touching in a very
Equipped with the high-resolution GelSight tactile sensor, a robot can grasp a cable
and plug it into a USB port.
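The pipeline the article describes — LEDs illuminate the deformed, painted gel, a camera records the reflected light, and a computer recovers the surface geometry — is in the same family as photometric stereo, where images taken under several known light directions are used to estimate surface normals. The following toy sketch illustrates that general idea only; it is not GelSight's actual software, and the function name and array shapes are assumptions made for illustration.

```python
import numpy as np

def estimate_normals(images, light_dirs):
    """Illustrative photometric-stereo sketch (not GelSight's real code).

    images:     (k, h, w) array of pixel intensities, one image per light.
    light_dirs: (k, 3) array of unit vectors toward each light source.
    Returns an (h, w, 3) array of estimated unit surface normals.

    Assumes a matte (Lambertian) surface, where intensity at a pixel is
    proportional to the dot product of the light direction and the normal;
    with three or more lights, each pixel's normal is a least-squares solve.
    """
    k, h, w = images.shape
    L = np.asarray(light_dirs, dtype=float)        # (k, 3) lighting matrix
    I = images.reshape(k, -1)                      # (k, h*w) stacked pixels
    # Solve L @ n = I for all pixels in one batched least-squares call.
    n, *_ = np.linalg.lstsq(L, I, rcond=None)      # (3, h*w)
    n = n.T.reshape(h, w, 3)
    # Normalize away the albedo scale to get unit normals.
    norm = np.linalg.norm(n, axis=2, keepdims=True)
    return n / np.clip(norm, 1e-9, None)
```

In a GelSight-style sensor the "surface" is the metallic paint on the back of the gel, so its reflectance is uniform and known, which is what makes the fine-grained geometry recovery the article mentions tractable.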