[CONTINUED FROM P. 144]

Agency) to develop vision capabilities for Autonomous Land Vehicles, or ALVs. But the arrangement didn't work very well, because we didn't have a real car that gave us real input in real time. Instead, we were given video and asked to analyze whether we could recognize this or that. But recognition is relative to what the car does. Recognition and action and the way that input changes form a loop, and you have to understand that whole loop yourself.

So you began building your own ALVs, and later, cars.

In the beginning, the work was on natural terrain, but that's harder to do in the city. So we began to use a pathway in Schenley Park, and our interests shifted to path and road following.

At one point, I understand that a tree presented something of an obstacle.

I often tell that story. One of the trees in the park accidentally aligned with a path when seen from a particular direction. Of course, in a three-dimensional world, the tree edge is a vertical edge and the road edge is a horizontal edge. But as a two-dimensional picture, they appeared aligned, as if they were a single object. At that time, we didn't have a good understanding of how to deal with appearance vs. reality, so we just tried to see whether a pair of edges created the appearance of a road boundary, like parallel lines.

And the edges of this particular tree trunk fooled the system.

Yes, whenever the car went in the wrong direction, we had to push the "kill" button. We called it the killer tree. We once asked students how we could avoid this problem, and the best answer was, "Call up the Pittsburgh Parks Department and ask them to cut the tree."

What are you working on now?

One project I'm working on with Professor Srinivasa Narasimhan is smart headlights. When you're driving at night or on a rainy or snowy day, the raindrops or snowflakes appear as bright dots in your visible scene, because they are very reflective. But if you replace the headlight with a projector, you can turn the individual rays on and off. Then, by recognizing where the raindrops are, you can turn off just the rays that will hit the raindrops, so that you would not be able to see them and could drive as if it were a fine day. I had this idea a long time ago, but could not do it. Professor Narasimhan's team made this magic real.

That sounds amazing.

The real use of raindrop control may still be a little ways off. But our system can recognize oncoming cars and turn off just the rays that would hit the driver's eyes, so that you don't have to switch between your high and low beams. This is what I call augmented reality. What people call "augmented reality" is often an application in which a smartphone or a head-mounted display appends additional information to the object you're viewing. But that's not augmented reality. That's the augmented display of reality. With smart headlights, reality is indeed augmented, because to your naked eye, it appears to be more informative.

You're also involved with a number of Quality of Life Technology initiatives that are focused on developing systems that enable people with disabilities to live more independently.

The work has given me a different perspective on robotics. In autonomous systems, the implicit goal is to reduce or even eliminate human involvement. That's natural for applications like space and defense, where it's difficult, undesirable, or impossible for humans to go. In Quality of Life robotics, people are part of the system, and people want to be helped only when they need it; otherwise, they want to do things themselves. That is probably the most important aspect that we have to respect.

So your goal is to figure out how to increase human involvement.

In order for the system to work, we have to understand each component. I often say that people are the weakest link, because they are the least understood. In cars, for example, we have worked so hard to understand mechanical factors like tire traction, air pressure and resistance, and so on. But we don't understand the driver, and that's why I think we still have a lot of accidents.

Leah Hoffmann is a technology writer based in Piermont, NY.

© 2016 ACM 0001-0782/16/12 $15.00