adaptive cruise control, incorrectly. In
some cases, Norman says, these automated systems cause the car to speed
up as motorists exit a highway because
there’s suddenly no car in front. If a
driver isn’t paying attention, an accident can occur.
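To make that failure mode concrete, here is a minimal sketch, in Python with hypothetical names and values, of the kind of simplified adaptive cruise control loop Norman is describing; the article does not specify any particular implementation, and real systems are far more sophisticated.

```python
SET_SPEED = 70.0   # driver's chosen cruising speed in mph (hypothetical value)
FOLLOW_GAP = 2.0   # desired time gap behind a lead vehicle, in seconds

def target_speed(lead_speed, lead_gap):
    """Return the speed (mph) this simplified controller steers toward.

    lead_speed: speed of the detected car ahead, or None if no car is detected.
    lead_gap:   time gap to the car ahead in seconds, or None.
    """
    if lead_speed is None:
        # No car detected ahead -- for instance, the car in front just took
        # an exit ramp. A naive controller resumes the driver's set speed,
        # even if the driver is also exiting and expects to slow down.
        return SET_SPEED
    if lead_gap is not None and lead_gap < FOLLOW_GAP:
        # Too close: slow to the lead vehicle's speed (never above the set speed).
        return min(lead_speed, SET_SPEED)
    # Comfortable gap: hold the set speed without racing up on a slower car.
    return min(SET_SPEED, lead_speed + 5.0)

if __name__ == "__main__":
    print(target_speed(55.0, 1.5))   # 55.0 -- trailing a slower car too closely
    print(target_speed(None, None))  # 70.0 -- lead car gone, resumes set speed
```

The point of the sketch is the first branch: the controller's model has no notion of "the lead car left because we are both exiting," so its locally correct behavior produces exactly the unwanted acceleration described above.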
In the case of airplane pilots and
train operators, one solution is regular
training sessions in which pilots and operators are required to turn off the automated systems and operate everything manually. This can help them retain their skills and alertness.
But even this is not likely to eliminate breakdowns. Human-machine
interface failures occur for a number
of reasons, experts say. Sometimes, designers build a system on a flawed set of assumptions. They simply don’t understand the way people use technology or the cultural differences that come into play. In some instances, thousands and sometimes millions of variables exist, and capturing everything
in a single algorithm is exceedingly
difficult. In fact, Norman argues that
machine logic doesn’t necessarily jibe
with the human brain. “If you look at
‘human error’ it almost always occurs
when people are forced to think and
act like machines,” he says.
Worse, complex algorithms often
prompt humans to relate to devices as
if they were fellow human beings. As
a result, the autopilot on a plane, the
cruise control on a car, and automated
speed-control systems in mass transit
become either aids or crutches, depending on the situation.
Too often, the whole of a system is not equal to the sum of its parts, says Sidney W. A. Dekker, director of research at
the Leonardo da Vinci Center for Complexity and Systems Thinking at Lund
University in Sweden. “There is often a
great deal of human intuition involved
in a process or activity and that’s not
something a machine can easily duplicate,” says Dekker. “If you look at
delivering babies, there’s a reason we
have midwives and nurses. Machines
can monitor and help, but they can’t
detect subtle signs and they’re unable
to adapt to situations as seamlessly.”
David D. Woods, professor of cognitive engineering at Ohio State University, says that designers can easily succumb to the trap of thinking
“a little more technology will solve
the problem.” However, understanding variables and identifying possible
exceptions and disruptions are paramount. For example, when the Washington, D.C., Metro train crashed, it may have been
due to wet leaves on the tracks and a
computerized system that wasn’t programmed for such a scenario. “The
automation system functioned as it
was designed,” Woods says. “The situation simply fell outside the model of
what engineers envisioned.”
Make no mistake, human factors experts constantly scrutinize automation. Many believe that if human error
exists, it falls on the shoulders of those
engineering, designing, and programming technology. “In reality, there is no
such thing as operator error. Too often,
systems aren’t designed as a whole, and
those creating them overlook important factors,” argues Nancy Leveson,
professor of aeronautics and astronautics at Massachusetts Institute of Technology and author of the forthcoming
book Engineering a Safer World.
Yet, progress is taking place. Consider the airline industry: In 1989, there were 1.4 crashes per 1 million departures. By 2008, the number had dropped to 0.2 fatal accidents per 1 million departures. In fact, crashes
have steadily dropped over the decades while survivability has increased.
Dekker, who is a pilot and has flown
various aircraft, including a Boeing
737, says that the industry has gotten serious about stamping out flaws,
bugs, and oversights.
These improvements have taken
place because the airline industry has
moved beyond studying ergonomics
and discrete processes. In fact, Leveson
says that researchers have put a microscope to cognitive functions, psychology, cultural issues, and a variety of other components that make up human
factors. “They have evolved toward a
system view and worked to understand
how everything—hardware, software,
procedures, and humans—interact.
It’s a model that other industries must
embrace,” she says.
One thing is certain: Automation
disconnects won’t disappear anytime
soon. Leveson believes that, ultimately, the people designing systems must
take a more holistic view and get past
the notion that when a problem or
breakdown occurs it’s a result of “
human error.” She believes that universities must place a greater focus on
human factors and that programmers
and others must understand that, without a big-picture view of what they are
building, the end result will continually fall short.
Others, such as Dekker, argue that
society must examine larger issues,
including whether automation automatically translates into progress. “In
reality, not every function or process
is best automated,” he says. “In some
cases, automation simply creates new
or different tasks and doesn’t provide
any real benefit.” Automation may also
change processes to the point where people become more confused and entirely new social dynamics emerge. At that
point, he says, designers may attempt
to add new features, which only ratchet
up confusion and complexity further.
To be sure, imperfect people continue to build imperfect systems. The
need to focus on human-machine interfaces has never been greater. “
Designers, engineers, programmers, and
others must take an expansive view of
automation and understand all the
possibilities and variables,” concludes
Norman. “Only then can we build systems that improve performance and
solve real-world problems.”
Further Reading

Bainbridge, L.
Ironies of automation. New Technology and Human Error, J. Rasmussen, K. Duncan, and J. Leplat (Eds.). Wiley, Chichester, U.K., 1987.

Dekker, S.
The Field Guide to Understanding Human Error. Ashgate Publishing, Farnham, Surrey, U.K., 2006.

Dekker, S.
The Field Guide to Human Error Investigations. Ashgate Publishing, Farnham, Surrey, U.K., 2002.

Norman, D. A.
The Design of Future Things. Basic Books, New York, 2009.

Sarter, N. B., Woods, D. D., and Billings, C. E.
Automation surprises. Handbook of Human Factors and Ergonomics (3rd ed.). Wiley, New York, 2006.
Samuel Greengard is an author and freelance writer
based in West Linn, OR.