and altruism. A knee-jerk attempt to rein in social engineering could involve eliminating these very desirable social attributes (which might also eliminate civility and decency from our society).

How Does This All Fit Together?
It should be fundamental to readers of Inside Risks articles that point solutions to local problems are generally insufficient, and that we must consider trustworthiness in the total-system context that includes hardware, software, networking, people, environmental concerns, and more. On September 22, 1988, Bob Morris (then chief scientist of the National Computer Security Center at NSA) said in a session of the National Academies' Computer Science and Telecommunications [now Technology] Board on problems relating to security, "To a first approximation, every computer in the world is connected with every other computer." That quote is even more relevant today, almost 30 years later. Similarly, all of the issues considered in this column involving computers and people may be intimately intertwined.

Science is never perfect or immutable; it is often a work in progress. Hence, scientists can rarely if ever know they have the absolute final answer. However, scientific methods have evolved over time, and scientists generally welcome challenges and disagreements that can ultimately be resolved through better theories, experimental evidence, and rational debate. Occasionally, we even find fake science and untrustworthy scientists, although these aberrations tend to be refuted eventually via peer pressure. Where science has strong credible evidence, it deserves to be respected, because in the final analysis reality should be able to trump fantasies (although in practice this may not always succeed).

Truth is perhaps even more in flux than science, and certainly relative rather than absolute, with many caveats. However, truth matters. We might paraphrase the oft-cited Albert Einstein quote as "Everything should be stated as simply as possible, but not simpler." Oversimplifications, lack of foresight, and a seriously non-objective perspective are often sources of serious misunderstandings, and can result in major catastrophes. On the other hand, untruthfulness must not be confused with truth, even though that confusion appears remarkably common. People who believe everything they read on Facebook, Google, Amazon, Twitter, and other Internet sites are clearly delusional. People who are less aware of technology-related risks tend to assume computers are perfect, while computers have little respect for people. Neither computer behavior nor human behavior is always perfect, nor should either be expected to be. There are significant risks in blindly believing in computer trustworthiness and human truthfulness. We must not believe in computer infallibility, or in everything we read on the Internet in the absence of credible corroboration. But then we should also not believe people who pervasively dishonor truthfulness.

Unfortunately, the trends for the future seem relatively bleak. Computer system trustworthiness and the implications of its absence are increasingly being questioned. For example, a recent article by Bruce G. Blair ("Hacking our Nuclear Weapons," The New York Times, Mar. 14, 2017) suggests "Loose security invites a cyberattack with possibly horrific consequences." Semi- and fully autonomous systems, the seemingly imminent Internet of Things, and artificial intelligence are providing further examples in which increasing complexity leads to obscure and unexplainable system behavior. The concept of trustworthiness seems to be becoming supplanted by people falsely placing their trust in systems and people that are simply not trustworthy, without any strong case being made for the safety, security, or assurance that might otherwise be found in regulated critical industries such as aviation. However, the risks of false would-be "facts" may be the ultimate danger. An obvious consequence might be an extensive institutional loss of trust in what is neither trustworthy nor truthful. Truth and trustworthiness may be even more important now than ever before.

Peter G. Neumann (firstname.lastname@example.org) moderates the ACM Risks Forum and is Senior Principal Scientist in SRI International's Computer Science Lab. He is grateful to Donald Norman for considerable useful feedback.

Copyright held by author.