Inside Risks
Trustworthiness and Truthfulness Are Essential
Their absence can introduce huge risks …
DOI: 10.1145/3084344

TRUSTWORTHINESS IS AN attribute that is fundamental to our technologies and to our human relations. Overly trusting something that is not trustworthy often leads to bad results. Not trusting something that really is trustworthy can also be harmful.

In many of the past 240 Inside Risks columns, we have been concerned extensively with trustworthiness, which should be a basic requirement of all computer-related systems, particularly when used in mission-critical applications, but also in personal settings such as maintaining your own quality of life. Trustworthiness is absolutely essential to the proper behavior of computers and networks, and to the well-being of entire nations and industries that rely on the proper behavior of their computer-based enterprises.

Computer-Based Systems and People

Trustworthy system behavior typically depends on the trustworthiness of people: for example, system designers, hardware developers and programmers, operational staff, and high-level managers. Many systems that might have some assessment of trustworthiness can nevertheless be seriously compromised by malware, external adversaries, and insider misuse, or otherwise disrupted by denial-of-service attacks. If such compromises arise unexpectedly, then those systems were most likely not as trustworthy as had been believed.

Thus, we need system designs and implementations that are tolerant of people who might usually be trustworthy but who make occasional errors, as well as systems that are resistant to, and resilient following, many other potential adversities. More important, we need measures of assurance that assess how trustworthy a system might actually be in certain circumstances (albeit typically evaluated only against perceived threats). Unfortunately, some of the assumptions made prior to the evaluation process may have been wrong, or may change over time: for example, as new types of threats are detected and exploited.

In addition to the trustworthiness or untrustworthiness of people relevant to their interactions with computers in the above sense, trustworthiness, and specifically personal integrity, are also meaningful attributes of people and governments in their daily existence. In particular, truthfulness and honesty are typically thought of as trustworthiness attributes of people. The question of whether a particular computer system is honest would generally not be considered, because such a system has no moral compass to guide it. However, truthfulness is another matter. A system might actually be considered dishonest, or even untruthful, if it consistently or even intermittently gives wrong answers in certain cases, especially if it had been programmed explicitly to do exactly that. For example, such behavior has been associated with certain proprietary voting systems; see Douglas W. Jones and Bar-