and disruptions—especially when using components without substantive audit trails or paper records that would make forensics-worthy analysis possible. Although greater system trustworthiness would be helpful, many of the existing problems are not technological, and those must be addressed as well.
˲ Recast software engineering and system engineering as true engineering disciplines, with more focus on hardware and software vulnerabilities, aspects of system trustworthiness, the importance of well-defined system requirements, proactive design, system usability, risk assessment, and computer-science theory.
˲ Revamp software-engineering educational programs to ensure graduates have the necessary abilities and resources.3
˲ Recognize there are no one-size-fits-all solutions, and that many potential trade-offs must be considered. Furthermore, technology by itself is not enough, and many other factors must be considered—especially for critical systems.
˲ Stress learning, not just teaching,
to instill an awareness of the issues discussed here from elementary school on: dealing with complexity, principles, abstraction, holistic long-term thinking rather than just short-term premature optimization, logical reasoning, altruism, and much more. Encourage
rational and logical thinking from the
outset, and later on, the use of practical
formal methods to improve the quality
of our computer systems. Formal methods have come a long way in recent years (for example, DeepSpec) and are increasingly finding their way into practice.
˲ Pervasively respect the importance
of human issues (for example, with
greater emphasis on usability, personal
privacy, and people-tolerant interfaces)
as well as issues that are less technological (for example, compromises of supply-chain integrity, environmental hazards,
and disinformation). Also, independent oversight is often desirable, as is the case, for example, with aircraft safety, business accountability, and elections.
˲ Respect history, study the literature, learn from past mistakes, and benefit from your own and others' constructive experiences.
˲ Recognize this list is incomplete and only a beginning. For example, I have not even mentioned the risks of side channels, speculative execution, or direct-memory access.
˲ Accept that we cannot build adequately trustworthy applications on top
of compromisable hardware and flawed
systems of today, particularly for those
with life-critical requirements. (The Common Vulnerabilities and Exposures list—cve.mitre.org—now includes over 120,000 vulnerabilities!) For example,
the best cryptography and useful artificial intelligence can be completely
subverted by low-level attacks, insider
misuse, and hardware failures, whereas applications are still a huge source
of vulnerabilities even in the presence
of stronger operating-system security.
Vulnerabilities tend to be pervasive. On
the other hand, building new systems or
cryptography from scratch is likely to be
riskful. Thus, having a set of trustworthy basic system and cryptographic components (for example, EverCrypt) would be a highly desirable starting point.
˲ Establish a corpus of theoretical
and practical approaches for predictable
composition of such components—
addressing both composability (requiring
the preservation of local properties) and
compositionality (requiring the analysis
of emergent properties of compositions,
some of which are vital, as in safety and
security—but some of which may be
dangerous or otherwise failure-prone,
as in exposed crypto keys and privacy
violations). Composition itself can often
introduce new vulnerabilities.
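To make that hazard concrete, here is a minimal illustrative sketch (not from this column; all class and variable names are hypothetical): two components that are each harmless in isolation, but whose composition exhibits an emergent property—a secret key leaking into a log.

```python
# Hypothetical sketch of a compositionality failure.
class Signer:
    """Holds a key and signs messages (crypto is a stand-in string)."""
    def __init__(self, key):
        self.key = key
    def sign(self, msg):
        return f"sig({self.key},{msg})"
    def __repr__(self):                 # a debugging aid, harmless alone
        return f"Signer(key={self.key})"

class DebugLogger:
    """Records a repr() of whatever it is handed -- harmless alone."""
    def __init__(self):
        self.lines = []
    def log(self, obj):
        self.lines.append(repr(obj))

# The composition: generic repr-based logging now exposes the key,
# an emergent property neither component has by itself.
signer = Signer("s3cret")
logger = DebugLogger()
logger.log(signer)
assert "s3cret" in logger.lines[0]      # the key has leaked into the log
```

Neither component violates its own local specification; the leak appears only in the composition, which is exactly why emergent properties must be analyzed rather than assumed away.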
˲ Develop and systematically use more ways to reliably increase trustworthiness through composition. Desirable approaches might include (for example) the use of error-correcting codes, cryptography, redundancy, cross-checks, architectural minimization of what has to be trusted, strict encapsulation, and hierarchical layering that avoids adverse dependencies on less-trustworthy components.
˲ Whenever a technology or a component might be unsound, its trustworthiness (for both composition and compositionality) must be independently evaluated—for example, when using machine learning in life-critical applications.
˲ Adopt and honor underlying principles of computer systems, especially
with respect to total-system trustworthiness for safety and security.
˲ Eschew the idea of inserting back
doors (or not patching existing ones)
in computer and communication systems.1 It should be intuitively obvious that if back doors in systems have exploitable vulnerabilities, they would be exploited by people and programs supposedly not authorized to use them.
Nevertheless, governments repeatedly
fantasize that there can be bypasses
that would be securely accessible only
to ‘authorized’ entities. (Consider again
the first bullet item in this column.)
˲ Recognize that trustworthiness in
the “Internet of Things” may always be
suspect (for example, regarding security, integrity, human safety, and privacy).
Various hardware-software and operational approaches must be developed
(perhaps easily securable and locally
maintainable firewalls?) that can help
control and monitor how various classes
of devices can more soundly be connected to the Internet. Self-driving vehicles,
fully autonomous highways, and totally interconnected smart cities imply
the components and the total systems
must be significantly more trustworthy.
However, ubiquitously putting your crown jewels on the IoT would seem to be excessively unwise.
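One minimal sketch of the "easily securable, locally maintainable firewall" idea is a default-deny policy table at the local gateway, keyed by device class. This is purely illustrative (the policy contents and hostnames are hypothetical):

```python
# Hypothetical sketch: default-deny egress policy for IoT device classes.
# Each device class may reach only an explicit allowlist of destinations.
POLICY = {
    "thermostat": {"vendor-update.example.com"},
    "camera":     {"local-nvr.lan"},          # camera talks only to local storage
}

def allowed(device_class, destination):
    """Permit a connection only if the destination is allowlisted
    for that device class; unknown classes are denied everything."""
    return destination in POLICY.get(device_class, set())

assert allowed("camera", "local-nvr.lan")
assert not allowed("camera", "exfil.example.net")   # blocked egress
assert not allowed("unknown-gadget", "anywhere")    # default-deny
```

The design choice worth noting is default-deny: a device class absent from the table can reach nothing, so forgetting to configure a new gadget fails safe rather than open.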
˲ Accept the fact that the use of remote processors and storage (for example, cloud computing) will not necessarily make your computer systems more
trustworthy, although there are clearly
considerable cost and operational savings that can result from not having to
manage local hardware and software.
Nevertheless, trusting trustworthy
third-party cloud providers would be
more desirable than attempting to create one’s own. As in other cases, there
are many trade-offs to be considered.
˲ Accept the reality that we cannot
build operational election systems that
are trustworthy enough to withstand
hacking of registration databases, insider misuse, rampant disinformation