ernments and ways of life through what is sometimes known by the military as influence operations {24.09}.
Before launching into the principles, one more important point needs
to be made: Engineers are responsible
for the safety and security of the systems they build {19.13}. In a conversation with my mentor’s mentor, I
once made the mistake of using the
word customer to refer to those using
the cybersecurity systems we were designing. I will always remember him
sharply cutting me off and telling me
that they were “clients, not customers.” He said, “Used-car salesmen
have customers; we have clients.”
Like doctors and lawyers, engineers
have a solemn and high moral responsibility to do the right thing and keep
those who use our systems safe from
harm to the maximum extent possible, while informing them of the risks
they take when using our systems.
In The Thin Book of Naming Elephants,⁵ the authors describe how the National Aeronautics and Space Administration (NASA) shuttle-engineering culture slowly and unintentionally transmogrified from one adhering to a policy of “safety first” to one of “better, faster, cheaper.” This change discouraged
engineers from telling truth to power,
including estimating the actual probability of shuttle-launch failure. Management needed the probability of launch
failure to be less than 1 in 100,000 to
allow launch. Any other answer was an
annoyance and interfered with on-time
and on-schedule launches. In an independent assessment, Richard Feynman found that when engineers were
allowed to speak freely, they calculated
the actual failure probability to be 1 in 100.⁵ The engineering cultural failure
killed many great and brave souls in
two separate shuttle accidents.
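The gap between those two estimates compounds across a flight program. A back-of-the-envelope calculation (mine, not from the article) over the 135 Space Shuttle missions actually flown shows how much it matters which number is true:

```python
def prob_at_least_one_failure(p_per_launch: float, launches: int) -> float:
    """Probability of at least one failure across independent launches."""
    return 1.0 - (1.0 - p_per_launch) ** launches

FLIGHTS = 135  # total Space Shuttle missions flown

# Management's required estimate: 1 in 100,000 per launch
print(prob_at_least_one_failure(1 / 100_000, FLIGHTS))  # ~0.001

# Engineers' estimate, per Feynman: 1 in 100 per launch
print(prob_at_least_one_failure(1 / 100, FLIGHTS))      # ~0.74
```

At the engineers’ estimate, the expected number of failures over 135 flights is about 1.35, and the chance of losing at least one orbiter is roughly three in four; two orbiters were in fact lost.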
I wrote Engineering Trustworthy Systems and this article to help enable and
encourage engineers to take full charge
of explicitly and intentionally managing system risk, from the ground up,
in partnership with management and
other key stakeholders.
Principles
It was no easy task to choose only 5% of the principles to discuss. When in doubt, I chose principles that may be less obvious to the reader, to pique curiosity.
Students of cybersecurity must be students of cyberattacks and adversarial behavior who can design attacks and defenses. It is now
time to begin abstracting and codify-
ing this knowledge into principles of
cybersecurity engineering. Such prin-
ciples offer an opportunity to multiply
the effectiveness of existing technol-
ogy and mature the discipline so that
new knowledge has a solid foundation
on which to build.
Engineering Trustworthy Systems⁸
contains 223 principles organized into
25 chapters. This article will address
10 of the most fundamental principles
that span several important categories
and will offer rationale and some guidance on applying those principles to design. Under each primary principle, related principles are included as part of the discussion.
For those inclined to read more in Engineering Trustworthy Systems, each stated principle is followed by a reference of the form “{x.y},” where x is the number of the chapter in which the principle appears and y indicates that it is the y-th principle listed in that chapter (the principles are not explicitly numbered in the book).
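To make the notation concrete, here is a minimal sketch (a hypothetical helper of mine, not anything from the book) that parses such a reference into its chapter and principle indices:

```python
import re

def parse_principle_ref(ref: str) -> tuple[int, int]:
    """Parse a reference like '{19.13}' into (chapter, principle_index)."""
    match = re.fullmatch(r"\{\s*(\d+)\s*\.\s*(\d+)\s*\}", ref.strip())
    if match is None:
        raise ValueError(f"not a principle reference: {ref!r}")
    return int(match.group(1)), int(match.group(2))

# parse_principle_ref("{19.13}") -> chapter 19, 13th principle in that chapter
```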
Motivation
Society has reached a point where it is
inexorably dependent on trustworthy
systems. Just-in-time manufacturing,
while achieving great efficiencies,
creates great fragility to cyberattack,
amplifying risk by allowing effects
to propagate to multiple systems
{01.06}. This means that the potential
harm from a cyberattack is increasing and now poses an existential threat to institutions. Cybersecurity is no longer
the exclusive realm of the geeks and
nerds, but now must be considered as
an essential risk to manage alongside
other major risks to the existence of
those institutions.
The need for trustworthy systems
extends well beyond pure technology.
Virtually everything is a system from
some perspective. In particular, essential societal functions such as the military, law enforcement, courts, societal
safety nets, and the election process
are all systems. People and their beliefs
are systems and form a component of
larger societal systems, such as voting.
In 2016, the world saw cyberattacks transcend technology targets to target wetware—human beliefs and propensity to action. The notion of hacking democracy itself came to light,¹⁰ posing an existential threat to entire gov-