were vetted, and card decks and printouts were locked
in safes—all physical or administrative measures.
Process confinement, kernelized operating systems,
and formally specified programs were all a decade in the future.
Cryptography, in the limited quarters in which it
was known and practiced, would have been more recognizable but very primitive. From World War I,
when mechanized cryptography got its real start,
through World War II, most military cryptography
was mechanical, performed by electromechanical
machines whose action combined a number of table
lookups with modular arithmetic. A character to be
enciphered was put through a sequence of table
lookups interspersed with the addition of keying characters—a slow process capable of encrypting teletype
traffic but utterly inadequate for coping with voice.
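The flavor of that process can be sketched in a few lines of modern code; the tables, key, and alphabet below are purely illustrative and correspond to no particular machine.

```python
# Toy sketch of electromechanical-era encipherment: each character passes
# through a sequence of substitution-table lookups interleaved with modular
# addition of keying characters. Tables and key are illustrative only.
import string

ALPHA = string.ascii_uppercase

# Three fixed permutations of the alphabet, standing in for rotor wirings.
TABLES = [
    "EKMFLGDQVZNTOWYHXUSPAIBRCJ",
    "AJDKSIRUXBLHWTMCQGZNPYFVOE",
    "BDFHJLCPRTXVZNYEIWGAKMUSQO",
]

def encipher_char(ch: str, key_chars: str) -> str:
    """Alternate table lookups with addition of keying characters mod 26."""
    idx = ALPHA.index(ch)
    for table, k in zip(TABLES, key_chars):
        idx = ALPHA.index(table[idx])         # table lookup
        idx = (idx + ALPHA.index(k)) % 26     # modular addition of a key character
    return ALPHA[idx]

def encipher(text: str, key: str) -> str:
    out = []
    for i, ch in enumerate(text):
        # Take the next few keying characters from a repeating key stream.
        ks = "".join(key[(i + j) % len(key)] for j in range(len(TABLES)))
        out.append(encipher_char(ch, ks))
    return "".join(out)

print(encipher("ATTACKATDAWN", "QKEYS"))
```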
The 1950s were dominated by the effort to bring
the speed of encryption closer to the speed of the
modern world. Military cryptography—to the degree
it had gone beyond rotor machines—consisted primarily of what are called long-cycle systems. The
backbones of these systems were linear feedback shift
registers of maximal period. Two techniques were
used to make the output (which was to be XORed
with the plain text) nonlinear. The registers stuttered
(paused or skipped states about half the time), and the
output came from tapping a number of stages and
combining them into one bit with nonlinear combinational logic.
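In modern terms, such a generator might be sketched as follows; the feedback taps, the stuttering rule, and the combining function are illustrative stand-ins, not any fielded design.

```python
# Toy long-cycle keystream generator: a maximal-period linear feedback shift
# register whose clocking "stutters" and whose output bit is a nonlinear
# combination of several tapped stages. All parameters are illustrative.

def lfsr_step(state: int, nbits: int = 16, taps=(15, 13, 12, 10)) -> int:
    """One clock of a Fibonacci LFSR; these taps give the maximal period 2**16 - 1."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & ((1 << nbits) - 1)

def keystream(state: int, n: int):
    """Yield n keystream bits, with irregular clocking and a nonlinear filter."""
    for _ in range(n):
        state = lfsr_step(state)
        if (state >> 7) & 1:          # "stutter": take an extra step about half the time
            state = lfsr_step(state)
        a = (state >> 0) & 1          # nonlinear combining logic over three stages
        b = (state >> 5) & 1
        c = (state >> 11) & 1
        yield (a & b) ^ c

def crypt(data: bytes, seed: int) -> bytes:
    """XOR the data with the keystream; the same call decrypts."""
    bits = keystream(seed, 8 * len(data))
    out = bytearray()
    for byte in data:
        k = 0
        for _ in range(8):
            k = (k << 1) | next(bits)
        out.append(byte ^ k)
    return bytes(out)

ct = crypt(b"ATTACK AT DAWN", 0xACE1)
print(ct.hex(), crypt(ct, 0xACE1))
```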
Only one small laboratory, the Air Force Cambridge Research Center (military but out of the mainstream of cryptography), had begun looking at the
ancestors of the U.S. Data Encryption Standard and
many other systems while working on cryptographic
techniques for identification friend or foe, the technique by which a fire-control radar recognizes that an
incoming plane is friendly and should not be fired on.
The radar sends the plane a challenge; if the plane
decrypts the challenge, modifies it in an agreed-upon
way, and reencrypts it correctly, the radar tells the gun
to hold its fire.
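The logic of that exchange can be sketched as follows; the toy cipher and the choice of "add one" as the agreed-upon modification are assumptions for illustration only, not the cryptography of any real IFF system.

```python
# Sketch of the identification-friend-or-foe challenge-response exchange.
# The "cipher" is a deliberately toy keyed permutation (XOR plus rotation),
# and the agreed-upon modification is assumed to be "add 1 to the challenge".
import secrets

KEY = secrets.randbits(64)
MASK = (1 << 64) - 1

def toy_encrypt(block: int) -> int:
    x = (block ^ KEY) & MASK
    return ((x << 13) | (x >> 51)) & MASK      # placeholder permutation, NOT secure

def toy_decrypt(block: int) -> int:
    x = ((block >> 13) | (block << 51)) & MASK
    return (x ^ KEY) & MASK

def radar_challenge():
    nonce = secrets.randbits(64)
    return nonce, toy_encrypt(nonce)           # remember the nonce, send the ciphertext

def plane_respond(challenge_ct: int) -> int:
    nonce = toy_decrypt(challenge_ct)          # decrypt the challenge
    return toy_encrypt((nonce + 1) & MASK)     # modify in the agreed way, reencrypt

def radar_check(nonce: int, response_ct: int) -> bool:
    return toy_decrypt(response_ct) == (nonce + 1) & MASK

nonce, ct = radar_challenge()
assert radar_check(nonce, plane_respond(ct))   # a correct reply: hold fire
```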
The process of recognizing a signal by its correct
encryption is one to which the stream ciphers of communications are ill suited. Rather than a system in
which each bit of the message depends on one bit of
key, with which it is XORed, a system is needed in
which every bit of output depends on every bit of
input. Today we call such systems block ciphers or electronic code books.
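The distinction is easy to demonstrate: flipping one plaintext bit changes exactly one bit of a stream cipher's output but roughly half the bits of a block cipher's output. The sketch below assumes the third-party Python cryptography package and uses AES in electronic-code-book mode purely as a convenient modern example.

```python
# Contrast of the two styles: under a stream cipher each ciphertext bit depends
# on one plaintext bit; under a block cipher every output bit of a block depends
# on every input bit. Requires the third-party "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def bit_diff(a: bytes, b: bytes) -> int:
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

key = os.urandom(16)
keystream = os.urandom(16)                      # stand-in keystream
p1 = b"SIXTEEN BYTE MSG"
p2 = bytes([p1[0] ^ 0x01]) + p1[1:]             # flip one plaintext bit

# Stream cipher: XOR with keystream. One flipped input bit -> one flipped output bit.
s1 = bytes(x ^ k for x, k in zip(p1, keystream))
s2 = bytes(x ^ k for x, k in zip(p2, keystream))
print("stream cipher bits changed:", bit_diff(s1, s2))      # 1

# Block cipher in electronic-code-book mode: the whole block changes.
def aes_ecb(block: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    return enc.update(block) + enc.finalize()

print("block cipher bits changed:", bit_diff(aes_ecb(p1), aes_ecb(p2)))  # about 64 of 128
```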
Over the past 50 years, both computer security and
cryptography have made great strides, and CACM
has played an important role in the growth of each.
Computer security as we think of it today was the off-
spring of time sharing and multiprocessing. Once a
computer could run jobs on behalf of several users at
a time, guarding the computer room was no longer
sufficient. It was necessary to guarantee that each
individual process inside the computer was not spying
on another such process.
Time sharing was born in the early 1960s and by
the late 1960s was a major force. It was in use in computing laboratories around the world and offered commercially by service bureaus that, five years earlier, had
been running one program at a time for their customers who submitted decks of cards. The turn from
the 1960s to the 1970s marked the birth of both computer security and the modern era in cryptography.
Computer security came first. The introduction of
time sharing had been particularly disruptive in the
culture of military laboratories. Time sharing allowed
those doing unclassified work to move into a crude
approximation of the environment we enjoy today—
15-character-per-second Model 35 Teletypes, then
primitive cathode-ray tube screens rather than high-speed flat-screen displays—but interactive work
within one’s own office during normal working
hours. Those dependent on classified computing
found themselves ghettoized into working in the
computer area for a few hours in the evening after the
others had gone home. The result was a major program to produce far more secure computers. The formula was simple, starting with writing better code. As
we envisioned it then, this meant code mathematically proven to be correct. But as not all of one's code can be one's best code, less-trusted code had to be confined so it couldn't do any damage. These are problems on which much time has been expended but which still have no fully satisfactory solution.
Curiously, computer security in the late 20th century was rescued by another great development of
computer science—networking—particularly client-server computing. Networking brought forth the
need for cryptography, a subject kept secret from and
neglected by the computer science community at the time.
The 1970s saw the development of public-key
cryptography, a new approach to secure communication that surmounted a long-accepted obstacle to the broad use of cryptography: that to communicate securely, the parties must first share a secret. Public-key cryptography made a major improvement in key management—by eliminating most of the need to transport secret keys—and made digital signatures possible. Together, improved key management
and digital signatures fostered the growth of Internet
commerce in the 1990s. The appearance of public-key also sparked an explosion of public, business, and