Viewpoints
DOI: 10.1145/1400214.1400223
Privacy and security
A Multidimensional Problem
It’s not just science or engineering that will be needed to address
security concerns, but law, economics, anthropology, and more.
When those of us who are now editors of this magazine were in graduate school, it was easy to believe that, with the inevitable exception of automation, the social implications of computing technology could be ignored with impunity. Yes, before the public
Internet, there was discussion of social
impact—Joe Weizenbaum’s ELIZA, the
Department of Health, Education, and
Welfare report, Records, Computers,
and the Rights of Citizens,a the establishment in Europe and Canada of data
commissioners, the “I am a [person].
Please don’t bend, fold, spindle or mutilate me,” joke that made the rounds
in the 1970s,b the role of computers
in President Reagan’s Star Wars program—but it was easy to maintain the
fiction that the machines we built and
the code we wrote had as much social
impact as the designers of John Deere
tractors have on the migratory patterns
of cliff swallows: minimal and not really significant.
Tom Lehrer once sarcastically characterized a famous astronautics engineer: "'Once the rockets are up, who cares where they come down? That's not my department,' says Wernher von Braun."3 But while the view that scientists bear responsibility for the social impact of their work was perhaps radical when it was espoused by Joseph Rotblat (a nuclear physicist who later won a Nobel Peace Prize for his work on nuclear disarmament) in the decade after Hiroshima and Nagasaki, this expectation is no longer unusual. It is also no less true for technologists now than for scientists.

a This report, which recommended legislation supporting Fair Information Practices for automated data systems, was highly influential in both Europe and the United States; see http://aspe.hhs.gov/DATACNCL/1973privacy/tocprefacemembers.htm.
b This was a takeoff on IBM's instructions for the handling of punch cards.
This is part of the ACM code. The
original ACM Code of Ethics and Professional Conduct stated, “An ACM
member shall consider the health,
privacy and general welfare of the public in the performance of the member's work." It went on to say, "An
ACM member, when dealing with data
concerning individuals, shall always
consider the principle of individual
privacy and seek the following: To
minimize the data collection; To limit
authorized access to the data; To provide proper security for the data; To determine the required retention period
of the data; To ensure proper disposal
of the data.” (The current ACM code of
ethics contains a similar set of principles, though it omits the requirement
regarding proper disposal of data.)
But observing current computer privacy and security practices leads one to question whether this code is honored more in the breach than in the observance.
Each week brings yet another news story of a major security breach, of a cross-site scripting attack on the latest favorite mailer, of polymorphic virus code that changes its signature to evade detection. We aren't getting privacy and security right.
We aren’t even asking the right questions. A recent U.S. Department of Defense (DoD) effort to develop an Iraqi
ID database of bad guys is one such
example. The database includes not
just names, but biometric identifiers:
fingerprint records and iris scans; its purpose is to maintain records on the people who keep turning up in an area soon after an explosion has occurred.2
As any developer knows, of course, this
database will not be used only in this
way. One such likely use will be at checkpoints—and currently in Iraq, it can be