to undermine democracy. For example, various dirty-tricks efforts in Richard Nixon's House, Senate, and Presidential elections were a harbinger of the use of non-technological tactics. The Kerry Swift-boating attacks in 2004 should have been another warning sign. The 2016 election should really bring Citizens United, targeted disinformation, creative redistricting, and other issues to the forefront. Yet most of the computer scientists working in this area are still focused primarily on the computer systems, because the politicians have often been rather uninterested in the big picture or in the technology, apart from some recent concerns about Facebook, Cambridge Analytica, and related issues.
Redistricting and disenfranchisement
are also huge concerns.
As I write this, the Supreme Court
has just upheld Ohio’s law to remove
voters who are not voting “frequently
enough.” In addition, the Supremes
seem to be unable to cope with mathematical reasoning and sound logic.
Leah, as you yourself have suggested
to me quite incisively in a broader context, “if we can’t agree on parameters for
using or distributing a particular set of
tools, we cede it to malicious forces as a
matter of course.”
When we spoke previously, you were working on an effort to develop new systems that could be much more trustworthy. Can you give me an update?
Certainly. Our hardware-software design and development efforts based on our CHERI (Capability Hardware Enhanced RISC Instructions) instruction-set architecture (ISA) began in 2010, and will now continue into early 2021. We are also formally verifying that the ISA satisfies certain critical properties. This is joint work between SRI and the University of Cambridge. Our website (http://www.cl.cam.ac.uk/research/security/ctsrd/) includes the latest hardware ISA specification (along with several possible variant CHERI implementations, and ongoing tech transfer), as well as our published papers.
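The central idea behind CHERI is that a pointer carries its own bounds, so the hardware can check every access against them. The following is only a toy software model of that idea, with hypothetical names; it is not the actual CHERI ISA, in which tagged hardware capabilities perform these checks and a violation traps:

```c
#include <stddef.h>
#include <stdbool.h>

/* Illustrative software model of a CHERI-style capability: a pointer
   bundled with the bounds of the region it is allowed to reference.
   (Hypothetical sketch; real CHERI capabilities are tagged hardware
   values checked by the processor, not a C struct.) */
typedef struct {
    unsigned char *base;   /* start of the region this capability grants */
    size_t length;         /* size of the region in bytes */
    size_t offset;         /* current position within the region */
} capability;

/* A load succeeds only inside the capability's bounds; an out-of-bounds
   access is refused rather than silently reading adjacent memory. */
bool cap_load(const capability *c, unsigned char *out) {
    if (c->offset >= c->length)
        return false;      /* hardware CHERI would trap at this point */
    *out = c->base[c->offset];
    return true;
}
```

With a conventional C pointer, the out-of-bounds read would be undefined behavior; here it is simply rejected, which is the property the hardware enforces.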
Leah Hoffmann is a technology writer based in Piermont, NY, USA.
© 2018 ACM 0001-0782/18/12 $15.00
Before 2016, do you think computer scientists were guilty of focusing too closely on the technical security of voting systems, and not enough on the hackability of human behavior? Aside from the public opprobrium that Facebook and other social media outlets have since faced, do you think we are doing better to incorporate the total scope of threats to free and fair elections?
Given the enormous risks of direct-recording electronic (DRE) voting equipment with only proprietary software, proprietary data formats, and proprietary data during elections, and no meaningful audit trails or possibilities for remediating obviously fraudulent results, our initial efforts were urgently devoted to making the case for voter-verified paper trails that would be the ballot choices of record. For example, I testified in January 1995 for the New York City Board of Elections, and David Dill, Barbara Simons, and I spoke in multiple hearings in 2003 before the Santa Clara County (CA) supervisors, who were planning to acquire $24 million worth of paperless DREs.
Dan Boneh, David Dill, Doug Jones, Avi
Rubin, Dave Wagner, Dan Wallach, and
I participated for seven years beginning
in 2005 in an NSF (National Science
Foundation) collaborative effort called
ACCURATE: A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. (For more recent analysis,
see Broken Ballots: Will Your Vote Count?,
by Doug Jones and Barbara Simons;
http://www.timbergroves.com/bb/.)
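The argument for voter-verified paper trails is that the paper, not the software, becomes the record of record: electronic tallies can then be checked against a hand count of the paper ballots, and any discrepancy resolved in favor of the paper. A minimal sketch of that comparison step, with a hypothetical function and made-up tallies (not any real audit procedure):

```c
#include <stddef.h>
#include <stdbool.h>

/* Compare machine-reported totals against a hand count of the
   voter-verified paper ballots for the same precincts.  Returns true
   only if every precinct matches; on a mismatch, reports the first
   discrepant precinct so the paper records there can be examined.
   (Illustrative only: real post-election audits involve sampling,
   chain-of-custody controls, and statistical rigor.) */
bool tallies_match(const long *machine, const long *hand_count,
                   size_t precincts, size_t *first_mismatch) {
    for (size_t i = 0; i < precincts; i++) {
        if (machine[i] != hand_count[i]) {
            if (first_mismatch)
                *first_mismatch = i;
            return false;
        }
    }
    return true;
}
```

The point of the sketch is what a paperless DRE makes impossible: without an independent paper record, there is nothing to pass as the second argument.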
It was evident that unauditable DREs were a huge weak link. On the other hand, I have long maintained that essentially every step in the election process represents a potential weak link
mentors,
Roger Nash Baldwin, who founded the ACLU (American Civil Liberties Union) in 1920.
Altruism is perhaps the most important virtue we need to maintain, especially in times of adversity. "And that's the word: Altruism."
With today's fake news and myriad forms of disinformation, perhaps we need a word similar to altruism for the "unselfish concern for the truth," which we might pronounce as all-true-ism.
What do you make of the fallout from
the 2016 election?
As a civilization, we must work even harder to promote common sense, reality, the relevance of science and dependable engineering, and above all, the truth. All of these are fundamental to human rights, election integrity, freedom of speech, civil rights, and the preservation of democracy. The risks relating to technologies such as artificial intelligence, machine learning, the Internet of Things, and social media are generally not sufficiently well understood from the perspective of security, especially in the absence of trustworthy systems and trustworthy people, and in the presence of misanthropes with no respect for values, ethics, morals, or established knowledge.
Are you optimistic about the GDPR (the European Union's General Data Protection Regulation) and other privacy initiatives?
Many efforts relating to security and
privacy fall victim to the reality that our
computer systems are still inherently
untrustworthy, and easily attacked by
the Russians, Chinese, corrupt insiders, and potentially everyone else. Everything seems to be hackable, or otherwise adversely influenced, including
elections, automobiles, the Internet of
Things, cloud servers, and more. Essentially, the needs for better safety,
reliability, security, privacy, and system
integrity that I highlighted 24 years ago
in my book, Computer-Related Risks, are
still with us in one form or another today. If we do not have systems that are
sufficiently trustworthy, respecting privacy remains even more challenging.