search community (and Communications readers) certainly are not directly responsible for this state of affairs. However, as in many fields, “in order for evil to flourish it suffices that good men do nothing.” And flourish it certainly does: numbers whose quality is “below abysmal” get repeated by policymakers4 and trillion-dollar cybercrime numbers become the conventional wisdom. The answer to “How are we doing in security?” is, to quote Viega,7 “we have no clue.” FUD could not achieve this without the acquiescence of many. It is this aspect that we wish to address.
Why is there so much FUD? Reuter5
suggests that certain conditions favor the spread of fabulist claims and
“mythical numbers”: the presence of
a constituency interested in having
the numbers high and the absence
of a constituency interested either in
having them low or accurate. Drug
and crime statistics, for example, can
easily become mythical: enforcement
agencies have budgets that depend on
the numbers being high, but no group
is correspondingly interested in understating the numbers. The same
one-sided bias applies in our domain.
Unlike global warming or the dangers
of genetically modified foods, where
lobbies exist on both sides of the issue, many in security have the incentive to exaggerate dangers and few if
any gain by understating them or ensuring accuracy.
The upward incentives may be beyond our ability to change. However, we can be the constituency that demands accuracy. In our roles as authors, practitioners, and members of the research community we can put out the unwelcome mat for unsubstantiated claims. It is up to those of us in security research to avoid taking shortcuts in making the case for what we do. We can refuse to cite questionable reports, vague claims, or outlandish dollar estimates. As program committee members and reviewers we can ask our colleagues to aspire to a higher standard also. We do not accept sloppy papers, so citing dubious claims (which are simply pointers to sloppy work) should not be acceptable either. An impressive collection of rationalizations is available to excuse the use of unreliable information: data is difficult to get, cybercrime may be underreported (how would one measure that?), and we do face active adversaries. However, unless we resist these temptations we look like a community that responds to uncertainty by lowering standards.
The broader computer science community also has a vital role to play. To have influence, FUD needs to spread. To spread widely it needs the efforts of many people. If bad numbers circulate unchallenged, after a while they come to be accepted as good; unless someone objects, the more they are repeated the more they become part of the consensus view. In this way numbers with little basis in reality3,6 shape priorities and influence policy.1,4 Anyone can help halt that process by refusing to forward, quote, or tweet claims that do not seem to add up. This need not be difficult: FUD requires an unskeptical audience and does not fare well under scrutiny. Performing even basic sanity tests and asking “where did that number come from?” or “how would you measure that?” is often all it takes.
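A back-of-envelope sanity test of the kind described above can be made concrete. The sketch below, in Python, checks a hypothetical trillion-dollar annual cybercrime claim against rough world GDP and population figures; all the numbers are illustrative assumptions chosen for the example, not sourced data.

```python
# Back-of-envelope sanity check for a "$1 trillion per year" cybercrime claim.
# All figures below are rough, illustrative assumptions, not sourced data.

CLAIMED_COST = 1e12          # the headline claim, in US dollars per year
WORLD_GDP = 75e12            # rough world GDP, assumed for illustration
WORLD_POPULATION = 7e9       # rough world population, assumed

# What the claim implies if taken at face value.
share_of_gdp = CLAIMED_COST / WORLD_GDP
cost_per_person = CLAIMED_COST / WORLD_POPULATION

print(f"Claim implies {share_of_gdp:.1%} of world GDP lost to cybercrime")
print(f"Claim implies about ${cost_per_person:.0f} per person on Earth, per year")
```

Under these assumptions the claim implies losses above one percent of world GDP, for every person alive, every year; the point is not the exact figures but that a few lines of arithmetic are enough to decide whether a number deserves skepticism.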
Security has many difficulties, but the field has no problem that FUD does not make worse. It will not just go away, but we can help stop the spread. We can make it more expensive to spread FUD than good information by challenging FUD claims every time we encounter them.
1. Anderson, R. et al. Measuring the cost of cybercrime.
In Proceedings of WEIS 2012.
2. Bonneau, J. The science of guessing: Analyzing an
anonymized corpus of 70 million passwords. IEEE
Security and Privacy (Nov. 2012).
3. Florêncio, D. and Herley, C. Sex, lies and cyber-crime
surveys. In Proceedings of WEIS 2011.
4. Maas, P. and Rajagopalan, M. Does cybercrime really
cost $1 trillion? ProPublica (Aug. 1, 2012); http://www.
5. Reuter, P. The (continued) vitality of mythical
numbers. The Public Interest, 1987.
6. Shostack, A. and Stewart, A. The New School of Information Security. Addison-Wesley, 2008.
7. Viega, J. Ten years on, how are we doing? (Spoiler
alert: We have no clue). IEEE Security and Privacy
Dinei Florêncio ( firstname.lastname@example.org) is a researcher in
the Multimedia, Interaction and Communication group at
Microsoft Corporation, Redmond, WA.
Cormac Herley ( email@example.com) is a principal
researcher in the Machine Learning Department at
Microsoft Corporation, Redmond, WA.
Adam Shostack ( Adam.Shostack@microsoft.com) is a
principal program manager on Microsoft’s Trustworthy
Computing Team in Redmond, WA.
Copyright held by Authors.