Q: You have a reputation for taking
contrarian stands on issues. This
seems to result from your desire to
understand whether popular claims
stand on solid ground—and frequently they do not. A few years ago you challenged Metcalfe’s Law that the value
of a network grows with the square of
the number of nodes. What was your
challenge and what came of that?
A: The argument (developed in a
paper with Briscoe and Tilly) was that
Metcalfe's Law overestimates the
value of a network. We proposed that
a more accurate measure is usually
the product of the number of nodes
and the logarithm of that number, that is,
n log n rather than n². This
proposal has held up quite well, and it
leads to a more realistic view of the
size of network effects for new technologies, and therefore of the prospects of new ventures.
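The gap between the two valuations can be illustrated with a short calculation. This is only a sketch of the asymptotic growth rates; the constant factors are illustrative and not taken from the paper:

```python
import math

def metcalfe_value(n: int) -> float:
    """Metcalfe's Law: network value grows as n^2 (up to a constant)."""
    return float(n * n)

def odlyzko_value(n: int) -> float:
    """Briscoe-Odlyzko-Tilly proposal: value grows as n * log(n)."""
    return n * math.log(n)

# How much the n^2 estimate exceeds n log n as the network grows:
for n in (1_000, 10_000, 100_000):
    ratio = metcalfe_value(n) / odlyzko_value(n)
    print(f"n={n:>7}: n^2 / (n log n) = {ratio:,.0f}")
```

For n = 1,000 the n² estimate already exceeds n log n by a factor of roughly 145, and the gap widens as n grows, which is why the quadratic law tends to overstate the prospects of new ventures.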
More generally, contrary opinions
do help broaden people’s horizons and
prepare them for the inevitable surprises. In some cases, the dominant consensus is not just wrong, but leads to
substantial waste of time and resources. That is the case with the apocalyptic claims about cybersecurity. There
is much talk about the need for drastic
action and for reengineering our systems
from the ground up. But this talk is
not matched by actions. Technologists
overestimate their chances of making
big impacts with their radical proposals. There is a need for improved security technologies, but when we look at
decisions that are being made, we see
they implicitly assume that security is
important but not urgent. This is likely
to continue. I expect us to continue to
make good progress in staying ahead of
criminals and attackers without radical
changes in Internet and operating system architectures.
Andrew Odlyzko is a mathematician and a former head
of the University of Minnesota’s Digital Technology
Center and of the Minnesota Supercomputing Institute,
USA. He was previously a researcher and research
manager at Bell Labs and AT&T Labs, USA. His recent
works are available at his home page http://www.dtc.
Peter J. Denning (email@example.com) is Distinguished
Professor of Computer Science and Director of the
Cebrowski Institute for information innovation at the
Naval Postgraduate School in Monterey, CA, USA, is
Editor of ACM Ubiquity, and is a past president of ACM.
The author’s views expressed here are not necessarily
those of his employer or the U.S. federal government.
Copyright held by author.
As you mentioned, we are unable
to formally verify the giant operating
systems we most rely on. But we can
formally verify small systems, such as
those needed to run reliable backup
systems. Those are key to recovery,
and thus to resilience.
Q: You have said that some of the still-popular older technologies for security, such as firewalls, are less secure.
Can you say more?
A: Firewalls have been getting less
effective. One reason is that more and
more of the traffic is encrypted, and
thus increasingly difficult for firewalls
to classify. Another is that the entire
digital environment of the enterprise
has changed. Originally, firewalls were
a good way to protect trusted internal
systems from hostile penetration. Today the architecture of enterprises has
changed considerably. Their systems
are intertwined with those of suppliers,
partners, and customers, as well as with
devices owned by employees. Much
computation happens in the cloud, not
the local network. In this environment,
security professionals have less ability
to see and control what is happening.
There is no well-defined security perimeter for a firewall to protect.
In addition, far more of the attacks
rely on human engineering—for example, phishing, whaling, ransomware, frauds, deceptions, and social engineering. Firewalls cannot stop them.
On the other hand, firewalls continue to improve. They are far more
sophisticated than their early incarnations of two decades ago. They are
not about to disappear.