than in patching flaws in deployed
code that puts users at risk?
Another issue is whether the USG would be willing to disclose all vulnerabilities. Under current policy, software flaws it uncovers are generally to be disclosed to vendors so that they can be patched, but they can be kept secret and exploited when there is "a clear national security or law enforcement" need.10 At the same time, it seems
likely that many discovered vulnerabilities will never reach the USG, being
held for the purposes of exploitation
by criminals and foreign governments.
And many persons might simply oppose reporting them to the USG.
Finally, vulnerability disclosure has the downside of increasing the risks to those using the reported products, at least until they can acquire and install the necessary patches. Consider ShellShock, a flaw in the UNIX Bourne-again shell (Bash) that lets attackers remotely execute code or escalate privileges. Disclosure of the flaw allowed attackers to harvest vulnerable computers into a botnet that sent out over 100,000 phishing email messages. A Symantec study found that attacks exploiting particular zero-day vulnerabilities increased by as much as 100,000-fold after their disclosure.
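The ShellShock mechanism can be illustrated with the widely circulated probe string: a Bash function definition smuggled into an environment variable, followed by extra commands that a vulnerable Bash executes on startup. This is a diagnostic sketch, not an exploit; the exact output on a vulnerable system is noted only as a comment.

```shell
# ShellShock (CVE-2014-6271) probe: an environment variable whose value
# looks like an exported function definition, with trailing commands.
# A vulnerable Bash imports the "function" and also runs the trailing
# `echo vulnerable`; a patched Bash ignores it and runs only the -c command.
env x='() { :;}; echo vulnerable' bash -c 'echo probe complete'
```

On a patched shell this prints only "probe complete"; on a vulnerable one, "vulnerable" appears first, demonstrating remote code execution wherever attacker-controlled data (for example, CGI headers) reaches Bash environment variables.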
A better approach to reducing vulnerabilities would be to hold software companies liable for damages incurred by
cyber-attacks that exploit security flaws
in their code. Right now, companies
are not liable, protected by their licensing agreements. No other industry enjoys such dispensation. The manufacturers of automobiles, appliances, and
other products can be sued for faulty
products that lead to death and injury.
In Geekonomics, David Rice makes a strong case that industry incentives to produce secure software are inadequate under current market forces.

Even if the average price rose to
$100,000, the annual cost would still be reasonable at $800 million. However, the costs could become much higher and the problems worse if the program perversely incentivized the creation of bugs (for example, an inside developer colluding with an outside bounty collector).1 Costs could also rise from outrageous monetary demands or from the effects of more people looking for bugs in more products.
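As a back-of-the-envelope check, the $800 million figure corresponds to buying roughly 8,000 vulnerabilities a year at the quoted average price; the yearly volume is an inference from the two figures given, not a number stated in this excerpt.

```shell
# Sketch of the bounty-cost arithmetic quoted in the text.
# Assumption: the $800 million annual cost equals the average price
# times a fixed yearly volume of purchased vulnerabilities.
avg_price=100000          # dollars per vulnerability
annual_cost=800000000     # dollars per year
bugs_per_year=$((annual_cost / avg_price))
echo "$bugs_per_year vulnerabilities/year"   # 8000 vulnerabilities/year
```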
I especially worry that by shifting the cost from the private sector to the USG, companies would lose an economic incentive to produce more secure software in the first place. As it is, an empirical study by UC Berkeley researchers of the bug bounty programs offered by Google and Mozilla for their respective browsers, Chrome and Firefox, found the programs were economically efficient and cost-effective compared to hiring full-time security researchers to hunt down the flaws.
Would it not be better to shift the incentives so it was more economical to invest in secure software development