makers decide to issue some dictum
just because it seems like a good
idea. Then, the technology solution
providers readily agree, knowing full
well that the new policy will be totally
unenforceable. The policy becomes
nothing more than window-dressing
for the industry.
Data is money, and that goes to the
core of the browser-security debate.
Browser users do not fully appreciate
the value of their own data, but the
Facebooks and Googles of the world
certainly do. Introducing measures to
help users protect their data gets in
the way of milking that data for all it's
worth. That is a strong disincentive for
implementing strong browser privacy measures.
Adding stronger security also comes
with a trade-off—more security usually means less functionality. With loss
of functionality comes loss of market
share, which vendors fear more than anything else.
Only when users begin to see the value of their data and demand more protection for it will privacy measures get
their due. If the market shifts in this
direction and vendors see that adding better protection to their browsers
could actually increase market share,
then and only then will those measures
become standard operating practice.
GN-N: We talked a little earlier about how
it’s the browser users, rather than the
browsers themselves, that are the real
products here. Anyone care to expand?
RB: Well, that is the case, and it’s fundamental to this whole space. I would
argue that every last conundrum in the
area of browser security is rooted in the
fact that we are not dealing with a classic commercial model. That is, at present users don’t pay browser makers for
software or, for that matter, the maintenance and upkeep of that software.
JG: The browser makers are monetizing your data, directly or indirectly, and
therefore cannot see a way to protect
that data without losing money. That
makes for a really difficult situation.
BL: I’m not sure you can actually say
it’s the browser makers who are “monetizing
your data.” If anything, it’s the
sites that are monetizing your data.
JG: Actually, there is a clear interplay
there. Just look at Google Chrome; it’s
pretty obviously monetizing your data.
The Mozilla guys derive 98% of their
revenue directly from Google. Then
you’ve got Microsoft, which you could
argue is also desperate now to get into
the advertising business. So that raises the question: How can you work to
institute healthier business incentives
when those efforts are so obviously at
odds with the foundation the whole
business sits upon?
BL: I don’t know. One of the problems with privacy is that it is difficult
to put a value on it. It’s difficult even to
convince the users that their own privacy is actually worth all that much.
JG: Maybe users just aren’t all that
aware of what they’re giving up with every single mouse click.
BL: Right, but there are a few companies such as Allow (http://i-allow.com)
that will sign you up quite explicitly for $20 to $50 for each site you’re
willing to share your information
with. There also are various experiments under way to establish the value of each Facebook “Like,” for example. They are finding that, while some
users’ information is quite valuable,
there are many others whose information is largely useless.
RB: I think this question rides a bigger value wave where the age dynamic
comes into play. It’s hard to find anybody under the age of, say, 25 who really cares about privacy. My young nieces
happily tell me they have never felt like
they had any privacy to begin with, so
why should they start caring now?
GN-N: You have also got those people who lived through the 1960s and
1970s when the stories were rampant
about people having their data exposed by the government. There are
plenty of people that age who have
just become inured to privacy violations. They might have cared at one
point in their lives, but they’re over it now.
JG: Another aspect of this is that security and privacy have become conflated. For example, if you have decided you can trust Google with your data,
then the question is no longer about
privacy; it’s all about security. On the
other hand, if you don’t trust your provider, you can distinguish between security and privacy.
Once you cross that threshold and
decide to trust someone with your data,
you’re in kind of the same situation we
were talking about earlier with regard
to the CA model. That is, you’re essentially stuck with trusting them forever.
It’s not like you can take back your data
from Facebook and say, “Hey, you’re
not allowed to have that anymore.”
GN-N: Yeah, just try!
JG: You can get a copy of your
data—and, according to [WikiLeaks’]
Julian Assange, that can literally run
to 1,000 pages. But, guess what, I
don’t think they are going to delete it.
RB: Violations of our trust are already common occurrences even in the
holy of holies—namely, the healthcare
space, where you’d like to believe the
protection of personal data would be
considered sacrosanct. If people’s trust
isn’t being honored in that domain,
what hope can we hold out for more
faithful protection anywhere else?
JG: That’s why the fact that Do Not
Track is off by default really bothers
me. By the time users figure out what
it is they’ve given up, there’s no way to
undo the damage or to take back any
degree of control. As [computer security specialist] Bruce Schneier once
pointed out, there’s no delete button
in the cloud, or at least there’s no guarantee that, once you’ve pressed delete,
things are actually going to be deleted.
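For context, Do Not Track works by having the browser send a `DNT: 1` request header when the user opts out of tracking; honoring it is entirely up to the server. The sketch below shows how a server might check that preference. It is a minimal illustration, not code from any panelist: the function name `should_track` is hypothetical, though the `DNT` header itself and the case-insensitivity of HTTP header names are standard.

```python
# Hypothetical sketch: how a server-side handler might honor the
# Do Not Track (DNT) request header. "should_track" is an illustrative
# name, not part of any real framework's API.

def should_track(headers):
    """Return False when the client has sent DNT: 1 (user opted out)."""
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# A browser with Do Not Track enabled sends DNT: 1.
print(should_track({"DNT": "1"}))         # user opted out of tracking
# With no DNT header, the user has expressed no preference.
print(should_track({"User-Agent": "x"}))  # no preference expressed
```

The catch the panel raises remains: nothing forces a site to consult this header at all, and when the setting is off by default most users never send it.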
GN-N: Is there any cause for hope?
JG: I have a pretty good strategy for
protecting my own data—at least it’s
good enough to improve my level of
comfort. I think it’s an approach other
people could use. The challenge is that
it takes some behavioral discipline
and a bit of know-how, both of which
are lacking for most users. There has
also been little motivation for people
to work on cleaning up their acts since,
for the most part, they’re not even
aware of the issues we’ve been talking
about. Still, I’d say there is some reason
for hope in that there are steps you can
take to protect yourself.
GN-N: Do you see the browser vendors helping matters at all?
JG: No. To give you an example: since
I really don’t like the whole SSL model,
I’ve put SSL VPNs (virtual private networks) on the Amazon cloud so that, no
matter where I am, I can be encrypted
over a hostile or untrusted network