the specific legislative privacy principles of their region or sector, or the OECD Privacy Guidelines, as a starting point to determine privacy protection goals. In Europe, for example, the European Data Protection Directive 95/46/EC or its successor should be taken as the reference. It includes the following privacy goals:
˲ Safeguarding personal data quality through data avoidance, purpose-specific processing, and transparency
vis-à-vis data subjects.
˲ Ensuring the legitimacy of personal and sensitive data processing.
˲ Complying with data subjects’
right to be informed, to object to the
processing of their data, and to access,
correct, and erase personal data.
˲ Ensuring confidentiality and security of personal data.
Security and privacy in this view are
clearly distinguished. Security means
the confidentiality, integrity, and availability of personal data are ensured.
From a data protection perspective, security is one of several means to ensure
privacy. Good PbD is unthinkable
without a good Security by Design plan.
The two approaches are in a “positive
sum” relationship. That said, privacy
is about the scarcity of personal data
creation and the maximization of individuals’ control over their personal
data. As a result, some worry that PbD
could undermine law enforcement
techniques that use criminals’ data
traces to find and convict them. More
research and international agreement
in areas such as anonymity revocation
are certainly needed to demonstrate
that this need not be the case, even
with privacy-friendly systems.
After privacy goals are clearly defined, we must identify how to reach
them. The PIA Framework mentioned
earlier is built on the assumption that
a PbD methodology could largely resemble security risk assessment processes such as those described by NIST or ISO/IEC 27005.
These risk assessment processes identify potential threats to each protection goal. These threats and their probabilities together constitute a respective privacy
risk. All threats are then systematically
mitigated by technical or governance
controls. Where this cannot be done,
remaining risks are documented and accepted as residual risks.
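The threat-to-risk mapping described above can be sketched as a minimal risk register. This is an illustrative sketch only: the threat names, likelihood and impact scores, and controls below are hypothetical assumptions, not values taken from NIST or ISO/IEC 27005.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PrivacyThreat:
    """One threat against a privacy protection goal."""
    name: str
    goal: str            # e.g. "confidentiality", "data quality"
    likelihood: float    # estimated probability, 0.0-1.0
    impact: int          # 1 (low) to 5 (severe)
    control: Optional[str] = None  # mitigating control, if any

    @property
    def risk(self) -> float:
        # A simple likelihood-times-impact score.
        return self.likelihood * self.impact

# Hypothetical threats and controls for illustration.
threats = [
    PrivacyThreat("linkable identifiers", "data avoidance", 0.6, 4,
                  control="pseudonymization"),
    PrivacyThreat("unencrypted transport", "confidentiality", 0.3, 5,
                  control="TLS everywhere"),
    PrivacyThreat("stale consent records", "legitimacy", 0.4, 3),
]

# Threats without a control remain as documented residual risks.
residual = [t for t in threats if t.control is None]
for t in residual:
    print(f"residual risk: {t.name} (score {t.risk:.1f})")
```

In a real assessment the scores would come from the organization's own threat analysis; the point here is only the flow from threat, to risk, to control or documented residual risk.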
As in security engineering, PbD
controls heavily rely on systems’ architecture.2 Privacy scholars still
put too much focus on information
practices only (such as Web site pri-
vacy policies). Instead, they should
further investigate how to build sys-
tems in client-centric ways that maxi-
mize user control and minimize net-
work or service provider involvement.
Where such privacy-friendly architec-
tures are not feasible (often for busi-
ness reasons), designers can support
PbD by using technically enforceable
default policies (“opt-out” settings)
or data scarcity policies (erasure or
granularity policies), data portabil-
ity, and user access and delete rights.
Where such technical defaults are not
feasible, concise, accurate, and easy-
to-understand notices of data-han-
dling practices and contact points for
user control and redress should come into play.
For privacy to be embedded in the sys-
tem development life cycle and hence
in organizational processes, compa-
nies must be ready to embrace the do-
main. Unfortunately, we still have too
little knowledge about the real dam-
age that is being done to brands and
a company’s reputation when privacy
breaches occur. The stock market sees
some negligible short-term dips, but
people flock to data-intensive services
(such as social networks); so far, they
do not sanction companies for pri-
vacy breaches. So why invest in PbD
measures? Will there be any tangible
benefits from PbD that justifies the in-
vestment? Would people perhaps be
willing to pay for advertisement-free,
privacy-friendly services? Will they in-
cur switching costs and move to com-
petitive services that are more privacy
friendly? Would the 83% of U.S. con-
sumers who claim that they would stop
doing business with a company that
breaches their privacy really do so? We
need to better understand these dy-
namics as well as the current changes
in the social perception of what we re-
gard as private.
1. Cavoukian, A. Privacy by Design Curriculum 2.0, 2011.
2. Spiekermann, S. and Cranor, L.F. Engineering privacy.
IEEE Transactions on Software Engineering 35, 1
(Jan./Feb. 2009), 67–82.
Sarah Spiekermann (firstname.lastname@example.org) is the head of
the Institute for Management Information Systems at
the Vienna University of Economics and Business, Vienna, Austria.