Those who abuse data entrusted to them must expect to be held accountable.
Facebook/Cambridge Analytica was
not the first example of abuse, nor will
it be the last. The FTC’s privacy protection is evidently not working very well.
Maybe the time has come for comprehensive privacy legislation focused on
aligning corporate incentives so their
products provide the privacy people
expect and deserve. The California law
might be a step in this direction.
A society where individuals are
willing to share data for social benefit
must make individuals confident that
shared data are unlikely to be abused
and that abusers can be identified
and made accountable.
a Research into the weaknesses of anonymization or de-identification schemes is needed to understand the limitations of these techniques. Like research that exposes security weaknesses in systems, it must respect the concerns of those whose data is being studied.
1. Assembly Bill 375, California Consumer Privacy Act of 2018;
2. Barlyn, S. Strap on the Fitbit: John Hancock to sell only interactive life insurance. Reuters (Sept. 19, 2018);
3. Carpenter v. U.S. 16-402. Decided June 22, 2018;
4. Confessore, N. Audit approved of Facebook policies,
even after Cambridge Analytica leak. The New York
Times (Apr. 19, 2018); https://nyti.ms/2vBniFI
5. Department of Commerce, NTIA, RIN 0660–XC043.
Developing the administration’s approach to consumer
privacy. Federal Register 83,187 (Sept. 26, 2018);
6. Erlich, Y. et al. Identity inference of genomic data using long-range familial searches. Science (Oct. 11, 2018); https://bit.ly/2CadGTP
7. Hempel, J. A short history of Facebook’s privacy gaffes. WIRED (Mar. 30, 2018); https://bit.ly/2GjTPVD
8. Murphy, H. How an unlikely family history website
transformed cold case investigations. The New York
Times (Oct. 15, 2018); https://nyti.ms/2EnGHhE
9. Murphy, H. Most white Americans’ DNA can be
identified through genealogy databases. The New York
Times (Oct. 11, 2018); https://nyti.ms/2pRFhBX
10. NIST Privacy Framework Fact Sheet, Sept. 2018;
11. NIST Framework for Improving Critical Infrastructure Cybersecurity, Version 1.1 (Apr. 16, 2018); https://nvlpubs.nist.gov/nistpubs/CSWP/NIST.
12. Official Journal of the European Union. General Data Protection Regulation. 4.5.2016. (English version);
13. Public Law 93-579. Privacy Act of 1974. (Dec. 31, 1974);
14. Vengattil, M. and Paresh, D. Facebook now says data
breach affected 29 million users, details impact.
Reuters (Oct. 12, 2018); https://reut.rs/2CGewZz
15. Wakabayashi, D. Google Plus will be shut down after user information exposed. The New York Times (Oct. 8, 2018);
Carl Landwehr (email@example.com) is Lead Research Scientist at the Cyber Security Policy and Research Institute (CSPRI) at George Washington University in Washington, D.C., and Visiting McDevitt Professor of Computer Science at Le Moyne College in Syracuse, NY.
Copyright held by author.
for some baseline of measures seems
warranted, even essential.
And Congress, for the first time in
years, is showing some interest in
drafting comprehensive privacy legislation. This may become a hot topic
for the 116th U.S. Congress if public interest continues to be strong.
Returning to the Facebook/Cambridge Analytica incident: it is of immediate importance to those in the computing profession, particularly those conducting research. A researcher with academic connections
gained permission from Facebook
to put up an app to collect data for
research purposes in 2014. This app
collected data from some Facebook
users who consented to the collection, but also from millions of others
without their knowledge or consent.
This collection would now violate
Facebook’s policies, but it was not a
violation at the time. The researcher
provided this data to Cambridge Analytica, presumably in violation of
Facebook’s policies. Cambridge Analytica exploited the data for commercial purposes.
The primary issue here is accountability. This was either a violation of
the academic’s agreement with Facebook, or evidence that the agreements
were insufficient to meet Facebook’s
2011 consent decree with the Federal Trade Commission (FTC). The
privacy of millions of people was violated and the reputation of legitimate
academic researchers was tarnished.
Facebook apparently had little incentive to hold the researcher and Cambridge Analytica to account. Aware
of what happened over a year before
the disclosure, Facebook belatedly issued yet another in a long history of apologies.
The FTC and the Securities and Exchange Commission (SEC) are investigating this incident. The SEC could
find Facebook liable for failing to inform its shareholders of the incident
when discovered. The FTC could find
Facebook violated the terms of their
2011 consent agreement by failing to
protect their customers’ data in accordance with the consent decree.
A court could make Facebook pay fines large enough to give it sufficient incentive to enforce the correct privacy policies on researchers and those commercial entities that use Facebook data. The U.K. has already levied a fine of £500,000, the largest its legislation allows, but this is unlikely to provide much incentive to a company whose 2017 net income was over $15 billion. The GDPR permits penalties of up to 4% of global revenues, which for Facebook would be well over $1 billion, but the incident occurred before the GDPR took effect. The threat of future fines should give Facebook incentive to prevent recurrence.
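The gap between these fine levels can be made concrete with a back-of-the-envelope check. In the sketch below, the £500,000 fine, the 4% GDPR ceiling, and the roughly $15 billion net income come from the text; the approximate $40 billion 2017 global revenue and the $1.30-per-pound exchange rate are illustrative assumptions, not figures from this column.

```python
# Back-of-the-envelope comparison of privacy fine magnitudes.
# Assumed figures (not from the column): ~$40B 2017 revenue, $1.30/GBP.

GBP_TO_USD = 1.30                    # assumed exchange rate
uk_fine_usd = 500_000 * GBP_TO_USD   # U.K. statutory maximum fine, in dollars

net_income_2017 = 15e9               # "over $15 billion" net income (from the column)
revenue_2017 = 40e9                  # approximate global revenue (assumption)

gdpr_cap = 0.04 * revenue_2017       # GDPR ceiling: 4% of global revenue

# The U.K. maximum is a vanishingly small share of one year's net income,
# while the GDPR ceiling is on the order of a billion dollars.
print(f"U.K. fine as share of net income: {uk_fine_usd / net_income_2017:.4%}")
print(f"GDPR ceiling: ${gdpr_cap / 1e9:.1f} billion")
```

Under these assumptions the U.K. maximum works out to a few thousandths of one percent of annual net income, while the GDPR ceiling is roughly a tenth of it, which is the order of magnitude at which fines might begin to change incentives.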
Fines levied by the FTC go into the U.S. Treasury. Facebook’s users took the risks and are suffering the consequences. Should they be compensated? A penny or dime for each user whose privacy was violated might not be the answer. Perhaps more progress would come from financing investigative journalism or other controls, but that might not be within the scope of actions regulatory agencies can take. Imagination might be required to help Facebook hold its clients to account in ways that compensate Facebook users.
Computing professionals involved
in “big data” research should pay attention if they wish to gain access to datasets containing or derived from personal information. They must abide by
agreements made with dataset providers and remember that exposing data
improperly damages public trust in
research. Accidental or intentional release of personal data provided for research purposes to anyone else, even if
aggregated and anonymizeda attracts
public attention. Researchers who abuse data entrusted to them must expect to be held accountable.