should not only have the authority to
remove false or inaccurate content,
they should also be charged with the
responsibility for doing so.
A book-review scenario provides
good examples of this test. In it, a seven-year-old girl has posted a negative
book review that turns out to have been
written by her father. Who has the authority to remove it? The hypotheticals we spelled out in our surveys tested removal by six different entities: the commercial host of the review website (
Amazon); two potential content owners (the
father or the girl); the book’s author
(Maurice Sendak); Sendak’s publicist;
and an Amazon customer who learned
the review was posted under false circumstances. Who do participants believe should be given prevailing authority to remove the content, given a good
reason to do so?
Study participants endorsed the
idea of granting Amazon, the commercial host of the reviews, the responsibility of removing dubious content
(89%, or 180/203, positive responses).
The father or the girl (now a teen), as
content owners, were secondarily given the authority to remove the review
(77% and 79% positive, respectively).
In spite of the question of self-interest, 51% of the participants thought
Sendak (the author) should be able to
remove the apparently fraudulent review. Only 24% thought a knowledgeable customer (aware of the circumstances under which the review was
written) should normally be allowed
to remove the review. Unsurprisingly,
the least popular option was to allow
the publicist, who was clearly acting in
her client’s self-interest, to remove the
review; 83% thought this should not
be allowed in normal circumstances.
Figure 3 compares the action taken by
different stakeholders.
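As a quick sanity check on how these figures are reported, the one endorsement rate given with raw counts (Amazon's, 180 of 203 positive responses) can be recomputed; rounding to the nearest whole percent reproduces the reported 89%. This is a minimal sketch; the helper name is our own, and only the Amazon count appears in the text.

```python
# Recompute a reported endorsement rate from raw survey counts.
# Only the Amazon figure (180 of 203 positive responses) is given
# as a raw count in the text; the other entities are reported as
# percentages only.

def endorsement_rate(positive: int, total: int) -> int:
    """Return the endorsement rate as a whole-number percentage."""
    return round(100 * positive / total)

print(endorsement_rate(180, 203))  # prints 89, matching the reported 89%
```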
Removal is generally the most
controversial of the three actions. Al-
though social media users want to be
able to groom their own online self-
presentation, there is a concomitant
expectation that others should not re-
move content willy-nilly. The surprise
in the study participants’ responses to
these hypotheticals was the degree of
authority vested in the commercial
service providers. The norm is to see commercial service providers as content guardians when such an action is warranted.

In addition to worrying about loss, participants also seek control over
their digital identities. An important
reason participants do not want the Li-
brary of Congress to maintain social
media collections is that their old, un-
revised, public selves may be on display
in such collections; ordinary citizens
wish to control the retrospective and
future versions, not just the current ver-
sion, of themselves. Media types more
closely associated with one’s sense of
self—Facebook profiles, photos, tweets,
gaming data, even reviews—provoke a
stronger anti-collection response. For
media types more aligned with personal
privacy—photos, tweets, gaming data—
survey participants reported that limit-
ing access to researchers—even without
a definition of what constitutes a re-
searcher—mitigates anticipated harm;
for media types strongly aligned with
one’s identity, the preferred mitigation
strategy is to place the collection under
a long-term embargo restricting access
to the collection for a specified period.
Figure 2b and Figure 2c compare reactions to retaining a collection of gaming
data as opposed to a collection of online
book reviews. Gaming data provokes
a strong anti-collection response if everyone is given immediate access, mitigated somewhat by limiting access to
researchers, and even more by putting
the collection under a 50-year embargo.
On the other hand, the harm associated
with collecting online book reviews is
better mitigated by limiting access to
the collection to researchers, and less by
the proposed 50-year embargo.
This finding returns the discussion
to the issue of intent; to garner popular support, reuse for the public good
must be weighed against individuals’
ability to develop and maintain a sense
of digital identity. Likewise, notions
like veracity, which normally do not enter into the legal calculus of fair use, do
play a part in defining social norms.
Figure 2 summarizes important reuse concepts as they give rise to social norms, providing contrasting responses to pairs or triples of hypotheticals.
Removing Social Media
Removing social media content by any-
one besides the person who posted it
is the most speculative of the three ac-
tions we have investigated. Our surveys
refer to this action as “removal” rather
than “deletion” because it is intended
to be nondestructive. Removal targets
the copy in a particular place or context,
not the content itself. Through their
responses to open-ended questions
about content removal, participants revealed they usually remove material for
three curatorial reasons: as a personal
information management task (such as
“cleaning up” one’s account); in service
of online identity management (such
as untagging an unflattering photo of
oneself); or to reflect one’s changing
understanding of privacy or some other
aspect of online life (such as removing
one’s birthday from a profile).
Most social media services do not
allow users to remove content created
or posted by someone else. We thus
derived hypotheticals from what par-
ticipants in other studies mentioned
they wanted to do. 7, 19 Our surveys’ re-
moval hypotheticals primarily tested
three variants: changes in the remover's relationship to the content (such as Should social media users be able to remove published content according to their own self-interest?); circumstances in which a neutral non-owner can remove material (such as Should non-owners be able to remove published content they believe is demonstrably wrong?); and situations in which removal requires requesting and receiving permission.
Participants generally did not sup-
port the idea of removing someone
else’s content in one’s own self-inter-
est, regardless of whether other miti-
gating circumstances were introduced.
What about the case of fraud detect-
ed by a neutral non-owner? Wikipedia
has inured its users to the idea that con-
tent will be reviewed and removed if it
does not pass the acid test. The results
of our studies have revealed that cer-
tain entities are imbued with sufficient
authority to support this hypothetical
type of removal. As a social norm, ve-
racity is apparently balanced with the
first factor—self-interest. If self-inter-
est is involved, a content non-owner is
not entrusted with the public welfare.
Our study participants often felt that a custodial relationship to the content (such as that of a website owner, service provider, or podcast producer) should confer the authority to remove content. In fact, the common expectation is that commercial service providers