Oakmont, moderating user content increased the service's potential legal liability for any harmful content it missed. Accordingly, services had to moderate user-submitted content perfectly or accept liability for any mistaken decisions. Alternatively, it might be legally wiser for Internet services to passively host user content—like CompuServe's passive distribution of Rumorville—than to do any moderation at all. Following Cubby and Stratton Oakmont, the Internet community was not sure which approach was better.

The Online Pornography Overreaction.
In 1995, sensational (and largely overblown) stories reported that children could easily access pornography online. Congress responded to this panic with a new crime that would send online service operators to jail if they allowed children to access pornography.

Two Congressmen, Reps. Cox and Wyden, envisioned a different approach. Instead of banning online pornography, they thought online services would voluntarily curb user-submitted pornography—if the services did not face the Moderator's Dilemma created by the Stratton Oakmont decision. Thus, Cox and Wyden proposed shielding online services from liability for third-party content, with the hope that online services would feel legally secure enough to perform content moderation duties that benefit everyone. That proposal became Section 230.

Though the criminal liability provisions and Section 230's immunity were intended as alternatives, Congress combined them into a single law called the Communications Decency Act ("CDA"). In 1997, the U.S. Supreme Court struck down the CDA's criminal provisions, leaving Section 230 in place.

What Section 230 Does
Section 230 gives technologists enormous freedom to design and implement user-generated content services. As a federal appeals court explained in 2016 while ruling in favor of Section 230's immunity: "[the plaintiff's] claims challenge features that are part and parcel of the overall design and operation of the website … Features such as these, which reflect choices about what content can appear on the website and in what form, are editorial choices that fall within the purview of traditional publisher functions."

This legal standard facilitates innovation in several ways.

First, services may freely experiment with new ways of gathering, sorting, and presenting user-generated content. Under different liability rules, those experiments would expose the services to liability for any harmful content they missed, discouraging experimentation and innovation. For example, plaintiffs have argued that user-generated content sites should face liability for the different ways they algorithmically promote or excerpt user content. For now, Section 230 forecloses those arguments.

Second, Section 230 helps innovative services launch without having been perfected, so services can error-correct and fine-tune their technology in response to actual usage. For example, new online services can launch without replicating Google's $100M+ investment in filtering technology or hiring