Section 230: An Overview

Congressional Research Service
January 4, 2024

Valerie C. Brannon, Legislative Attorney
Eric N. Holmes, Attorney-Adviser (Constitution Annotated)

Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996, provides limited federal immunity to providers and users of interactive computer services. The statute generally precludes providers and users from being held liable—that is, legally responsible—for information provided by another person, but does not prevent them from being held legally responsible for information that they have developed or for activities unrelated to third-party content. Courts have interpreted Section 230 to foreclose a wide variety of lawsuits and to preempt laws that would make providers and users liable for third-party content. For example, the law has been applied to protect online service providers like social media companies from lawsuits based on their decisions to transmit or take down user-generated content.

Two provisions of Section 230 are the primary framework for this immunity. First, Section 230(c)(1) specifies that service
providers and users may not “be treated as the publisher or speaker of any information provided by another information
content provider.” In Zeran v. America Online, Inc., an influential case interpreting this provision, a federal appeals court said
that Section 230(c)(1) bars “lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional
editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.” Second, Section 230(c)(2)
states that service providers and users may not be held liable for voluntarily acting in good faith to restrict access to “obscene,
lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable” material. Section 230(c)(2) is thus more
limited: it applies only to good-faith takedowns of objectionable material, while courts have interpreted Section 230(c)(1) to
apply to both distribution and takedown decisions.
Section 230 contains statutory exceptions. This federal immunity generally will not apply to suits brought under federal
criminal law, intellectual property law, any state law “consistent” with Section 230, certain privacy laws applicable to
electronic communications, or certain federal and state laws relating to sex trafficking.
Government officials and outside commentators have debated the proper scope of Section 230. While the law has a number
of defenders, others have argued that courts have interpreted Section 230 immunity too broadly. Recent Congresses have
seen a number of bills that would have amended the scope of Section 230 immunity. These proposals ranged from outright
repeal, to placing certain conditions on immunity, to creating narrower exceptions allowing certain types of lawsuits. Some
bills sought to amend the scope of Section 230(c)(1), limiting “publisher” immunity in an attempt to encourage sites to take
down certain types of undesirable content. Others sought to encourage sites to host more content by narrowing immunity for
certain types of takedown decisions.
Proposals to amend Section 230 may raise two distinct types of First Amendment issues. The first issue is whether any given
proposal infringes the constitutionally protected speech of either providers or users. This concern may be especially acute if a
proposal restricts providers’ editorial discretion or creates content- or viewpoint-based distinctions. The second issue is
whether, if Section 230 is repealed in whole or in part, the First Amendment may nonetheless prevent private parties or the
government from holding providers liable for publishing content. The First Amendment might prevent some claims premised
on decisions to host or restrict others’ speech, but its protections are likely less extensive than the current scope of Section
230 immunity.

Contents
Text and Legislative History
    Section 104: Online Family Empowerment
    Stratton Oakmont, Inc. v. Prodigy Services Co.
Judicial Interpretation
    Section 230(c)(1): Publisher Activity
        Early Interpretations: Zeran v. America Online, Inc.
        Service Provider Role as Publisher
        Information Provided by Another Information Content Provider
        Algorithmic Sorting and Promotion
    Section 230(c)(2)(A): Restricting Access to Objectionable Material
        Good Faith
        Objectionable Material
    Section 230(c)(2)(B): Enabling Access Restriction
    Section 230(e): Exceptions
        Federal Criminal Law
        Intellectual Property Law
        State Law
        Electronic Communications Privacy Act of 1986
        Sex Trafficking Law (FOSTA)
Reform Proposals and Considerations for Congress
    Overview of Reform Proposals and Select Legal Considerations
        Liability for Hosting Content
        Liability for Restricting Content
    Free Speech Considerations
        Background Principles
        First Amendment Issues with Reform Proposals
        Comparing the Operation of First Amendment and Section 230 Protections

Contacts
    Author Information

In 1996, Congress passed a suite of measures to amend the Communications Act of 1934 in order to protect children on the internet. The new measures were known collectively as the Communications Decency Act (CDA).1 Some portions of the CDA directly imposed liability
for transmitting obscene or harassing material online,2 including two provisions that the Supreme
Court struck down as unconstitutional in 1997.3 The CDA’s new Section 230 of the
Communications Act4 took a different approach.5 It sought to allow users and providers of
“interactive computer services” to make their own content moderation decisions, while still
permitting liability in certain limited contexts.6
Since its passage, federal courts have interpreted Section 230 as creating expansive immunity from
claims based on third-party content that appears online.7 Consequently, internet companies and
users frequently rely on Section 230’s protections to avoid liability in federal and state litigation.
But in recent years, commentators and jurists have expressed concern that the broad immunity
courts have recognized under Section 230 is beyond the law’s intended scope.8
This report explores the origins, current application, and future of Section 230. It first discusses
the history and passage of Section 230 and the CDA. The report then analyzes how courts have
applied Section 230 in litigation. The report concludes with a discussion of proposed reforms to
Section 230 and legal considerations relevant to reform efforts.
This report focuses on Section 230 protections from liability and does not more broadly address
the potential liability other laws may impose for hosting or restricting others’ content.9 This report
also does not discuss the possible international trade implications of amending Section 230.10
Text and Legislative History
Congress enacted the CDA as part of the Telecommunications Act of 1996.11 According to the
conference report, the CDA as a whole was intended to “modernize the existing protections

1 Pub. L. No. 104-104, Tit. V, 110 Stat. 133 (1996).
2 E.g., 47 U.S.C. § 223(d).
3 Reno v. ACLU, 521 U.S. 844, 882 (1997).
4 47 U.S.C. § 230. Although Section 230 is sometimes referred to as “Section 230 of the CDA” or “CDA Section 230,”
“Section 230” more accurately refers to the statute’s place in the Communications Act.
5 141 CONG. REC. H8470 (daily ed. Aug. 4, 1995) (statement of Rep. Ron Wyden) (noting that the approach of Section
230 stands “in sharp contrast to the work of the other body,” which sought “to try to put in place the Government rather
than the private sector about this task of trying to define indecent communications and protecting our kids”).
6 See 47 U.S.C. § 230(b) (expressing a deregulatory policy goal); id. § 230(e) (providing limited exceptions).
7 See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 330–31 (4th Cir. 1997).
8 See, e.g., Force v. Facebook, Inc., 934 F.3d 53, 84 (2d Cir. 2019) (Katzmann, J., concurring in part) (opining that
Section 230 as applied creates “extensive immunity . . . for activities that were undreamt of in 1996” and “[i]t therefore
may be time for Congress to reconsider the scope of § 230”); Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC,
141 S. Ct. 13, 14–15 (2020) (Thomas, J., statement respecting the denial of certiorari) (positing that the “modest
understanding” of what Section 230 is meant to do based on its text “is a far cry from what has prevailed in court”); 1
R. SMOLLA, LAW OF DEFAMATION § 4.86 (2d ed. 2019) (“[C]ourts have extended the immunity in § 230 far beyond
anything that plausibly could have been intended by Congress.”).
9 See, e.g., Twitter, Inc. v. Taamneh, 143 S. Ct. 1206 (2023) (holding that social media platform was not liable for
claims brought under the Anti-Terrorism Act irrespective of whether the platform was eligible for protection under
Section 230). Whether a plaintiff has stated a legally actionable claim will depend on the particular claim alleged and
the facts present in each case—issues that are outside the scope of this report.
10 The legal aspects of this issue are discussed briefly in CRS Legal Sidebar LSB10484, UPDATE: Section 230 and the Executive Order on Preventing Online Censorship, by Valerie C. Brannon et al.
11 Pub. L. No. 104-104, § 501, 110 Stat. 133–43 (1996).

against obscene, lewd, indecent or harassing uses of a telephone.”12 Since its enactment in 1996,13
Section 230 has been amended twice: once to add a new obligation for interactive computer
services to notify customers about parental control protections,14 and once to create an exception
for certain civil and criminal cases involving prostitution or sex trafficking.15
Section 230 contains findings16 and policy statements,17 expressing, among other things, that
Congress sought to promote the free development of the internet, while also “remov[ing]
disincentives” to implement “blocking and filtering technologies” that restrict “children’s access
to . . . inappropriate online material”18 and “ensur[ing] vigorous enforcement of Federal criminal
laws to deter and punish trafficking in obscenity, stalking, and harassment” online.19 The heart of
Section 230, however, is arguably the immunity created in subsection (c):
(c) PROTECTION FOR “GOOD SAMARITAN” BLOCKING AND SCREENING OF OFFENSIVE
MATERIAL.—
(1) TREATMENT OF PUBLISHER OR SPEAKER.—No provider or user of an interactive
computer service shall be treated as the publisher or speaker of any information provided
by another information content provider.
(2) CIVIL LIABILITY.—No provider or user of an interactive computer service shall be
held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability
of material that the provider or user considers to be obscene, lewd, lascivious, filthy,
excessively violent, harassing, or otherwise objectionable, whether or not such
material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers
or others the technical means to restrict access to material described in [subparagraph
(A)].20
Thus, Section 230(c) contains two distinct provisions that together create a broad immunity from
suit for a “provider or user of an interactive computer service.” Section 230(c)(1) specifies that
service providers may not “be treated as the publisher or speaker of any information provided by

12 S. REP. NO. 104-23, at 59 (1995); see also id. (“The decency provisions increase the penalties for obscene, indecent,
harassing or other wrongful uses of telecommunications facilities; protect privacy; protect families from uninvited and
unwanted cable programming which is unsuitable for children and give cable operators authority to refuse to transmit
programs or portions of programs on public or leased access channels which contain obscenity, indecency, or nudity.”).
The Supreme Court struck down some of these provisions as unconstitutional in Reno v. ACLU, 521 U.S. 844, 882
(1997).
13 Pub. L. No. 104-104, § 509, 110 Stat. 137–39 (1996).
14 Pub. L. No. 105-277, § 1404, 112 Stat. 2681-739 (1998). This 1998 law also amended 47 U.S.C. § 230(e)(1) to
clarify that Section 230 should not be construed to impair the enforcement of 47 U.S.C. § 231, a new provision created
by the 1998 law. Id.
15 Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (FOSTA), Pub. L. No. 115-164, § 4, 132 Stat.
1253 (2018). FOSTA also created criminal and civil liability for owning, managing, or operating an interactive
computer service “with the intent to promote or facilitate the prostitution of another person . . . .” Id. § 3.
16 47 U.S.C. § 230(a).
17 Id. § 230(b).
18 Id. § 230(b)(4).
19 Id. § 230(b)(5).
20 Id. § 230(c). Courts have read 47 U.S.C. § 230(c)(2)(B)’s reference to “paragraph (1)” to mean § 230(c)(2)(A). E.g.,
Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1173 n.5 (9th Cir. 2009) (“We take it that the reference to the
‘material described in paragraph (1)’ is a typographical error, and that instead the reference should be to paragraph (A),
i.e., § 230(c)(2)(A). . . . Paragraph (1) pertains to the treatment of a publisher or speaker and has nothing to do with
‘material,’ whereas subparagraph (A) pertains to and describes material.”) (citation omitted).

another information content provider,”21 while Section 230(c)(2) ensures that service providers
may not be held liable for voluntarily acting to restrict access to objectionable material.22
Both “interactive computer service” and “information content provider” are statutorily defined
terms.23 An “interactive computer service” is “any information service, system, or access software
provider that provides or enables computer access by multiple users to a computer server.”24 In a
computer network, a server is generally the hardware or software that provides a service, such as
transmitting information, to another piece of hardware or software called the client. Courts have
accordingly interpreted “interactive computer service” broadly.25 They have considered online
service providers such as Google,26 Facebook,27 Amazon,28 and Craigslist29 to be “interactive
computer service” providers.30 Given the breadth of this definition, courts have also concluded
that it extends to companies that provide broadband internet access31 or web hosting.32 Most
litigation has focused on online service providers, but the definition can include services
providing access to private servers33 and brick-and-mortar entities such as libraries34 or
employers35 who provide computer access.36
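
To make the client-server relationship just described more concrete, the following minimal sketch shows one server program transmitting user-submitted postings to any number of connecting clients. It is a hypothetical illustration written for this discussion only; the names (BulletinBoardHandler, user_posts) are invented and are not drawn from the report or any cited case.

    # A minimal, hypothetical sketch of a client-server service, for
    # illustration only: one server process serving many connecting users.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class BulletinBoardHandler(BaseHTTPRequestHandler):
        # Third-party content: information the service transmits but did
        # not itself create or develop.
        user_posts = ["Post by user A", "Post by user B"]

        def do_GET(self):
            # The server's role: transmit stored information to whichever
            # client (user) connects and requests it.
            body = "\n".join(self.user_posts).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Multiple users may connect to this one server, which is the sense
        # in which a service "provides or enables computer access by
        # multiple users to a computer server."
        HTTPServer(("localhost", 8000), BulletinBoardHandler).serve_forever()

Under the definitions discussed in this section, the operator of even so simple a service would arguably be a provider of an “interactive computer service,” while the users who wrote the posts would be “information content providers.”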

21 47 U.S.C. § 230(c)(1).
22 Id. § 230(c)(2).
23 Id. § 230(f). For more information on “online platforms” more generally, see CRS Report R47662, Defining and Regulating Online Platforms, coordinated by Clare Y. Cho.
24 47 U.S.C. § 230(f)(2).
25 See, e.g., Ricci v. Teamsters Union Local 456, 781 F.3d 25, 27–28 (2d Cir. 2015) (observing that the definition of
interactive computer service “has been construed broadly to effectuate the statute’s speech-protective purpose”);
Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1123 (9th Cir. 2003) (observing that reviewing courts have
“adopt[ed] a relatively expansive definition of ‘interactive computer service’”); IAN C. BALLON, 4 E-COMMERCE &
INTERNET LAW 37.05[2] (2020 update) (“[A]lmost any networked computer service would qualify as an interactive
computer service, as would an access software provider.”).
26 E.g., Marshall’s Locksmith Serv. v. Google, LLC, 925 F.3d 1263, 1268 (D.C. Cir. 2019).
27 E.g., Klayman v. Zuckerberg, 753 F.3d 1354, 1357 (D.C. Cir. 2014).
28 E.g., Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139 (4th Cir. 2019).
29 Chi. Lawyers’ Comm. for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666, 671 (7th Cir. 2008).
30 See also Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 419 (1st Cir. 2007) (“Providing access to the
Internet is . . . not the only way to be an interactive computer service provider.”).
31 See e360Insight, LLC v. Comcast Corp., 546 F. Supp. 2d 605, 607 (N.D. Ill. 2008); see also, e.g., Winter v. Bassett,
No. 1:02CV00382, 2003 WL 27382038, at *6 (M.D.N.C. Aug. 22, 2003) (concluding Section 230 protects Verizon and
AT&T as interactive computer service providers).
32 Ricci v. Teamsters Union Local 456, 781 F.3d 25, 28 (2d Cir. 2015); see also, e.g., Gucci Am., Inc. v. Hall &
Assocs., 135 F. Supp. 2d 409, 412 (S.D.N.Y. 2001) (describing Mindspring, a web hosting service, as an “interactive
computer service”).
33 Cf., e.g., Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1175 (9th Cir. 2009) (rejecting argument that definition
includes only services that enable “people to access the Internet or access content found on the Internet”); In re Zoom
Video Commc’ns Privacy Litig., 525 F. Supp. 3d 1017, 1030 (N.D. Cal. 2021) (ruling the definition “does not
recognize a public/private distinction”).
34 The statute specifically provides that the definition includes “such systems operated or services offered by libraries
or educational institutions.” 47 U.S.C. § 230(f)(2). See, e.g., Kathleen R. v. City of Livermore, 104 Cal. Rptr. 2d 772,
777 (Cal. Ct. App. 2001) (“Respondent provides an ‘interactive computer service’ in this case because its library
computers enable multiple users to access the Internet.”).
35 E.g., Miller v. Fed. Express Corp., 6 N.E.3d 1006, 1017 (Ind. Ct. App. 2014).
36 Section 230 applies to both providers and users of interactive computer services. Some courts have opined that
website operators are themselves users of interactive computer services (such as internet access service) and therefore
are entitled to Section 230’s protection regardless of whether the website in question provides an interactive computer
service. See, e.g., Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003).

An “information content provider” is “any person or entity that is responsible, in whole or in part,
for the creation or development of information provided through the Internet or any other
interactive computer service.”37 Thus, Section 230 distinguishes those who create content from
those who provide access to that content, providing immunity from suit to the latter group.38 An
entity may be both an “interactive computer service” provider and an “information content
provider,” but the critical inquiry for applying Section 230’s immunity provisions is whether the
service provider developed the content that is the basis for liability.39
Section 230(e) contains “exceptions” to the law’s immunity provision:40
(e) EFFECT ON OTHER LAWS.—
(1) NO EFFECT ON CRIMINAL LAW.—Nothing in this section shall be construed to
impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity)
or 110 (relating to sexual exploitation of children) of title 18, United States Code, or any
other Federal criminal statute.
(2) NO EFFECT ON INTELLECTUAL PROPERTY LAW.—Nothing in this section shall be
construed to limit or expand any law pertaining to intellectual property.
(3) STATE LAW.—Nothing in this section shall be construed to prevent any State from
enforcing any State law that is consistent with this section. No cause of action may be
brought and no liability may be imposed under any State or local law that is inconsistent
with this section.41
(4) NO EFFECT ON COMMUNICATIONS PRIVACY LAW.—Nothing in this section shall
be construed to limit the application of the Electronic Communications Privacy Act of 1986
or any of the amendments made by such Act, or any similar State law.
(5) NO EFFECT ON SEX TRAFFICKING LAW.—Nothing in this section (other than
subsection (c)(2)(A)) shall be construed to impair or limit:
(A) any claim in a civil action brought under section 1595 of Title 18, if the
conduct underlying the claim constitutes a violation of section 1591 of that title;
(B) any charge in a criminal prosecution brought under State law if the conduct
underlying the charge would constitute a violation of section 1591 of Title 18; or
(C) any charge in a criminal prosecution brought under State law if the conduct
underlying the charge would constitute a violation of section 2421A of Title 18, and
promotion or facilitation of prostitution is illegal in the jurisdiction where the
defendant's promotion or facilitation of prostitution was targeted.42
Courts have interpreted the language providing that Section 230 will not “limit” or “impair the
enforcement of” other laws as creating “exceptions” to Section 230.43 As one court reasoned, if
intellectual property laws would impose liability on a provider, then applying Section 230 to bar
that lawsuit “would ‘limit’ the laws pertaining to intellectual property in contravention of

37 47 U.S.C. § 230(f)(3).
38 See id. § 230(c), (f).
39 See, e.g., Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1174 (9th Cir. 2008) (en banc).
40 E.g., Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 418 (1st Cir. 2007) (“[Plaintiff] has attempted to
plead around that immunity . . . by asserting causes of action that purportedly fall into one of the statutory exceptions to
Section 230 immunity.” (emphasis added)).
41 In contrast to the exceptions created by most of subsection (e), courts have read the second sentence of Section
230(e)(3) to “preempt contrary state law.” E.g., Doe v. GTE Corp., 347 F.3d 655, 658 (7th Cir. 2003).
42 47 U.S.C. § 230(e).
43 See, e.g., Universal Commc’n Sys., Inc., 478 F.3d at 418.

§ 230(e)(2).”44 Accordingly, Section 230 immunity generally will not apply to suits brought under
federal criminal law,45 intellectual property law,46 any state law “consistent” with Section 230,47
certain electronic communications privacy laws,48 or certain federal and state laws relating to sex
trafficking.49
Section 104: Online Family Empowerment
Representatives Cox and Wyden offered the provision that would become Section 230 as Section
104 of House Bill 1555,50 an amendment to the House version of the CDA titled “Online Family
Empowerment.”51 Representative Cox stated that Section 104 would serve two purposes:
First, it will protect computer Good Samaritans, online service providers, anyone who
provides a front end to the Internet, let us say, who takes steps to screen indecency and
offensive material for their customers. It will protect them from taking on liability . . . .
Second, it will establish as the policy of the United States that we do not wish to have
content regulation by the Federal Government of what is on the Internet . . . .52
Many of those who spoke in favor of this amendment on the floor of the House argued that it
would allow private parties, in the form of parents and internet service providers, to regulate
offensive content, rather than the Federal Communications Commission (FCC).53 In particular, then-Representative Wyden emphasized
that “parents and families are better suited to guard the portals of cyberspace and protect our

44 Gucci Am., Inc. v. Hall & Assocs., 135 F. Supp. 2d 409, 413 (S.D.N.Y. 2001).
45 47 U.S.C. § 230(e)(1).
46 Id. § 230(e)(2). As discussed in more detail below, courts have disagreed about whether this exception includes only
federal laws, or state laws as well. Infra “Intellectual Property Law.”
47 47 U.S.C. § 230(e)(3).
48 Id. § 230(e)(4).
49 Id. § 230(e)(5).
50 H.R. 1555, 104th Cong. (1995).
51 See 141 CONG. REC. H8468 (daily ed. Aug. 4, 1995).
52 See id. (statement of Rep. Christopher Cox). See also, e.g., Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1122
(9th Cir. 2003) (“Congress enacted this provision as part of the Communications Decency Act of 1996 for two basic
policy reasons: to promote the free exchange of information and ideas over the Internet and to encourage voluntary
monitoring for offensive or obscene material.”); Zeran v. Am. Online, Inc., 129 F.3d 327, 330–31 (4th Cir. 1997)
(“Section 230 was enacted, in part, to maintain the robust nature of Internet communication and, accordingly, to keep
government interference in the medium to a minimum. . . . Another important purpose of § 230 was to encourage
service providers to self-regulate the dissemination of offensive material over their services.”).
53 See 141 CONG. REC. H8470 (daily ed. Aug. 4, 1995) (statement of Rep. Christopher Cox) (“[W]e do not wish to have
a Federal Computer Commission with an army of bureaucrats regulating the Internet because frankly the Internet has
grown up to be what it is without that kind of help from the Government.”); id. at H8470 (statement of Rep. Joe
Barton) (arguing this amendment provides “a reasonable way to . . . help [service providers] self-regulate . . . without
penalty of law”); id. at H8471 (statement of Rep. Rick White) (arguing the responsibility for “protect[ing children]
from the wrong influences on the Internet” should lie with parents instead of federal government); id. at H8471
(statement of Rep. Zoe Lofgren) (arguing that amendment should be adopted to “preserve . . . open systems on the
Net”); id. at H8471 (statement of Rep. Bob Goodlatte) (“The Cox-Wyden amendment is a thoughtful approach to keep
smut off the net without government censorship.”). Some have questioned whether the text of the amendment, in fact,
prevented the federal government from regulating the Internet. See Robert Cannon, The Legislative History of Senator Exon’s Communications Decency Act: Regulating Barbarians on the Information Superhighway, 49 FED. COMM. L.J. 51, 68 (1996) (“The opposition [to the Senate version of the CDA] proclaimed that the Cox/Wyden Amendment
forbade FCC regulation of the Internet; it did not. The opposition claimed that it preempted state regulation of the
Internet; it did not.”) (citations omitted).

children than our Government bureaucrats,” and argued against federal censorship of the
Internet.54
The conference report echoed these concerns:
This section provides “Good Samaritan” protections from civil liability for providers or
users of an interactive computer service for actions to restrict or to enable restriction of
access to objectionable online material. One of the specific purposes of this section is to
overrule Stratton-Oakmont v. Prodigy and any other similar decisions which have treated
such providers and users as publishers or speakers of content that is not their own because
they have restricted access to objectionable material. The conferees believe that such
decisions create serious obstacles to the important federal policy of empowering parents to
determine the content of communications their children receive through interactive
computer services.55
As originally introduced and passed by the House, Section 104 also contained a provision stating that the CDA should not be construed “to grant any jurisdiction or authority” to the FCC to regulate the Internet.56 However, this language was
removed during the conference committee on the bill.57
Stratton Oakmont, Inc. v. Prodigy Services Co.
As observed on the floor of the House58 and in the conference report,59 the amendment that would
become Section 230 sought to overturn the result in Stratton Oakmont, Inc. v. Prodigy Services Co., a 1995 New York state trial court decision.60 The plaintiff in that case had sued Prodigy for
libel—that is, defamation in written form.61 Although Prodigy, an internet service provider,62 had
not itself made the allegedly libelous statements, the plaintiff alleged that Prodigy was legally
responsible for publishing those statements because it hosted the message boards on which the
statements were posted.63 Prodigy’s liability depended on a determination that the company was a
“publisher,” because under ordinary principles of defamation law, a publisher like a newspaper
“who repeats or otherwise republishes a libel is subject to liability as if he had originally

54 141 CONG. REC. H8470 (daily ed. Aug. 4, 1995) (statement of Rep. Ron Wyden).
55 S. REP. NO. 104-230, at 194 (1996).
56 See H.R. REP. NO. 104-223, at 29 (1995); 141 CONG. REC. H8469 (daily ed. Aug. 4, 1995); 141 CONG. REC. H9988
(daily ed. Oct. 12, 1995).
57 See S. REP. NO. 104-230, at 86–87 (1996). For more information on conference committees, see CRS Report 98-696,
Resolving Legislative Differences in Congress: Conference Committees and Amendments Between the Houses, by
Elizabeth Rybicki.
58 141 CONG. REC. H8469–70 (daily ed. Aug. 4, 1995) (statement of Rep. Christopher Cox).
59 S. REP. NO. 104-230, at 194 (1996).
60 Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995). In
contrast, Representative Cox noted approvingly a federal trial court decision holding that CompuServe could not be
held liable for allegedly defamatory statements that were posted on an internet forum over which it exercised no
editorial control. 141 CONG. REC. H8469 (daily ed. Aug. 4, 1995) (statement of Rep. Christopher Cox); Cubby, Inc. v.
CompuServe, Inc., 776 F. Supp. 135, 140 (S.D.N.Y. 1991).
61 Stratton Oakmont, Inc., 1995 WL 323710, at *1.
62 Prodigy was “a consumer-oriented online service” that allowed users to “trade emails, participate in online message
board discussions, read the daily news, shop for mail-order items, check the weather, stocks, sports scores, play games,
and more.” Benj Edwards, Where Online Services Go When They Die, THE ATLANTIC (July 12, 2014),
https://www.theatlantic.com/technology/archive/2014/07/where-online-services-go-when-they-die/374099. “It was
very much like a microcosm of the modern Internet—if the entire World Wide Web was published by a single
company.” Id.
63 See Stratton Oakmont, Inc., 1995 WL 323710, at *2.

published it.”64 By contrast, speech “distributors” such as libraries or newsstands may be held
liable for circulating publications that contain defamatory statements only if they know or have
reason to know of the defamatory statements.65 A 1991 decision from a federal trial court, Cubby v. CompuServe, Inc., applied this notice-based distributor liability to another early internet service
provider, CompuServe, that the court determined was sufficiently similar to a newsstand.66
The plaintiffs in Stratton Oakmont argued that Prodigy should be considered a publisher rather
than a distributor because it “held itself out as an online service that exercised editorial control
over the content of messages posted on its computer bulletin boards.”67 Prodigy argued in
response that it was more like a bookstore or newsstand than a newspaper, citing Cubby and
claiming that it did not exercise “sufficient editorial control over its computer bulletin boards to
render it a publisher” of the allegedly unlawful material.68 Prodigy pointed out that it did not—
and could not—manually review “all messages prior to posting” them.69
The court concluded that Prodigy was a publisher of the alleged libel because it controlled the
content of its message boards through an “automatic software screening program” and “Board
Leaders” who removed messages that violated Prodigy’s guidelines.70 The court held that “[b]y
actively utilizing technology and manpower to delete notes from its computer bulletin boards on
the basis of offensiveness and ‘bad taste,’ for example, [Prodigy] is clearly making decisions as to
content . . . , and such decisions constitute editorial control.”71 The court emphasized that it was
Prodigy’s “conscious choice” to exercise editorial control, implemented through “policies,
technology and staffing decisions,” that had “opened it up to a greater liability.”72
One of the sponsors of Section 104 argued on the floor of the House that the ruling against
Prodigy was “backward.”73 Representative Cox argued that Congress should be encouraging
internet service providers “like Prodigy, like CompuServe, like America Online, like the new
Microsoft network, to do everything possible for us, the customer, to help us control, at the
portals of our computer, at the front door of our house, what comes in and what our children
see.”74 It was to this end, Representative Cox contended, that Section 104 sought to protect
“computer Good Samaritans,” protecting them “from taking on liability such as occurred in the

64 Id. at *3.
65 Id.
66 Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 140–41 (S.D.N.Y. 1991). See id. at 140 (“A computerized
database is the functional equivalent of a more traditional news vendor, and the inconsistent application of a lower
standard of liability to an electronic news distributor such as CompuServe than that which is applied to a public library,
book store, or newsstand would impose an undue burden on the free flow of information.”).
67 Stratton Oakmont, Inc., 1995 WL 323710, at *2.
68 Id. at *3.
69 Id.
70 Id. at *4.
71 Id. (citation omitted).
72 Id. at *5. Cf. Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 140 (S.D.N.Y. 1991) (“[A third party] uploads the
text of Rumorville into CompuServe’s data banks and makes it available to approved . . . subscribers [to CompuServe’s
publishing service] instantaneously. CompuServe has no more editorial control over such a publication than does a
public library, book store, or newsstand, and it would be no more feasible for CompuServe to examine every
publication it carries for potentially defamatory statements than it would be for any other distributor to do so.”); id. at
140–41 (holding CompuServe could not be held liable unless “it knew or had reason to know of the allegedly
defamatory Rumorville statements”).
73 141 CONG. REC. H8470 (daily ed. Aug. 4, 1995) (statement of Rep. Christopher Cox).
74 Id. See also id. at H8471 (statement of Rep. Ron Wyden) (“Under our approach and the speed at which these
technologies are advancing, the marketplace is going to give parents the tools they need . . . .”).

Prodigy case in New York that they should not face for helping us and for helping us solve this
problem.”75 Ultimately, Section 104 made it into the CDA, largely unchanged, as Section 230.76
Judicial Interpretation
Courts have interpreted Section 230 as creating broad immunity that allows the early dismissal of
many legal claims against interactive computer service providers,77 preempting lawsuits and
statutes that would impose liability based on third-party content.78 Courts have generally
interpreted Section 230(c)’s two separate provisions as creating two distinct liability shields.
Section 230(c)(1) states that interactive computer service providers and users may not “be treated
as the publisher or speaker of any information provided by another” person.79 Section 230(c)(2)
provides that interactive computer service providers and users may not be “held liable” for any
voluntary, “good faith” action “to restrict access to or availability of material that the provider or
user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise
objectionable.”80 One conception of these two provisions is that Section 230(c)(1) applies to
claims for content that is “left up,” while Section 230(c)(2) applies to claims for content that is
“taken down.”81 In practice, however, courts have also applied Section 230(c)(1) to “take down”
claims, and courts sometimes collapse Section 230’s two provisions into a single liability shield
or do not distinguish between the two provisions.82 A defendant’s chosen statutory basis for
immunity under Section 230 is consequential: Section 230(c)(2) includes a good faith
requirement absent from Section 230(c)(1), while Section 230(c)(1) is limited to claims based on
another’s content.83
Section 230’s provisions apply to users and providers of “interactive computer services,” a
defined term discussed above.84 Under this definition, courts have recognized that a website
operated by a print or broadcast media provider may be an interactive computer service.85 Thus, a
“traditional” media outlet could receive protection under Section 230 for material posted on its

75 Id. at H8470 (statement of Rep. Christopher Cox).
76 See S. REP. NO. 104-230, at 86–87 (1996).
77 But see G.G. v. Salesforce.com, Inc., 76 F.4th 544, 566 (7th Cir. 2023) (saying Section 230(c)(1) does not create
“immunity” but functions as an affirmative defense).
78 See, e.g., David S. Ardia, Free Speech Savior or Shield for Scoundrels: An Empirical Study of Intermediary
Immunity under Section 230 of the Communications Decency Act
, 43 LOY. L.A. L. REV. 373, 438–39 (2010) (reporting
that almost all unreversed federal decisions involving invocations of Section 230 between Section 230’s passage and
September 30, 2009, happened at the motion to dismiss or summary judgment stage).
79 47 U.S.C. § 230(c)(1).
80 Id. § 230(c)(2).
81 E.g., Doe v. GTE Corp., 347 F.3d 655, 659 (7th Cir. 2003); cf. Malwarebytes, Inc. v. Enigma Software Grp. USA,
LLC, 141 S. Ct. 13, 15 (2020) (Thomas, J., statement respecting the denial of certiorari) (articulating this view of
Section 230 before positing that “[t]his modest understanding is a far cry from what has prevailed in court”).
82 E.g., Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1103 (9th Cir. 2009) (saying that imposing liability for removing
content would treat a party as “a publisher” under Section 230(c)(1)); Malwarebytes, 141 S. Ct. at 17 (Thomas, J.,
statement respecting the denial of certiorari) (collecting cases).
83 Although Section 230(c)(1) refers to content created by “another information content provider,” there is no judicial
agreement about whether Section 230(c)(1) applies when a plaintiff’s own content is at issue—in other words, courts
are divided as to whether a plaintiff itself may be “another information content provider” under Section 230(c)(1). For
more discussion of this issue, see infra note 166.
84 47 U.S.C. § 230(c); see supra “Text and Legislative History.”
85 See, e.g., Straw v. Streamwood Chamber of Commerce, Inc., No. 1-14-3094, 2015 IL App (1st) 143094-U, at *8 (Ill.
App. Ct. Sept. 29, 2015) (applying Section 230 to a letter to the editor published on a newspaper’s website).

website while facing a different standard for material it prints or broadcasts.86 That said, courts
may deny Section 230’s protections without determining whether a party claiming its protections
is a provider or user of an interactive computer service, as detailed below.87
Section 230(c)(1): Publisher Activity
Section 230(c)(1) states that a provider or user of an interactive computer service will not be
considered a publisher or speaker of content “provided by another information content
provider.”88 Courts asked to apply Section 230(c)(1) to dismiss legal claims therefore ask three
questions89:
1. Is the defendant a provider or user of an interactive computer service?90
2. Does the plaintiff seek to hold the defendant liable as a publisher or speaker?
3. Does the plaintiff’s claim arise from information provided by another
information content provider?
If the answer to any of these questions is “no,” Section 230(c)(1) will not bar liability.
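
Because a “no” answer to any one question defeats the defense, the inquiry is conjunctive. The short sketch below is an illustrative restatement of that structure only, not a legal tool; the function and parameter names are invented for this example.

    # Illustrative only: a schematic restatement of the three-part inquiry.
    def section_230_c1_bars_claim(is_ics_provider_or_user: bool,
                                  treats_defendant_as_publisher: bool,
                                  content_from_another_provider: bool) -> bool:
        # All three questions must be answered "yes"; a "no" on any one
        # means Section 230(c)(1) will not bar liability.
        return (is_ics_provider_or_user
                and treats_defendant_as_publisher
                and content_from_another_provider)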
As discussed above, courts have construed the definition of “interactive computer service”
broadly.91 Cases thus often turn on the answers to the other two questions, which depend on the
legal claims’ specific facts: an entity may act as an information content provider for certain
content, but still be entitled to protection under Section 230(c)(1) for other content.92 This section
will first summarize Section 230(c)(1) case law before probing specific judicial interpretations of
when a service provider is acting as a publisher of another’s information or an information
content provider.
Early Interpretations: Zeran v. America Online, Inc.
While the legislative history of Section 230 reflects, among other things, an intent to overturn the
result in Stratton Oakmont, as discussed above,93 courts have applied Section 230(c)(1) broadly to
cover other circumstances. The first federal court of appeals decision to examine the scope of

86 Cf. Blumenthal v. Drudge, 992 F. Supp. 44, 49 (D.D.C. 1998) (“Congress decided not to treat providers of interactive
computer services like other information providers such as newspapers, magazines or television and radio stations[.]”).
87 See, e.g., FTC v. Leadclick Media, LLC, 838 F.3d 158, 176 (2d Cir. 2016) (ruling that claims were based on
information developed by defendant); FTC v. Accusearch, Inc., 570 F.3d 1187, 1197–98 (10th Cir. 2009) (reaching the
same conclusion and leaving the question of whether defendant is an interactive computer service “to another day”).
88 47 U.S.C. § 230(c)(1).
89 See, e.g., Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 418 (1st Cir. 2007); Jones v. Dirty World
Entm’t Recordings LLC, 755 F.3d 398, 409 (6th Cir. 2014).
90 Although many cases involving Section 230(c)(1) are brought against providers of interactive computer services,
Section 230(c)(1) also provides protection to users of interactive computer services. See, e.g., Barrett v. Rosenthal, 146
P.3d 510, 526–27 (Cal. 2006) (applying Section 230(c)(1) to an individual who posted a third-party article on a
message board); see also Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003) (opining that a website’s operator is a
“user” of interactive computer services, such as internet access service, and is therefore entitled to protection under
Section 230(c)(1)).
91 See supra “Text and Legislative History.”
92 See, e.g., Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1162 (9th Cir. 2008) (en banc) (observing
that a website may avoid liability under Section 230(c)(1) for “passively display[ing] content that is created by third
parties,” but such website could be subject to liability for “content that it creates itself”).
93 See supra Stratton Oakmont, Inc. v. Prodigy Services Co.

Section 230(c)(1) was the Fourth Circuit’s 1997 decision in Zeran v. America Online, Inc.,94 a
case with several differences from Stratton Oakmont. Since its publication, other courts of
appeals have largely adopted Zeran’s reasoning and broadly construed Section 230(c)(1),95
although some more recent cases have signaled a potential retreat from Zeran.96
In Zeran, an unidentified user on an America Online (AOL) bulletin board posted an
advertisement for T-shirts featuring slogans celebrating the bombing of the Alfred P. Murrah
Federal Building in Oklahoma City.97 The user invited AOL subscribers interested in purchasing
these shirts to call the plaintiff, Kenneth Zeran, at his home phone number and “ask for Ken”
upon calling.98 Despite this invitation, Zeran did not post the ad himself, nor did he direct anyone
to post the ad on his behalf.99 Zeran received harassing and threatening calls, and consequently he
contacted AOL and asked the company to remove the ad.100 An AOL employee assured Zeran that
AOL would take down the ad, but after AOL removed the ad, a similar ad took its place.101 Zeran
brought negligence claims against AOL on the theory that once Zeran notified AOL of the ads,
AOL had a duty to remove the ads, notify users that the ads were deceptive, and screen for similar
postings.102
Zeran premised his claim against AOL on a theory of “distributor” liability.103 At common law, as
discussed above,104 vendors and distributors of defamatory publications are liable for the content
of those publications if they know or have reason to know of the illegal or tortious content.105
Central to Zeran’s theory was the notion that, although Section 230(c)(1) prohibited the court
from holding AOL liable as a “publisher” of the defamatory statements, as the court treated
Prodigy in Stratton Oakmont,106 it did not eliminate notice-based distributor liability. In support
of this argument, Zeran noted that Section 230 specifically uses the term “publisher.”107

94 Zeran v. Am. Online, Inc., 129 F.3d 327 (4th Cir. 1997). For purposes of brevity, references to a particular circuit in
this report (e.g., the Fourth Circuit) refer to the U.S. Court of Appeals for that particular circuit (e.g., the U.S. Court of
Appeals for the Fourth Circuit).
95 See Ben Ezra, Weinstein, & Co., Inc. v. Am. Online, Inc., 206 F.3d 980 (10th Cir. 2000); Green v. Am. Online
(AOL), 318 F.3d 465 (3d Cir. 2003); Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003); Universal Commc’n Sys., Inc. v.
Lycos, Inc., 478 F.3d 413 (1st Cir. 2007); Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008); Johnson v. Arden, 614
F.3d 785 (8th Cir. 2010); Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014); Jones v. Dirty World Entm’t
Recordings LLC, 755 F.3d 398 (6th Cir. 2014); Ricci v. Teamsters Union Local 456, 781 F.3d 25 (2d Cir. 2015); see
also
Almeida v. Amazon.com, Inc., 456 F.3d 1316 (11th Cir. 2006) (recognizing agreement among other courts of
appeals but reaching a decision on other grounds); cf. Chi. Lawyers’ Comm. for Civil Rights Under Law, Inc. v.
Craigslist, Inc., 519 F.3d 666 (7th Cir. 2008) (partially rejecting the reasoning in Zeran but nonetheless finding that
Section 230 barred Fair Housing Act claims against online service provider).
96 See, e.g., Henderson v. Source for Pub. Data, L.P., 53 F.4th 110, 121–22 (4th Cir. 2022) (discussing the definition of
“publisher”).
97 Zeran, 129 F.3d at 329.
98 Id.
99 Zeran v. Am. Online, Inc., 958 F. Supp. 1124, 1126 (E.D. Va. 1997).
100 Zeran, 129 F.3d at 329.
101 Id.
102 Id. at 330.
103 Though Zeran characterized his claims as stemming from America Online’s negligence, the Fourth Circuit noted
that the claims were “indistinguishable from a garden variety defamation action.” Id. at 332.
104 See supra Stratton Oakmont, Inc. v. Prodigy Services Co.
105 See Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 139–40 (S.D.N.Y. 1991). This limitation on distributor
liability is rooted in the First Amendment. Id. (citing Smith v. California, 361 U.S. 147, 152–53 (1959)).
106 Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710, at *4 (N.Y. Sup. Ct. May 24, 1995).
107 See Zeran, 129 F.3d at 331–32.

The Fourth Circuit rejected this argument. Writing for a unanimous panel, Chief Judge Wilkinson
posited that “distributor” liability depends on a distributor’s publication of tortious material, and a
distributor is therefore a publisher.108 On this reasoning, both at common law and in Section 230, the use of the term “publisher” includes original publishers as well as
distributors.109 The court suggested that subjecting a computer service provider to liability based
on the provider’s knowledge would “reinforce[] service providers’ incentives to restrict speech
and abstain from self-regulation” and “deter service providers from regulating the dissemination
of offensive material over their own services.”110 Chief Judge Wilkinson therefore concluded that
reading Section 230(c)(1) to leave notice-based distributor liability intact would conflict with
Section 230’s purposes.111
As discussed below, Zeran has informed the approach of a vast number of courts interpreting
Section 230(c)(1). As one commentator has noted, “the rule of Zeran [barring distributor liability]
has been uniformly applied by every federal circuit court to consider it and by numerous state
courts.”112 Even so, some jurists have expressed skepticism about the Fourth Circuit’s approach.
In a statement written to accompany a denial of certiorari in a Section 230 case, U.S. Supreme
Court Justice Clarence Thomas suggested, contrary to the holding in Zeran, that Section
230(c)(1) might not limit distributor liability.113 Two federal appellate judges concurring in
separate Section 230 cases also questioned whether Zeran’s definition of “publisher” interpreted
Section 230(c)(1) beyond its intended scope.114 In addition to the skepticism expressed by
individual jurists, a 2022 Fourth Circuit opinion appeared to narrow Zeran’s conception of
“publisher” activity without disturbing its basic ruling on distributor liability.115 This decision is
discussed below.116
Service Provider Role as Publisher
While Zeran may be understood as addressing Section 230(c)(1)’s general scope, the case also
addressed how courts may determine whether a claim treats a defendant as a “publisher or
speaker” of another’s content.117 The Zeran court determined that the provision bars “lawsuits
seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial
functions—such as deciding whether to publish, withdraw, postpone, or alter content.”118 More

108 Id. at 332 (citing W. PAGE KEETON ET AL., PROSSER AND KEETON ON THE LAW OF TORTS § 113, at 803 (5th ed. 1984)).
109 Id. at 333–34.
110 Id. at 333.
111 Id.
112 Ian C. Ballon, Zeran v. AOL and Its Inconsistent Legacy, LAW JOURNAL NEWSLETTERS (Dec. 2017),
https://www.lawjournalnewsletters.com/sites/lawjournalnewsletters/2017/12/01/zeran-v-aol-and-its-inconsistent-
legacy/?slreturn=20201103124726 (noting, though, that different federal appeals courts apply Zeran differently).
113 Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13, 15–16 (2020) (Thomas, J., statement
respecting the denial of certiorari) (arguing that the imposition of distributor liability elsewhere in the CDA and the use
of terms different from those used in Stratton Oakmont might suggest that Section 230 was not meant to limit
distributor liability).
114 E.g., Force v. Facebook, Inc., 934 F.3d 53, 84 (2d Cir. 2019) (Katzmann, J., concurring in part) (opining that
Section 230 as applied creates “extensive immunity . . . for activities that were undreamt of in 1996”); Gonzalez v.
Google, LLC, 2 F.4th 871, 915 (9th Cir. 2021) (Berzon, J., concurring) (arguing that the legislative history of Section
230 does not support a broad reading of publisher functions).
115 Henderson v. Source for Pub. Data, L.P., 53 F.4th 110, 121 n.12 (4th Cir. 2022).
116 Infra text accompanying notes 146 to 150.
117 See generally Force v. Facebook, Inc., 934 F.3d 53, 64 n.18 (2d Cir. 2019) (discussing the scope of “publisher
liability”).
118 Zeran, 129 F.3d at 330.

generally, the Fourth Circuit interpreted Section 230(c)(1) as “creat[ing] a federal immunity to
any cause of action that would make service providers liable for information originating with a
third-party user of the service.”119 This interpretation would apply beyond the defamation claims
brought in Zeran and Stratton Oakmont, and courts of appeals have barred many claims on the
theory that the defendant computer service is being treated as a publisher or speaker.120 Many
courts have used this “traditional editorial functions”121 standard to interpret the scope of
“publisher” immunity under Section 230.122
For instance, the D.C. Circuit affirmed the dismissal of a lawsuit claiming Facebook acted
negligently in failing to promptly remove an allegedly threatening page, saying that deciding
“whether to print or retract a given piece of content” constitutes “the very essence of
publishing.”123 A number of courts have held that Section 230 not only bars lawsuits seeking
monetary damages, but also bars suits for injunctive relief that would require sites to take specific
actions with respect to third-party content.124 For example, in Hassell v. Bird, the California
Supreme Court said that Section 230 required the dismissal of a claim that sought to enforce a
court order against Yelp.125 The plaintiffs had sued the author of allegedly defamatory statements
posted about their business on Yelp and obtained a default judgment in their favor after the
defendant failed to respond to the lawsuit.126 The plaintiffs then attempted to enforce that judgment against Yelp, which was not originally a party to the litigation, asking the court to enter an
injunction requiring Yelp to take down the defamatory statements.127 In the state court’s view, the
lawsuit sought “to overrule Yelp’s decision to publish the three challenged reviews,”
impermissibly treating it as a publisher of third-party information.128 The court said that allowing
injunctions could “impose substantial burdens” on internet intermediaries, contrary to Section

119 Id.
120 See, e.g., Doe v. Backpage.com, LLC, 817 F.3d 12, 18–24 (1st Cir. 2016) (applying Section 230(c)(1) to claims
brought under federal and state sex trafficking statutes); Doe v. MySpace, Inc., 528 F.3d 413, 420 (5th Cir. 2008)
(rejecting negligence liability for a service provider when an adult user used the service to meet and allegedly abuse
minor children); Chi. Lawyers’ Comm. for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666, 668–69 (7th
Cir. 2008) (affirming dismissal of a federal housing discrimination claim); Force v. Facebook, Inc., 934 F.3d 53, 65–68
(2d Cir. 2019) (applying Section 230(c)(1) to federal civil claims based on terrorist attacks encouraged and coordinated
by users of a service); Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 422 (1st Cir. 2007) (affirming
dismissal of claims brought under state securities and cyberstalking laws).
121 Zeran, 129 F.3d at 330.
122 See, e.g., Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 407 (6th Cir. 2014); Barnes v. Yahoo! Inc.,
570 F.3d 1096, 1102 (9th Cir. 2009); Shiamili v Real Estate Grp. of N.Y., Inc., 952 N.E.2d 1011, 1019 (N.Y. 2011).
123 Klayman v. Zuckerberg, 753 F.3d 1354, 1355 (D.C. Cir. 2014).
124 See, e.g., Hassell v. Bird, 420 P.3d 776, 788 (Cal. 2018) (plurality opinion); id. at 794 (Kruger, J., concurring); see also Noah v. AOL Time Warner Inc., 261 F. Supp. 2d 532, 539–40 (E.D. Va. 2003) (collecting Section 230 cases
dismissing claims for injunctive relief and concluding that the “continuing authority” of a 1998 trial court case holding
that Section 230 did not bar injunctive relief was “questionable”); Republican Nat’l Comm. v. Google, Inc., No. 2:22-
cv-01904, 2023 WL 5487311, at *8 (E.D. Cal. Aug. 24, 2023) (concluding injunctive relief was also barred under
Section 230(c)(2)).
125 Hassell, 420 P.3d at 778–79 (plurality opinion); id. at 794 (Kruger, J., concurring).
126 Id. at 780–81 (plurality opinion).
127 Id. at 781–82.
128 Id. at 789; accord id. at 794 (Kruger, J., concurring). See also id. at 790 (plurality opinion) (“The duty that plaintiffs
would impose on Yelp, in all material respects, wholly owes to and coincides with the company’s continuing role as a
publisher of third party online content.”).

230’s goal of “spar[ing] republishers of online content . . . from this sort of ongoing entanglement
with the courts.”129
In limited circumstances, courts have concluded that a particular claim does not treat a defendant
as a publisher or speaker and is thus not barred by Section 230. One such case decided by the
Ninth Circuit, Doe v. Internet Brands, Inc., involved a negligent failure to warn claim in which the plaintiff argued that, under state law, the service provider had a duty to warn users that third parties had used its site to target and lure victims in a “rape scheme.”130 The court held that
Section 230 did not bar the claim because the alleged duty resulted from information the service
provider acquired offline, rather than from user content generated on the provider’s website, and
the service provider could satisfy this duty to warn without removing any user content or
changing how it monitored user content.131
Similarly, the Ninth Circuit held that Section 230 did not bar a state contract law claim based on a provider’s promise to remove third-party content.132 The court said that liability for the “promissory
estoppel” claim came “not from [the provider’s] publishing conduct, but from [the provider’s]
manifest intention to be legally obligated to do something, which happens to be removal of
material from publication.”133 Another case decided by the Seventh Circuit involved a claim
against a provider of customer relationship management software based on the provider’s alleged
knowing participation in sex trafficking undertaken by one of its clients, a website that used the provider’s software.134 The court held that this claim depended on the provider’s offering of business support
to its client, rather than its publication of any particular content.135
Claims founded on economic regulations of online services have also survived Section 230(c)(1)
preemption. For example, in City of Chicago v. Stubhub!, Inc., the Seventh Circuit declined to
apply Section 230(c)(1) to bar collection of a city amusement tax from an online ticket resale
platform, noting that the tax “does not depend on who ‘publishes’ any information or is a
‘speaker.’”136 Likewise, the Ninth Circuit held that Section 230(c)(1) did not preempt a local
ordinance regulating short-term property rentals, as applied to websites that hosted listings of
such rentals.137 In the Ninth Circuit’s view, the ordinance merely required platforms to monitor
booking transactions listed in a city-run registry of rental properties and did not require platforms
to police the content of third-party listings.138 The court thus did not believe that the ordinance
would impermissibly treat the platforms as publishers of third-party content.139 Courts have also

129 Id. at 791 (plurality opinion). See also Noah, 261 F. Supp. 2d at 540 (“[G]iven that the purpose of § 230 is to shield
service providers from legal responsibility for the statements of third parties, § 230 should not be read to permit claims
that request only injunctive relief. After all, in some circumstances injunctive relief will be at least as burdensome to
the service provider as damages, and is typically more intrusive.”).
130 Doe v. Internet Brands, Inc., 824 F.3d 846, 849 (9th Cir. 2016).
131 Id. at 851.
132 Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1107 (9th Cir. 2009).
133 Id. See also, e.g., Darnaa, LLC v. Google, Inc., No. 15-cv-03221-RMW, 2016 WL 6540452, at *8 (N.D. Cal. Nov.
2, 2016) (“Plaintiff’s claim for breach of the implied covenant of good faith and fair dealing . . . is not precluded by
§ 230(c)(1) because it seeks to hold defendants liable for breach of defendants’ good faith contractual obligation to
plaintiff, rather than defendants’ publisher status.”).
134 G.G. v. Salesforce.com, Inc., 76 F.4th 544, 548 (7th Cir. 2023).
135 Id. at 567.
136 City of Chicago v. Stubhub!, Inc., 624 F.3d 363, 366 (7th Cir. 2010).
137 HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676 (9th Cir. 2019).
138 Id. at 682.
139 Id. at 682–83; see also In re Zoom Video Commc’ns Privacy Litig., 525 F. Supp. 3d 1017, 1033 (N.D. Cal. 2021) (characterizing this as “content-neutral liability”). Cf. Airbnb, Inc. v. City of Boston, 386 F. Supp. 3d 113, 120–24 (D. Mass. 2019) (ruling that a similar regulation was not preempted by Section 230, but concluding Section 230 likely did preempt portions of the regulation requiring a “booking agent” to remove improper listings).

sometimes found claims arising from an online marketplace’s role as a seller of a defective
product to fall outside of Section 230’s protection.140
Federal courts have also declined to apply Section 230(c)(1) to lawsuits in which the Federal Trade Commission (FTC) alleged that service providers violated Section 5 of the Federal Trade Commission Act.141 The first court of appeals to address this issue was the Tenth Circuit in
FTC v. Accusearch, Inc.142 The majority opinion in Accusearch did not decide whether the
defendant was being treated as a publisher or speaker, instead concluding that Section 230 did not
bar the suit because the defendant had contributed to the allegedly unlawful content.143 However,
Judge Tymkovich wrote in a concurring opinion that the cause of action sought to hold the
defendant liable for its own conduct, rather than for third-party content, and thus the defendant
was not being treated as a publisher or speaker.144 In FTC v. Leadclick Media, LLC, the Second
Circuit agreed with Judge Tymkovich’s concurrence and determined that a claim brought under
Section 5 of the FTC Act depended on the defendant’s own deceptive acts or practices and
therefore did not treat the defendant as a publisher or speaker.145
One recent decision signals an approach that may define publisher activity more narrowly than some courts’ earlier applications of Zeran. In Henderson v. Source for Public Data, L.P., the Fourth
Circuit held that to treat a service provider as a publisher or speaker, a claim must hold a service
provider liable based on the improper content of the disseminated information.146 The court drew
this requirement from defamation law, under which a defendant’s liability as a publisher depends
on the improper, “false and defamatory” nature of the material published.147 Under this view,
Section 230 did not bar claims alleging that a website had failed to comply with the Fair Credit
Reporting Act.148 Although the claims would have held the site liable for improperly
disseminating information, they did not depend on the information’s content being improper.149
The opinion cited and purported to apply Zeran, but Henderson appeared to add a new requirement, given that Zeran made no reference to the content of information.150 Several
subsequent decisions from state and federal courts outside of the Fourth Circuit have declined to

140 See, e.g., Lee v. Amazon.com, Inc., 291 Cal. Rptr. 3d 332, 378–79 (Cal. Ct. App. 2022); Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139–40 (4th Cir. 2019). As discussed, the unavailability of Section 230 in cases brought against online marketplaces does not necessarily mean the marketplace will face liability. See, e.g., Erie Ins. Co., 925 F.3d at 142 (holding that Amazon was not liable for sale of a defective product).
141 15 U.S.C. § 45.
142 FTC v. Accusearch, Inc., 570 F.3d 1187, 1197 (10th Cir. 2009).
143 Id.
144 Id. at 1204 (Tymkovich, J., concurring). For more discussion of Accusearch, see infra “Subsequent Developments in Material Contribution Analysis.”
145 FTC v. Leadclick Media, LLC, 838 F.3d 158, 176–77 (2d Cir. 2016).
146 Henderson v. Source for Public Data, L.P., 53 F.4th 110, 122 (4th Cir. 2022).
147 Id. (citing RESTATEMENT (SECOND) OF TORTS § 558(a) (AM. L. INST. 1965)).
148 Id. at 117.
149 Id. at 123–24.
150 See Zeran v. Am. Online, Inc. 129 F.3d 327, 330 (4th Cir. 1997) (referencing the exercise of “traditional editorial
functions” without reference to the content of information). Because the material at issue in Zeran was allegedly
defamatory, see id., the Fourth Circuit’s decision in Henderson does not call into question the outcome of Zeran.

follow Henderson, reasoning that the decision conflicts with binding precedent in their
jurisdictions that reads Section 230(c)(1) more broadly.151
Product Design Claims
Courts have seen a rise in suits alleging, often as product liability claims,152 that online services
were designed negligently.153 Some opinions have held that Section 230(c)(1) barred claims
seeking to hold sites liable for failing to adopt safety features that plaintiffs claim would have
prevented violence.154 To take one example, in Doe v. MySpace, Inc., the Fifth Circuit affirmed
the dismissal of a lawsuit alleging that MySpace acted negligently in failing “to implement basic
safety measures to prevent sexual predators from communicating with minors on its Web site.”155
The plaintiff, a minor, had used the site to meet and communicate with an older teenager who
later sexually assaulted her at an in-person meeting.156 The plaintiff argued that her negligence
claims depended on “MySpace’s failure to implement basic safety measures” and therefore would
not treat the site as a publisher.157 The Fifth Circuit disagreed, saying the allegations were “merely
another way of claiming that MySpace was liable for publishing the communications.”158 In the
court’s view, the negligence claims hinged on MySpace’s publisher functions: its decisions
relating to the “monitoring, screening, and deletion” of third-party content.159 As a result, Section
230(c)(1) barred liability.160
In contrast, the Ninth Circuit more recently held that Section 230(c)(1) did not bar claims against the maker of Snapchat for negligently designing its platform to include a “speed filter” that encouraged users to drive at recklessly high speeds.161 The Ninth Circuit
determined that the claims based on Snapchat’s speed filter did not treat the platform as a
“publisher or speaker,” because the claims “treat[ed] Snap as a products manufacturer, accusing it
of negligently designing a product (Snapchat) with a defect.”162 Citing Internet Brands, the failure
to warn case discussed above, the court observed that “Snap could have satisfied” the alleged
obligation to design a better product “without altering the content that Snapchat’s users

151 E.g., Divino Grp. LLC v. Google LLC, No. 19-04749, 2023 WL 218966, at *2 (N.D. Cal. Jan. 17, 2023)
(“Henderson is not binding on this Court; and . . . the Fourth Circuit’s narrow construction of Section 230(c)(1) appears
to be at odds with Ninth Circuit decisions indicating that the scope of the statute’s protection is much broader.”); Prager
Univ. v. Google LLC, 85 Cal. App. 5th 1022, 1033 n.4 (Cal. Ct. App. 2022) (“Henderson’s narrow interpretation of
section 230(c)(1) is in tension with the California Supreme Court’s broader view, which we follow, absent a contrary
ruling by the United States Supreme Court.”).
152 For a discussion of products liability claims, see CRS In Focus IF11291, Introduction to Tort Law, by Andreas
Kuersten.
153 See generally, e.g., Peter Karalis & Golriz Chrostowski, Analysis: Product Claims Spike as SCOTUS Ponders Section 230 Fix, BLOOMBERG LAW (Mar. 2, 2023), https://news.bloomberglaw.com/bloomberg-law-analysis/analysis-product-claims-spike-as-scotus-ponders-section-230-fix.
154 E.g., Herrick v. Grindr LLC, 765 F. App’x 586, 590–91 (2d Cir. 2019) (affirming dismissal of product liability,
negligence, and infliction of emotional distress claims alleging Grindr should have adopted safety features that would
have protected a user from an ex-boyfriend’s “campaign of harassment” conducted on the service).
155 Doe v. MySpace, Inc., 528 F.3d 413, 416 (5th Cir. 2008).
156 Id. The suit was brought by the minor and her mother under the aliases Jane and Julie Doe. See id. at 415–16. This
report refers to a singular plaintiff for convenience.
157 Id. at 419.
158 Id. at 420.
159 See id. (quoting Green v. Am. Online (AOL), 318 F.3d 465, 471 (3d Cir. 2003)).
160 Id. at 422.
161 Lemmon v. Snap, Inc., 995 F.3d 1085, 1091–94 (9th Cir. 2021).
162 Id. at 1092.

generate.”163 A state court in Georgia reached a similar conclusion, holding that claims based on
Snapchat’s speed filter did “not seek to hold Snapchat liable for publishing” and therefore could
proceed.164
Information Provided by Another Information Content Provider
Section 230(c)(1)’s protections extend only to claims that would hold a defendant liable for
“information provided by another information content provider.”165 Put another way, Section
230(c)(1) does not protect defendants from claims arising from their own content.166 For example,
Section 230(c)(1) would not bar a defamation claim against a social media website based on the
content of a label or disclaimer added by the website to third-party content.167 But as recognized
in Zeran and other cases, Section 230(c)(1) does allow a defendant to make some editorial
adjustments to third-party content without being considered the provider of that content.168
Whether a defendant is being treated as the publisher of information provided by “another
information content provider” depends in part on whether the defendant is an information content
provider itself.169 As defined in Section 230, an “information content provider” is “any person or
entity that is responsible, in whole or in part, for the creation or development of information
provided through the Internet or any other interactive computer service.”170 When a case involves
third-party content, courts routinely focus on the defendant’s role in the “creation or
development” of the content.171

163 Id.; see supra text accompanying notes 130 to 131.
164 Maynard v. Snapchat, Inc., 816 S.E.2d 77, 81 (Ga. Ct. App. 2018). The court emphasized that the alleged liability
stemmed from actions taken before a third party had posted any content. Id. at 80.
165 47 U.S.C. § 230(c)(1) (emphasis added).
166 A separate but related question is whether a plaintiff bringing claims based on their own content is “another
information content provider” under Section 230(c)(1). Some courts have declined to apply Section 230(c)(1) to
content created by a plaintiff, reasoning that allowing Section 230(c)(1) to cover such content would render Section
230(c)(2) superfluous. See, e.g., e-ventures Worldwide, LLC v. Google, Inc., No. 2:14-cv-646-FtM-PAM-CM, 2017
WL 2210029, at *3 (M.D. Fla. Feb. 8, 2017) (declining to apply Section 230(c)(1) to unfair competition claims based
on Google’s removal of plaintiff’s advertising material). Other courts have applied Section 230(c)(1) to such claims.
See, e.g., Riggs v. MySpace, Inc., 444 F. App’x 986, 987 (9th Cir. 2011) (affirming dismissal under Section 230(c)(1)
of claims based on removal of plaintiff-created profile pages); Sikhs for Justice “SFJ”, Inc. v. Facebook, Inc., 144 F.
Supp. 3d 1088, 1093–94 (N.D. Cal. 2015) (applying Section 230(c)(1) to dismiss claims based on blocking access to
plaintiff-created page), aff’d, 697 F. App’x 526 (9th Cir. 2017); cf. Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir.
2003) (interpreting Section 230(c)(1)’s reference to “another information content provider” to “distinguish[] the
circumstance in which the interactive computer service itself meets the definition of ‘information content provider’
with respect to the information in question”).
167 Cf. Maffick, LLC v. Facebook, Inc., No. 20-05222, 2020 WL 5257853, at *1 (N.D. Cal. Sept. 3, 2020) (ignoring
Section 230 entirely in a case based on Facebook’s labeling of user accounts as “Russia state-controlled media”).
168 Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997); see Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir.
2003) (making minor alterations to email before posting email to listserv did not render defendant liable for third-party
content); Ben Ezra, Weinstein, & Co. v. America Online, Inc., 206 F.3d 980, 985–86 (10th Cir. 2000) (deleting
erroneous information from a database containing third-party content did not render defendant liable for third-party
content); Blumenthal v. Drudge, 992 F. Supp. 44, 51–52 (D.D.C. 1998) (reserving right to “require reasonable
changes” to content did not render service provider liable for content).
169 See 47 U.S.C. § 230(c)(1).
170 Id. § 230(f)(3).
171 See, e.g., Batzel, 333 F.3d at 1031; Ben Ezra, Weinstein, & Co., 206 F.3d at 985.

Fair Housing Council v. Roommates.com, LLC
A foundational case on this issue is the Ninth Circuit’s decision in Fair Housing Council v. Roommates.com, LLC (Roommates).172 In Roommates, housing agencies in San Diego and the San Fernando Valley sued the operator of the website Roommates.com,173 which allows
individuals to locate prospective roommates.174 New Roommates.com users were required to
complete a questionnaire that included the user’s preferences for a roommate’s age, gender,
sexual orientation, and number of children.175 Roommates.com then displayed the answers to
these questions in personal profiles, which users of the site could search and view.176 The housing
agencies alleged that Roommates.com had violated a provision of the Fair Housing Act that
prohibits publishing advertisements for the sale or rental of a dwelling that indicate any
preference based on sex, familial status, or other protected characteristics.177 In defense,
Roommates.com argued that the housing agencies were seeking to hold Roommates.com liable
for content generated by individual users, and therefore Section 230(c)(1) would bar liability.178
In an en banc ruling, the Ninth Circuit rejected this contention, saying that Roommates.com’s
required questionnaire “induce[d] third parties to express illegal preferences.”179 According to the
court, because this questionnaire was created by Roommates.com and not its users, Section
230(c)(1) did not apply.180
Addressing Roommates.com’s liability for displaying its users’ preferences on personal profiles,
the court acknowledged that the “illegal preferences” at issue were pieces of information
provided by information content providers other than Roommates.com.181 But the Ninth Circuit
noted that Roommates.com may still have “develop[ed] . . . in part” this information, such that
Roommates.com could be considered the “information content provider” of the information.182
The court determined that by requiring users to answer its questionnaire, Roommates.com had at
least in part developed the information.183 The Ninth Circuit cabined the reach of its holding by
specifying that “passive conduits” or “neutral tools,” such as a search engine that filters content
only by user-generated criteria, would not be responsible for developing content.184 The court also
concluded that Section 230(c)(1) did bar liability for user comments made in an “Additional

172 Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (en banc).
173 The defendant’s corporate name in Roommates is the singular Roommate.com, LLC. However, the domain of the
website operated by the defendant is the plural roommates.com. This linguistic mismatch resulted in the party being
named as “Roommates.com” in the Ninth Circuit case. Cf. Fair Hous. Council v. Roommate.com, LLC, No. 03-09386,
2004 WL 3799488 (C.D. Cal. Sept. 30, 2004). For clarity, this report will refer to the defendant website operator as
“Roommates.com.”
174 Roommates, 521 F.3d at 1162.
175 Id. at 1161.
176 Id.
177 42 U.S.C. § 3604(c).
178 Roommates, 521 F.3d at 1162.
179 Id. at 1165.
180 Id.
181 Id.
182 Id.; see 47 U.S.C. § 230(c)(1) (applying only to information provided by “another information content provider”).
183 Roommates, 521 F.3d at 1166.
184 Id. at 1167–69.

Comments” section of user profiles, a blank box where users could post text with no
constraints.185
Writing for the majority, Chief Judge Kozinski summarized the Roommates court’s holding: “a
website helps to develop unlawful content . . . if it contributes materially to the alleged illegality
of the conduct.”186 In a later Ninth Circuit opinion, the court clarified that this “material
contribution” test “draw[s] the line at ‘the crucial distinction between, on the one hand, taking
actions (traditional to publishers) that are necessary to the display of unwelcome and actionable
content and, on the other hand, responsibility for what makes the displayed content illegal or
actionable.’”187
Subsequent Developments in Material Contribution Analysis
Since the Ninth Circuit’s decision in Roommates, other federal courts of appeals and state courts
have adopted variations on Roommates’ “material contribution” analysis in determining whether a
defendant is the information content provider of the information at issue. The next federal appeals
court to consider Roommates was the Tenth Circuit in FTC v. Accusearch, Inc., which adopted—
and possibly expanded upon—the Ninth Circuit’s reasoning in Roommates.188 At issue in
Accusearch was whether a website that sold information contained in telephone records could
claim Section 230 protection from an FTC enforcement action when the operator acquired these
records from third parties.189 Accusearch argued that it did not add anything to the information
after receiving it and thus was not an information content provider of the information.190 In an
opinion written by Judge Hartz, the Tenth Circuit held that a defendant’s solicitation of and
payment for telephone records rendered the defendant an information content provider of these
records.191
The Tenth Circuit focused on whether the defendant had played any role in “developing” the
information. Judge Hartz opined that the inclusion of two terms—“creation” and
“development”—in Section 230’s definition of “information content provider” suggested that the
two terms had distinct meanings.192 Unwilling to adopt a redundant definition of “development,”
the court turned to dictionary definitions of the term and determined that information may be
“developed” when the information is made “‘visible,’ ‘active,’ or ‘usable.’”193 The Tenth Circuit
therefore concluded that by making telephone records public on its website, the defendant had
“developed” those records.194 Noting that Section 230 defines an information content provider as
one “responsible, in whole or in part” for the creation or development of content,195 the
Accusearch court followed Roommates in holding that a party is “responsible” for content only

185 Id. at 1173–75; see also Chi. Lawyers’ Comm. for Civil Rights Under Law v. Craigslist, Inc., 519 F.3d 666, 671 (7th Cir. 2008) (concluding Section 230(c)(1) barred a similar Fair Housing Act case brought against a website that hosted apartment listings, where the listings were written entirely by users).
186 Roommates, 521 F.3d at 1168.
187 Kimzey v. Yelp! Inc., 836 F.3d 1263, 1269 n.4 (9th Cir. 2016) (quoting Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 413–14 (6th Cir. 2014)).
188 FTC v. Accusearch, Inc., 570 F.3d 1187, 1198 (10th Cir. 2009).
189 Id. at 1190.
190 Id. at 1197–98.
191 Id. at 1200.
192 Id. at 1198.
193 Id. (quoting WEBSTER’S THIRD NEW INT’L DICTIONARY 618 (2002)).
194 Id.
195 47 U.S.C. § 230(f)(3).

when the party “in some way specifically encourages development of what is offensive about the
content.”196 To the Tenth Circuit, what was “offensive” about the information at issue was that it
had been publicly exposed: as the court observed, federal law generally prohibits the disclosure of
telephone records to third parties.197 Judge Hartz noted that Accusearch had “affirmatively
solicited” telephone records from its paid researchers and “knowingly sought to transform
virtually unknown information into a publicly available commodity,” and was therefore
responsible for the records being made public.198
Courts interpreting Roommates and Accusearch have attempted to define the contours of when a
defendant has or has not “materially contributed” to content. A North Carolina appellate court
held that “a website must effectively control the content . . . or take other actions which
essentially ensure the creation of unlawful content” to be considered an information content
provider.199 The Sixth Circuit has emphasized that mere encouragement does not rise to the level
of material contribution, asserting that holding otherwise “would inflate the meaning of
‘development’ to the point of eclipsing the immunity from publisher-liability that Congress
established.”200 Even the Ninth Circuit has cautioned against the broad application of Roommates,
declining to hold, for example, that a defendant materially contributed to content when the
defendant did not “require[] users to post specific content,” as Roommates.com did by requiring
users to complete its questionnaire.201 In one of the few instances where a court has recognized
material contribution, a California Court of Appeal decision applying Roommates held that a
social media platform’s advertising tools, which required advertisers to select a target age range
and gender, materially contributed to the alleged proliferation of discriminatory advertisements
on the platform.202
Algorithmic Sorting and Promotion
A recurring issue in Section 230 cases is whether Section 230(c)(1) immunizes the use of
algorithms to filter and sort content in a particular way—a common feature on social media
websites and search engines.203 Claims brought against websites for their use of algorithms often
cast a website’s use of algorithms either as “development” of third-party content, much like the
theories of Roommates and Accusearch, or as nonpublisher activity to which Section 230(c)(1)
would not apply. Federal courts of appeals that have considered this issue thus far have uniformly
rejected these theories.204 For a more detailed discussion of these cases and recent developments,
see CRS Report R47753, Liability for Algorithmic Recommendations, by Eric N. Holmes.

196 Accusearch, 570 F.3d at 1199.
197 Id.; see 47 U.S.C. § 222.
198 Accusearch, 570 F.3d at 1200.
199 Hill v. Stubhub, Inc., 727 S.E.2d 550, 561 (N.C. App. 2012).
200 Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d 398, 414 (6th Cir. 2014); see Fair Hous. Council v.
Roommates.com, LLC, 521 F.3d 1157, 1161 n.19 (9th Cir. 2008) (en banc) (noting that Roommates.com “does much
more than encourage or solicit”).
201 Dyroff v. Ultimate Software Grp., Inc., 934 F.3d 1093, 1099 (9th Cir. 2019).
202 Liapes v. Facebook, Inc., 313 Cal. Rptr. 3d 330, 346 (Cal. Ct. App. 2023).
203 For more information on content recommendation and moderation algorithms, see CRS In Focus IF12462, Social Media Algorithms: Content Recommendation, Moderation, and Congressional Considerations, by Kristen E. Busch.
204 E.g., Dyroff, 934 F.3d at 1098–99 (opining that plaintiffs could not frame “website features as content” and that the site’s recommendation and notification functions did not materially contribute to alleged unlawfulness of content); Force v. Facebook, Inc., 934 F.3d 53, 66–69 (2d Cir. 2019) (rejecting theories that algorithmic sorting rendered website a nonpublisher or materially contributed to development of content); Marshall’s Locksmith Serv., Inc. v. Google, LLC, 925 F.3d 1263, 1271 (D.C. Cir. 2019) (declining to treat search engines’ conversion of fraudulent addresses from webpages into “map pinpoints” as developing content).

A thorough examination of the relationship between algorithmic sorting and Section 230 appears in the Second Circuit’s opinion in Force v. Facebook, Inc., a case brought by victims of terrorist attacks allegedly coordinated and encouraged on Facebook by individual users.205 In Force, the plaintiffs
contended that Facebook’s use of algorithms to display personalized content and friend
suggestions was nonpublisher activity outside Section 230’s scope or, alternatively, materially
contributed to the development of user content by “mak[ing] that content more visible, available,
and usable.”206 The Second Circuit declined to endorse either of these arguments and instead held
that Section 230 barred the plaintiffs’ claims.207 Addressing the first argument, the court decided
that how and where to display content is a quintessential editorial decision protected under
Section 230, and therefore plaintiffs sought to hold Facebook liable as a publisher.208 The Second
Circuit likewise held that Facebook had not developed user content when its algorithms “take the
information provided by Facebook users and ‘match’ it to other users—again, materially
unaltered—based on objective factors applicable to any content.”209
The Force court’s treatment of algorithmic sorting applies the “neutral tools” language that first appeared in Roommates.210 Several earlier cases adopted a similar approach to such neutral tools, an approach that, though it originated with this language from Roommates, diverges slightly from Roommates’ material contribution analysis. In an early case on the issue, the D.C. Circuit held that “a website
does not create or develop content when it merely provides a neutral means by which third parties
can post information of their own independent choosing online.”211 Both the D.C. Circuit and the
Second Circuit have elaborated on particular features that may make a website’s tools “neutral.”
In Marshall’s Locksmith Service, Inc. v. Google, a case involving search engines that
automatically converted addresses provided by third parties into “pinpoints” appearing on the
search engines’ mapping websites, the D.C. Circuit emphasized that the search engines’ tools did
“not distinguish” between different types of user content.212 Instead, the algorithm translated all
types of information, both legitimate and scam information, in the same manner.213 The Second
Circuit in Force characterized Facebook’s involvement in user content as “neutral” when
Facebook did not require users to provide more than “basic identifying information” and its
sorting algorithms used “objective factors” that applied in the same way “to any content.”214

205 Force, 934 F.3d at 57.
206 Id. at 70 (internal quotations omitted); id. at 65–66.
207 Id. at 71. In a partially dissenting opinion, Chief Judge Katzmann wrote that he would not apply Section 230(c)(1),
reasoning that claims based on Facebook’s friend and content suggestion systems did not treat Facebook as a publisher
of another’s content. Id. at 76–89 (Katzmann, J., concurring in part and dissenting in part).
208 Id. at 66–67 (majority opinion); see Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1124–25 (9th Cir. 2003) (applying Section 230 to a website’s “decision to structure the information provided by users”); Marshall’s Locksmith Serv., 925 F.3d at 1269 (holding that “the choice of presentation” is a publisher function protected by Section 230); cf. O’Kroley v. Fastcase, Inc., 831 F.3d 352, 355 (6th Cir. 2016) (applying Section 230 to “automated editorial acts”).
209 Force, 934 F.3d at 70.
210 Id. at 66 (“[W]e find no basis . . . for concluding that an interactive computer service is not the ‘publisher’ of third-
party information when it uses tools such as algorithms that are designed to match that information with a consumer’s
interests.”) (citing Fair Hous. Council v. Roommates.com, Inc., 521 F.3d 1157, 1172 (9th Cir. 2008) (en banc)).
211 Klayman v. Zuckerberg, 753 F.3d 1354, 1358 (D.C. Cir. 2014); accord Kimzey v. Yelp! Inc., 836 F.3d 1263, 1270
(9th Cir. 2016) (characterizing a rating system based on third-party input as a “neutral tool”).
212 Marshall’s Locksmith Serv., 925 F.3d at 1271.
213 Id.
214 Force, 934 F.3d at 70.

The Ninth Circuit reached a similar conclusion in a since-vacated decision in Gonzalez v. Google LLC, which also involved claims brought by victims of terrorist attacks against social media
providers.215 As in Force, two members of the three-judge panel held that Section 230 would bar
these claims because they sought to impose liability based on a decision not to remove terrorist
content and because Google’s algorithms applied to terrorist content no differently than they
applied to other content.216 The Supreme Court granted certiorari in Gonzalez, but vacated the
Ninth Circuit’s judgment without addressing Section 230.217
Section 230(c)(2)(A): Restricting Access to Objectionable Material
Section 230(c)(2)(A) states that service providers and users may not “be held liable” for
voluntary, “good faith” actions “to restrict access to or availability of material that the provider or
user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise
objectionable, whether or not such material is constitutionally protected.”218 This provision is
more limited than Section 230(c)(1) in a few ways. First, as discussed above,219 while a number
of courts have held that Section 230(c)(1) shields decisions both to distribute and to restrict
others’ content, Section 230(c)(2) applies only to decisions to restrict content. For example,
providers have successfully invoked Section 230(c)(2) against claims challenging decisions to restrict user videos,220 suspend accounts,221 block unsolicited bulk emails,222 or decline to run certain ads.223
In addition, unlike Section 230(c)(1), Section 230(c)(2) applies only to voluntary, good-faith
actions, and it applies only to the listed categories of “objectionable” material.224 These limits on
Section 230(c)(2) immunity have been litigated in the courts and have led courts to conclude, in
some circumstances, that providers cannot claim Section 230 immunity.225

215 Gonzalez v. Google LLC, 2 F.4th 871 (9th Cir. 2021), vacated, 598 U.S. 617 (2023) (per curiam).
216 Id. at 892, 896.
217 Gonzalez v. Google LLC, 598 U.S. 617, 621 (2023) (per curiam) (ruling the complaint failed to state a claim for
aiding and abetting an act of international terrorism).
218 47 U.S.C. § 230(c)(2)(A).
219 Supra note 82 and accompanying text.
220 E.g., Divino Grp. LLC v. Google LLC, No. 19-cv-04749-VKD, 2022 WL 4625076, at *18 (N.D. Cal. Sept. 30,
2022) (involving state discrimination and unfair competition claims).
221 E.g., Berenson v. Twitter, Inc., No. 21-09818, 2022 WL 1289049, at *2 (N.D. Cal. Apr. 29, 2022) (involving,
among others, federal and state unfair competition laws and state common carrier law); Dipp-Paz v. Facebook, No. 18-
CV-9037, 2019 WL 3205842, at *3 (S.D.N.Y. July 12, 2019) (involving constitutional free speech claims).
222 E.g., Green v. Am. Online (AOL), 318 F.3d 465, 473 (3d Cir. 2003) (involving negligence, breach of contract,
constitutional free speech, and consumer fraud claims); Holomaxx Techs. v. Microsoft Corp., 783 F. Supp. 2d 1097,
1105 (N.D. Cal. 2011) (involving, among others, intentional interference with contract and intentional interference with
prospective business advantage claims); e360Insight, LLC v. Comcast Corp., 546 F. Supp. 2d 605, 607 (N.D. Ill. 2008)
(involving federal Computer Fraud and Abuse Act, constitutional free speech, tortious interference with prospective
economic advantage, and consumer fraud claims).
223 E.g., Langdon v. Google, Inc., 474 F. Supp. 2d 622, 630–31 (D. Del. 2007) (involving free speech, fraud, breach of
contract, deceptive business practices, and “public calling” claims).
224 See 47 U.S.C. § 230(c). See also, e.g., Fyk v. Facebook, Inc., 808 F. App’x 597, 598 (9th Cir. 2020) (“Unlike 47 U.S.C. § 230(c)(2)(A), nothing in § 230(c)(1) turns on the alleged motives underlying the editorial decisions of the provider of an interactive computer service.”). As discussed below, some courts have interpreted these categories broadly. See infra “Objectionable Material.”
225 See, e.g., Enhanced Athlete Inc. v. Google LLC, No. 19-cv-08260-HSG, 2020 WL 4732209, at *4 (N.D. Cal. Aug.
14, 2020); e-ventures Worldwide, LLC v. Google, Inc., No. 2:14-cv-646-FtM-PAM-CM, 2017 WL 2210029, at *3
(M.D. Fla. Feb. 8, 2017); Darnaa, LLC v. Google, Inc., No. 15-cv-03221-RMW, 2016 WL 6540452, at *8 (N.D. Cal.
Nov. 2, 2016).

Good Faith
Providers or users may claim immunity under Section 230(c)(2)(A) only if they act in “good
faith.”226 The statute does not itself define what it means to act in good faith, and courts have
applied a few different understandings of the term. Some trial court decisions have denied
immunity and allowed claims to proceed where the plaintiff alleged that a service provider acted
with an anticompetitive motive.227 For example, one court declined to dismiss a lawsuit alleging
that Google had engaged in unfair competition by removing a company’s websites from its search
results.228 Although Google said it had removed the results because they were “webspam” that
violated its guidelines, the plaintiff claimed that Google actually had acted with an
anticompetitive motive, because the plaintiff, which specialized in search engine optimization,
“was cutting into Google’s revenues.”229 The court ruled that the plaintiff had presented enough
evidence “to raise a genuine issue of fact” as to whether Google acted in good faith, preventing
the court from dismissing the claim under Section 230.230 To take another example, a different
court allowed a claim to proceed where the plaintiff alleged that YouTube removed her video to
punish her for working with a competitor rather than buying Google’s advertising services.231
In evaluating whether a provider acted in good faith, courts have also looked to whether the
provider’s rationale for restricting content is “pretextual.”232 As one trial court put it, for a
removal to be made in good faith, “the provider must actually believe that the material is
objectionable for the reasons it gives.”233 Under this view, if a provider says it is enforcing its
terms of service, but is in fact motivated by some other reason, the provider may be acting in bad
faith.234 Another trial court concluded that a service provider could be seen as acting in bad faith
when the provider “failed to respond to [the user’s] repeated requests for an explanation.”235
In comparison, one trial court suggested that “selective enforcement” of a policy alone would not
be enough to demonstrate bad faith.236 A mere mistake may be similarly insufficient.237 One trial
court rejected allegations that Google acted in bad faith by sending emails from the Republican

226 47 U.S.C. § 230(c)(2)(A).
227 See Darnaa, 2016 WL 6540452, at *8–9 (involving allegation that Google removed plaintiff’s video from YouTube
because the plaintiff refused to allow Google to embed advertising in the video). Cf. Spy Phone Labs LLC v. Google
Inc., No. 15-cv-03756-KAW, 2016 WL 6025469, at *8 (N.D. Cal. Oct. 14, 2016) (involving allegation that Google was
retaliating against plaintiff for submitting a trademark infringement complaint against another app).
228 e-ventures Worldwide, LLC, 2017 WL 2210029, at *1–2. Specifically, the lawsuit involved claims of “unfair
competition under the Lanham Act, 15 U.S.C. § 1125(a); violation of Florida’s Deceptive and Unfair Trade Practices
Act; and tortious interference with contractual relationships.” Id. at *2.
229 Id. at *1.
230 Id. at *3.
231 Darnaa, 2016 WL 6540452, at *8–9.
232 Spy Phone Labs LLC, 2016 WL 6025469, at *8; accord GCM Partners, LLC v. Hipaaline Ltd., No. 20 C 6401, 2020
WL 6867207, at *13 (N.D. Ill. Nov. 23, 2020).
233 Darnaa, 2016 WL 6540452, at *8.
234 Id.; Spy Phone Labs, 2016 WL 6025469, at *8. But see Langdon v. Google, Inc., 474 F. Supp. 2d 622, 631 (D. Del.
2007) (rejecting plaintiff’s assertion that the provider acted in bad faith because it gave false reasons for declining to
run his ads, on the grounds that the provider must have permissibly concluded they were “otherwise objectionable”).
235 Smith v. Trusted Universal Standards in Elec. Transactions, Inc., No. 09-4567, 2011 WL 900096, at *9 (D.N.J. Mar.
15, 2011).
236 Spy Phone Labs, 2016 WL 6025469, at *8. See also e360Insight, LLC v. Comcast Corp., 546 F. Supp. 2d 605, 609
(N.D. Ill. 2008) (ruling that plaintiff did not sufficiently plead an “absence of good faith” even though the plaintiff
claimed the provider “singl[ed] out” the plaintiff).
237 e360Insight, LLC, 546 F. Supp. 2d at 609; Deutsch v. Microsoft Corp., No. 22-2904, 2023 WL 2966947, at *6
(D.N.J. Apr. 17, 2023).

National Committee (RNC) to users’ spam folders.238 The RNC proffered a study allegedly
showing that Gmail labeled Republican campaign emails as spam at a significantly higher rate
than Democratic emails.239 The court held this study alone did not demonstrate bad faith.240
Among other factors, the court observed that the study did not attribute any motive to Google,
that Google had worked with the RNC to reduce its spam rate, and that the RNC conducted an
internal test suggesting technical features rather than content affected the spam rate.241
Objectionable Material
The second important limitation on Section 230(c)(2)(A) immunity is that it applies only when
providers or users restrict the listed types of content: “material that the provider or user considers
to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise
objectionable.”242 Although this list includes only specific types of content, it can still be
interpreted relatively broadly. In particular, some courts have interpreted the catch-all phrase
“otherwise objectionable” broadly because Section 230(c)(2)(A) states that the provider or user is
the one who determines whether the content is objectionable.243 As one court noted, the statute’s
text injects “a subjective element” into this inquiry, by asking whether “the provider or user
considers” the content to be objectionable.244 Thus, some courts have concluded that material
classified as spam or malware can be considered “harassing” or “objectionable” under Section
230(c)(2)(A).245 In some cases, courts have looked to providers’ policies to determine whether the
providers considered the restricted material objectionable.246
In 2009, one Ninth Circuit judge expressed concern about interpreting “otherwise objectionable”
too broadly, cautioning that “the literal terms of” Section 230(c)(2)(A) could be read to grant
providers “free license to unilaterally block the dissemination of material by content
providers.”247 While the “good faith” provision discussed above limits providers’ discretion,248
some courts have concluded that “otherwise objectionable” should also be read more narrowly to

238 Republican Nat’l Comm. v. Google, Inc., No. 2:22-cv-01904, 2023 WL 5487311, at *6 (E.D. Cal. Aug. 24, 2023).
239 Id.
240 Id.
241 Id.
242 47 U.S.C. § 230(c)(2)(A).
243 See, e.g., e360Insight, 546 F. Supp. 2d at 607–08.
244 Id. at 608. See also, e.g., Zango, Inc. v. Kaspersky Lab, Inc., No. 07-0807-JCC, 2007 WL 5189857, at *4 (W.D.
Wash. Aug. 28, 2007) (“Section 230(c)(2)(A), which provides the definition of the relevant material described in
Section 230(c)(2)(B), does not require that the material actually be objectionable; rather, it affords protection for
blocking material ‘that the provider or user considers to be’ objectionable.” (quoting 47 U.S.C. § 230(c)(2)(A))), aff’d,
568 F.3d 1169 (9th Cir. 2009). Cf. Holomaxx Techs. v. Microsoft Corp., 783 F. Supp. 2d 1097, 1104 (N.D. Cal. 2011)
(“No court has articulated specific, objective criteria to be used in assessing . . . a provider’s subjective determination
of what is ‘objectionable’ . . . . Here, however, it is clear . . . that Microsoft reasonably could conclude that Holomaxx’s
emails were ‘harassing’ and thus ‘otherwise objectionable.’” (emphasis added)).
245 E.g., e-ventures Worldwide, LLC v. Google, Inc., No. 2:14-cv-646-FtM-PAM-CM, 2017 WL 2210029, at *3 (M.D.
Fla. Feb. 8, 2017) (“[S]pam is undoubtedly ‘harassing’ or ‘objectionable’ content for purposes of the CDA.”). See also
Zango, 2007 WL 5189857, at *4 (“There is no question that [the provider] considers the software to be objectionable
[as malware].”); Langdon v. Google, Inc., 474 F. Supp. 2d 622, 631 (D. Del. 2007) (concluding implicitly, without
discussion, that Section 230 barred plaintiff’s lawsuit because Google considered his ads “otherwise objectionable”).
246 E.g., e360Insight, 546 F. Supp. 2d at 608.
247 Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1178 (9th Cir. 2009) (Fisher, J., concurring).
248 Cf. id. at 1179 (expressing concern that Section 230(c)(2)(B) does not contain a good faith limitation).

avoid giving providers this free license.249 For example, one trial court denied Section 230
immunity to YouTube in a case challenging YouTube’s decision to remove a video because its
view count had allegedly been artificially inflated.250 The court noted that the ordinary meaning
of “objectionable” could include anything a provider finds undesirable, but ultimately concluded
that such a broad definition was inconsistent with “the context, history, and purpose” of Section
230.251 Looking to the list of adjectives preceding “otherwise objectionable,” the court believed
that Congress was focused on “potentially offensive materials, not simply any materials
undesirable to a content provider or user.”252 Consequently, the court said that “it is hard to
imagine that the phrase includes . . . the allegedly artificially inflated view count.”253
Similarly looking to congressional intent, the Ninth Circuit held in a 2019 case that the term
“otherwise objectionable” should be interpreted to exclude anticompetitive conduct.254 At the
same time, however, the court emphasized the “breadth of the term” and concluded it should be
read more broadly than the specific categories preceding the “catchall phrase.”255 This Ninth
Circuit ruling interpreted Section 230(c)(2)(B) and is discussed in more detail below.256
Section 230(c)(2)(B): Enabling Access Restriction
Section 230(c)(2)(B) provides that service providers and users may not “be held liable” for
actions “taken to enable or make available to . . . others the technical means to restrict access to
material” that falls within the specific categories listed in Section 230(c)(2)(A).257 Accordingly,
Section 230(c)(2)(B) focuses on enabling others to restrict access to objectionable material, and
offers immunity to, for example, “providers of programs that filter adware and malware,”258 as
well as services that enable the filtering of spam email.259 Courts have concluded that companies

249 See, e.g., Enigma Software Grp. USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040, 1050 (9th Cir. 2019); Song Fi
Inc. v. Google, Inc., 108 F. Supp. 3d 876, 884 (N.D. Cal. 2015); Darnaa, LLC v. Google, Inc., No. 15-cv-03221-RMW,
2016 WL 6540452, at *8 (N.D. Cal. Nov. 2, 2016).
250 Song Fi, 108 F. Supp. 3d at 882.
251 Id. at 882, 884.
252 Id. See also Darnaa, 2016 WL 6540452, at *8 (“The context of § 230(c)(2) appears to limit the term [objectionable]
to that which the provider or user considers sexually offensive, violent, or harassing in content.”).
253 Song Fi, 108 F. Supp. 3d at 883.
254 Enigma Software Grp. USA, 946 F.3d at 1045 (“[T]he phrase ‘otherwise objectionable’ does not include software
that the provider finds objectionable for anticompetitive reasons.”); id. at 1051 (“Congress wanted to encourage the
development of filtration technologies, not to enable software developers to drive each other out of business.”).
255 Id. at 1051; see also id. at 1052 (“We think that the catchall was more likely intended to encapsulate forms of
unwanted online content that Congress could not identify in the 1990s.”). See also, e.g., Word of God Fellowship, Inc.
v. Vimeo, Inc., 166 N.Y.S.3d 3, 7–8 (N.Y. App. Div. 2022) (rejecting a narrow reading of “objectionable” given the
differences in the categories and concluding “vaccine misinformation may be ‘otherwise objectionable’ content that
providers are entitled to remove”).
256 Infra notes 271 to 277 and accompanying text.
257 47 U.S.C. § 230(c)(2)(B). Although Section 230(c)(2)(B) refers to “material described in paragraph (1),” a note in
the United States Code indicates that this is likely meant to reference “subparagraph (A)” instead. Id. n.1.
258 Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1174 (9th Cir. 2009). See generally, e.g., Russell A. Miller, The Legal Fate of Internet Ad-Blocking, 24 B.U. J. SCI. & TECH. L. 301, 358–60 (2018) (discussing how Section 230(c)(2)(B) might protect ad-blocking firms from liability).
259 Smith v. Trusted Universal Standards in Elec. Transactions, Inc., No. 09-4567, 2011 WL 900096, at *6 (D.N.J. Mar.
15, 2011) (granting Section 230(c)(2)(B) immunity to service that investigated and provided information about IP
addresses, “help[ing] information content providers restrict access to spam email”); id. at *8 (granting Section
230(c)(2)(B) immunity to software that “provide[d] Comcast with a means to restrict access to harassing spam email”).

like Facebook are also eligible for Section 230(c)(2)(B) immunity, to the extent they provide
users with tools to hide or otherwise restrict their own access to content.260
The fact that a company provides users with the choice to opt out of receiving certain content,
however, may not always be sufficient to gain Section 230(c)(2)(B) immunity.261 In one case, a
plaintiff sued Yahoo! for sending automated text message notifications about messages the
plaintiff had received on Yahoo! Messenger.262 Yahoo! claimed that the suit was barred by Section
230(c)(2)(B) because the text message “include[d] a link to a help page which . . . contain[ed]
instructions on how to block further messages,” and accordingly, made “available the ‘technical
means to restrict access’ to messages which plaintiff might deem ‘objectionable.’”263 The trial
court rejected this claim, noting that because the text message notifications were sent
automatically, “neither Yahoo! nor the mobile phone user ha[d] the opportunity to determine
whether the third party message” was objectionable.264 Accordingly, the court held that Yahoo!
could not claim Section 230(c)(2)(B) immunity where it “did not engage in any form of content
analysis of the subject text to identify material that was offensive or harmful prior to the
automatic sending of a notification message.”265
Because Section 230(c)(2)(B) applies only to actions restricting the types of content listed in
Section 230(c)(2)(A),266 it implicates the same interpretive questions discussed above regarding
whether the provider or user considered the restricted material “to be obscene, lewd, lascivious,
filthy, excessively violent, harassing, or otherwise objectionable.”267 However, unlike Section
230(c)(2)(A), Section 230(c)(2)(B) does not contain an explicit requirement for the provider or
user to act in good faith.268 Thus, one Ninth Circuit judge expressed concern that Section
230(c)(2)(B) could be read to grant immunity to bad faith conduct, including “covert,
anticompetitive blocking” of competitors.269 The judge believed it was “very likely” that
Congress “did not intend to immunize” such conduct.270
In a 2019 decision, Enigma Software Group USA, LLC v. Malwarebytes, Inc., the Ninth Circuit
held that Section 230(c)(2)(B) did not block a suit alleging anticompetitive conduct.271 A
company that sold computer security software sued a competitor after the competitor began
flagging some of the plaintiff’s programs as “potentially unwanted programs.”272 The plaintiff

260 Fehrenbach v. Zeldin, No. 17-CV-5282, 2018 WL 4242452, at *5 (E.D.N.Y. Aug. 6, 2018) (holding that Section
230(c)(2)(B) immunized Facebook from a complaint premised on the fact that Facebook allows users to hide
comments).
261 Sherman v. Yahoo! Inc., 997 F. Supp. 2d 1129, 1138 (S.D. Cal. 2014).
262 Id. at 1130.
263 Id. at 1137 (quoting 47 U.S.C. § 230(c)(2)).
264 Id. at 1138.
265 Id.
266 47 U.S.C. § 230(c)(2)(B).
267 Id. § 230(c)(2)(A); supra “Objectionable Material.”
268 47 U.S.C. § 230(c)(2). See also, e.g., Zango, Inc. v. Kaspersky Lab, Inc., 568 F.3d 1169, 1177 (9th Cir. 2009)
(holding that allegations that provider acted in bad faith did not preclude dismissal of suit under Section 230(c)(2)(B)
because this subparagraph “has no good faith language,” and noting that the plaintiff waived any argument that the
provision “should be construed implicitly to have a good faith component”).
269 Zango, 568 F.3d at 1179 (Fisher, J., concurring).
270 Id.; see also id. at 1179 n.3 (“[T]he legislative history the parties cite is not helpful in determining the exact
boundaries of what Congress intended to immunize. Whatever those exact boundaries, I doubt Congress intended to
leave victims of malicious or anticompetitive blocking without a cause of action . . . .”).
271 Enigma Software Grp. USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040, 1045 (9th Cir. 2019).
272 Id. at 1047–48.

argued that this characterization served “as a ‘guise’ for anticompetitive conduct.”273 In evaluating
the competitor’s attempt to claim immunity under Section 230(c)(2)(B), the Ninth Circuit looked
to Section 230’s purpose, concluding that “Congress wanted to encourage the development of
filtration technologies, not to enable software developers to drive each other out of business.”274
Accordingly, the court rejected the idea that the competitor could claim immunity “regardless of
anticompetitive purpose.”275 The court believed that the term “objectionable” is not limited only
to “material that is sexual or violent in nature,” and can encompass other “forms of unwanted
online content that Congress could not identify in the 1990s.”276 But “if a provider’s basis for
objecting to and seeking to block materials is because those materials benefit a competitor,” the
court held that the provider could not claim Section 230 immunity.277
Malwarebytes petitioned the Supreme Court for review, and although the Court denied certiorari, the case garnered a number of amicus briefs from interested parties, as well as a separate statement from Justice Thomas respecting the denial of certiorari.278 Interest groups argued that the Ninth Circuit’s decision improperly imported a “good faith” requirement into Section 230(c)(2)(B), even though the text did not contain such a limitation.279 In that statement, Justice Thomas argued that the Ninth Circuit decision, like other decisions interpreting Section 230, improperly “relied on purpose and policy” rather than textual arguments, creating “questionable precedent.”280 It remains to be seen whether courts outside the Ninth Circuit will agree with its ruling.
Section 230(e): Exceptions
As detailed above, Section 230(e) outlines five exceptions to the immunity created by Section
230.281 A defendant cannot claim Section 230 immunity as a basis to dismiss a federal criminal
prosecution or any lawsuit brought under intellectual property laws, state laws that are consistent
with Section 230, certain electronic communications privacy laws, or certain sex trafficking
laws.282 Outside of these exceptions, courts have generally held that Section 230 will bar
inconsistent liability even under later-enacted federal civil laws.283

273 Id. at 1048. Specifically, the complaint alleged both state law causes of action—deceptive business practices and
tortious interference with business and contractual relations—and a federal claim under the Lanham Act. Id. The Ninth
Circuit also considered whether the Lanham Act claim fell within the Section 230 exception for intellectual property
claims, holding that it did not. Id. at 1045.
274 Id. at 1051.
275 Id.
276 Id. at 1051–52.
277 Id. at 1052. However, the court noted that the defendant provider disputed whether it did engage in “anticompetitive
blocking” and claimed instead that it found the plaintiff’s “programs ‘objectionable’ for legitimate reasons based on the
programs’ content.” Id. The court suggested this factual dispute could be resolved on remand to the lower court. Id.
278 See Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13 (2020).
279 See, e.g., Brief of Electronic Frontier Foundation as Amicus Curiae in Support of Petitioner at 4, Malwarebytes, Inc.,
208 L. Ed. 2d 197 (No. 19-1284); Brief of TechFreedom as Amicus Curiae in Support of Petitioner at 5, Malwarebytes,
Inc., 208 L. Ed. 2d 197 (No. 19-1284).
280 Malwarebytes, Inc., 141 S. Ct. at 13–14 (Thomas, J., statement respecting the denial of certiorari).
281 Supra notes 40 to 49 and accompanying text.
282 See 47 U.S.C. § 230(e).
283 For example, two federal courts of appeals concluded that the Justice Against Sponsors of Terrorism Act, adopted in
2016, did not implicitly repeal Section 230, and Section 230 would therefore bar any inconsistent liability. Gonzalez v.
Google LLC, 2 F.4th 871, 889 (9th Cir. 2021), vacated, 598 U.S. 617 (2023) (ruling on the merits of the claims and
declining to address the application of Section 230); Force v. Facebook, Inc., 934 F.3d 53, 72 (2d Cir. 2019).

Federal Criminal Law
The first exception to Section 230 immunity is for “any . . . Federal criminal statute,” meaning
that any defendant in a federal criminal prosecution cannot claim Section 230 immunity.284 For
example, Section 230 does not bar prosecution under federal statutes that prohibit the knowing
distribution of obscene materials online.285 Neither did Section 230 bar the federal prosecution of
Backpage.com corporate entities for conspiracy to engage in money laundering.286 This exception
does not include state criminal laws, and courts have read Section 230 to preempt inconsistent
state criminal prosecutions.287
Most courts to consider the issue have interpreted Section 230(e)(1) to allow only criminal
prosecutions, not civil lawsuits based on violations of federal criminal laws.288 A number of
plaintiffs have argued that, particularly where federal law creates criminal and civil liability for
the same conduct, applying Section 230 to bar suits under a civil enforcement provision would
“impair the enforcement” of the criminal law.289 Several courts have rejected those arguments,290
noting the traditional distinction between criminal and civil liability and concluding that, by
referring only to “criminal” statutes in Section 230(e)(1), Congress intended to exclude civil
suits.291
Intellectual Property Law
The second exception to Section 230 immunity is for “any law pertaining to intellectual
property.”292 This phrase is somewhat ambiguous,293 but courts have concluded that this exception

284 47 U.S.C. § 230(e)(1).
285 See, e.g., 18 U.S.C. § 1462 (making it a crime to “knowingly use[] any . . . interactive computer service . . . for
carriage in interstate or foreign commerce—(a) any obscene, lewd, lascivious, or filthy . . . picture, motion-picture film,
. . . writing, print, or other matter of indecent character; or (b) any obscene, lewd, lascivious, or filthy . . . electrical
transcription, or other article or thing capable of producing sound”).
286 See Press Release, U.S. Dep’t of Justice, Backpage’s Co-founder and CEO, As Well As Several Backpage-Related
Corporate Entities, Enter Guilty Pleas (Apr. 12, 2018), https://www.justice.gov/opa/pr/backpage-s-co-founder-and-ceo-
well-several-backpage-related-corporate-entities-enter-guilty.
287 See generally, e.g., Voicenet Commc’ns, Inc. v. Corbett, No. 04-1318, 2006 WL 2506318, at *3 (E.D. Pa. Aug. 30,
2006) (interpreting Section 230(e)(1) not to include state criminal laws); see also, e.g., Universal Commc’n Sys., Inc. v.
Lycos, Inc., 478 F.3d 413, 422 (1st Cir. 2007) (dismissing suit under state cyberstalking law because defendant’s
“liability would depend on treating it as the publisher of those postings”); Backpage.com, LLC v. McKenna, 881 F.
Supp. 2d 1262, 1273 (W.D. Wash. 2012) (concluding proposed state legislation “is likely inconsistent with and
therefore expressly preempted by [47 U.S.C. § 230]” because it imposes “liability on Backpage.com and [Internet
Archive] for information created by third parties—namely ads for commercial sex acts depicting minors—so long as it
‘knows’ that it is publishing, disseminating, displaying . . . such information”).
288 See, e.g., Yuksel v. Twitter, Inc., No. 22-cv-05415-TSH, 2022 WL 16748612, at *5 (N.D. Cal. Nov. 7, 2022); but see Doe #1 v. MG Freesites, Ltd., No. 7:21-cv-00220-LSC, 2022 WL 407147, at *22 (N.D. Ala. Feb. 9, 2022)
(indicating Section 230 did not bar claims under certain civil provisions contained in Title 18); Nieman v. Versuslaw,
Inc., No. 12-3104, 2012 WL 3201931, at *9 (C.D. Ill. Aug. 3, 2012) (saying in dicta that “arguably, § 230 of the CDA
may not be used to bar a civil RICO claim because that would impair the enforcement of a Federal criminal statute”).
Other exceptions do allow specific federal civil claims; for example, civil suits based on certain federal sex trafficking
offenses may be permitted under a different exception. See infra “Sex Trafficking Law (FOSTA).”
289 E.g., Force v. Facebook, Inc., 934 F.3d 53, 71 (2d Cir. 2019); Doe v. Backpage.com, LLC, 817 F.3d 12, 23 (1st Cir.
2016); Doe v. Bates, No. 5:05CV91, 2006 WL 8440858, at *13 (E.D. Tex. Jan. 18, 2006).
290 E.g., Force, 934 F.3d at 72; Backpage.com, 817 F.3d at 23; Bates, 2006 WL 8440858, at *14.
291 See, e.g., Force, 934 F.3d at 71; Backpage.com, 817 F.3d at 23.
292 47 U.S.C. § 230(e)(2).
293 See Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1119 (9th Cir. 2007) (“The CDA does not contain an express
(continued...)

for laws “pertaining to intellectual property” allows, for example, suits for copyright and
trademark infringement.294 In evaluating whether Section 230(e)(2) applies, courts have
sometimes looked not only to whether the plaintiff is suing under a law that generally involves intellectual property issues, but also, more specifically, to whether the plaintiff’s claim itself involves an intellectual property right.295
For example, the Ninth Circuit ruled in 2019 that a false advertising claim brought under the
Lanham Act did not fall within the Section 230(e)(2) exception.296 The court noted that the
Lanham Act, a federal law, “contains two parts, one governing trademark infringement and
another governing false designations of origin, false descriptions, and dilution.”297 Noting that
Congress intended to provide broad immunity in Section 230, the Ninth Circuit construed the
intellectual property exception narrowly, to include only “claims pertaining to an established
intellectual property right . . . like those inherent in a patent, copyright, or trademark.”298 Because
the false advertising claim did not “relate to or involve trademark rights or any other intellectual
property rights,” the court held that the intellectual property exception did not apply.299 Somewhat
similarly, a federal trial court in New Hampshire held in one case that three “right of privacy” torts—
intrusion upon seclusion, publication of private facts, and casting in a false light—involved rights
that could not be considered property rights.300 Accordingly, the court concluded that the claims
did “not sound in ‘law pertaining to intellectual property’” and Section 230 barred the claims.301
Courts have disagreed about whether Section 230(e)(2) includes state law claims such as the right
of publicity,302 a cause of action that essentially allows plaintiffs to sue for the improper
commercial use of their identity.303 Some courts, including the Third Circuit, have held that the
exception does include state intellectual property claims, allowing, for example, state law claims
for copyright infringement, misappropriation and unfair competition, and right of publicity to

definition of ‘intellectual property,’ and there are many types of claims in both state and federal law which may—or
may not—be characterized as ‘intellectual property’ claims.”).
294 E.g., Parker v. Google, Inc., 422 F. Supp. 2d 492, 503 n.8 (E.D. Penn. 2006); Gucci Am., Inc. v. Hall & Assocs.,
135 F. Supp. 2d 409, 414 (S.D.N.Y. 2001); Malibu Media, LLC v. Weaver, No. 8:14-cv-1580-T-33TBM, 2016 WL
1394331, at *8 (M.D. Fla. Apr. 8, 2016).
295 Enigma Software Grp. USA, LLC v. Malwarebytes, Inc., 946 F.3d 1040, 1052–53 (9th Cir. 2019). See also, e.g.,
Corker v. Costco Wholesale Corp., No. C19-0290RSL, 2019 WL 5895430, at *6 (W.D. Wash. Nov. 12, 2019)
(concluding Section 230(e)(2) did not apply to a false association claim because the claim did “not involve an
intellectual property right or trademark”); Doe v. Friendfinder Network, Inc., 540 F. Supp. 2d 288, 302–03 (D.N.H.
2008) (holding that Section 230(e)(2) did not apply to state right of privacy claims that involved personal rights).
296 Enigma Software Grp. USA, 946 F.3d at 1053. However, the court nonetheless concluded that because the claim
was “based on allegations of [anticompetitive] conduct,” it would not apply Section 230(c)(2) to dismiss the claim. Id.
at 1054. This portion of the opinion is discussed supra “Section 230(c)(2)(B): Enabling Access Restriction.”
297 Enigma Software Grp. USA, 946 F.3d at 1053.
298 Id.
299 Id. at 1053–54.
300 Friendfinder Network, Inc., 540 F. Supp. 2d at 302–03.
301 Id. at 303. See also Ratermann v. Pierre Fabre USA, Inc., No. 22-CV-325, 2023 WL 199533, at *5 (S.D.N.Y. Jan.
17, 2023) (concluding Section 230(e)(2) did not apply to a state law construed as creating “a statutory right to privacy,
not property”).
302 See, e.g., Stayart v. Yahoo! Inc., 651 F. Supp. 2d 873, 888–89 (W.D. Wis. 2009) (noting that a right to publicity
claim “is generally considered an intellectual property claim,” implicating this exception, but further noting the
“disagreement among various federal courts regarding the scope of the intellectual property exception,” and ultimately
dismissing the claim on jurisdictional grounds); see also Friendfinder Network, Inc., 540 F. Supp. 2d at 302 (holding
that a state right of publicity claim “arises out of a ‘law pertaining to intellectual property’ within the meaning of” 47
U.S.C. § 230(e)(2)).
303 See, e.g., ETW Corp. v. Jireh Publ’g, Inc., 332 F.3d 915, 928–37 (6th Cir. 2003) (discussing the right of publicity).

proceed.304 These courts have noted that the exception refers broadly to “any law,”305 and that
other provisions of Section 230 distinguish between state and federal law, suggesting that “any
law” includes both state and federal laws.306
In contrast, the Ninth Circuit has held that Section 230(e)(2) encompasses only federal laws and
that Section 230 bars state intellectual property claims.307 In Perfect 10, Inc. v. CCBill LLC, the
Ninth Circuit looked to Congress’s policy goals and “construe[d] the term ‘intellectual property’
to mean ‘federal intellectual property.’”308 The court observed that state intellectual property laws
“are by no means uniform,” and could subject websites to varied liability schemes.309 In the view
of the court, this outcome “would be contrary to Congress’s expressed goal of insulating the
development of the Internet from the various state-law regimes.”310 The Ninth Circuit did not
discuss the fact that the text of Section 230(e)(2) refers to “any law,” noting only that the term
“intellectual property” was undefined.311
State Law
The third exception provides that Section 230 will not “prevent any State from enforcing any
State law that is consistent with this section.”312 As one trial court described this provision, “Section
230(e)(3) does not attempt to define what state law is consistent and inconsistent with the CDA;”
in effect, this subsection “provides no substantive content.”313 In evaluating whether a state law,
or a particular application of a state law, is consistent with Section 230 or whether it is instead
inconsistent and preempted, courts have looked to whether the law would violate Section
230(c)(1) by treating service providers or users as the publisher of another person’s content.314
Accordingly, for example, one court concluded that a libel claim that would hold a website

304 Hepp v. Facebook, 14 F.4th 204, 210 (3d Cir. 2021); Atl. Recording Corp. v. Project Playlist, Inc., 603 F. Supp. 2d
690, 694, 704 (S.D.N.Y. 2009); Friendfinder Network, Inc., 540 F. Supp. 2d at 302; Albert v. Tinder, Inc., No. 22-
60496-CIV, 2022 WL 18776124, at *11 (S.D. Fla. Aug. 5, 2022). The First Circuit suggested in one decision that a
state trademark claim was “not subject to Section 230 immunity.” Universal Commc’n Sys., Inc. v. Lycos, Inc., 478
F.3d 413, 422–23 (1st Cir. 2007). Courts have debated whether this conclusion was dicta, given that the First Circuit
ultimately concluded that the claim “would not survive” even if Section 230 did not apply. Id. at 423; compare
Friendfinder Network, Inc., 540 F. Supp. 2d at 299 (describing this statement as dicta), with Hepp, 14 F.4th at 210
(saying the merits discussion “was necessary only because” of the court’s Section 230(e)(2) ruling).
305 47 U.S.C. § 230(e)(2) (emphasis added).
306 Hepp, 14 F.4th at 210–11; Atl. Recording Corp., 603 F. Supp. 2d at 703–04; Friendfinder Network, Inc., 540 F.
Supp. 2d at 299–300.
307 See, e.g., Perfect 10, Inc. v. CCBill LLC, 488 F.3d 1102, 1119 (9th Cir. 2007).
308 Perfect 10, Inc., 488 F.3d at 1118–19.
309 Id. at 1118.
310 Id.
311 See id. at 1119; see also, e.g., Friendfinder Network, Inc., 540 F. Supp. 2d at 299 (“The Ninth Circuit made no
attempt to reckon with the presence of the term ‘any’—or for that matter, the absence of term ‘federal’—in § 230(e)(2)
when limiting it to federal intellectual property laws.”).
312 47 U.S.C. § 230(e)(3).
313 Atl. Recording Corp. v. Project Playlist, Inc., 603 F. Supp. 2d 690, 694, 702 (S.D.N.Y. 2009).
314 Compare, e.g., HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676, 683 (9th Cir. 2019) (holding that an
ordinance regulating home rentals “is not ‘inconsistent’ with the CDA” because it would not impose a duty on websites
to monitor third-party content), with, e.g., Backpage.com, LLC v. McKenna, 881 F. Supp. 2d 1262, 1273 (W.D. Wash.
2012) (holding that a state criminal law “is likely inconsistent with and therefore expressly preempted by Section 230”
because it would impose liability on websites for third-party content). Cf. Dangaard v. Instagram, LLC, No. C 22-
01101 WHA, 2022 WL 17342198, at *5 (N.D. Cal. Nov. 30, 2022) (citing Section 230(e)(3) and the “policy”
provisions in Section 230(b) as additional support for a ruling that Section 230(c)(1) did not bar certain claims).

operator “liable for statements he actually authored” was “consistent with” Section 230 and could
proceed.315
Electronic Communications Privacy Act of 1986
The fourth exception to Section 230 immunity is for claims brought under the Electronic
Communications Privacy Act of 1986 (ECPA) “or any similar State law.”316 ECPA is a federal law
that governs wiretapping and electronic eavesdropping.317 ECPA creates a number of criminal
offenses, which would fall within the first exception for federal crimes,318 but also contains civil
liability provisions, which fall within this fourth exception.319 Perhaps most relevant to service
providers that host user-generated content, ECPA makes it unlawful not only to intercept covered
communications intentionally, but also to disclose information intentionally if the person “ha[s]
reason to know that the information was obtained through” an unlawful interception.320 However,
the Seventh Circuit ruled in one case that this exception did not allow a lawsuit against companies
that provided web hosting services to people who sold illegally obtained videos.321 The court said
the plaintiffs had not shown that the web service companies had “disclose[d] any
communication,” declining to impose secondary liability on the web service providers absent
evidence that the companies provided “culpable assistance” to the “wrongdoer.”322
Sex Trafficking Law (FOSTA)
The fifth exception to Section 230 immunity was added in 2018 by the Allow States and Victims
to Fight Online Sex Trafficking Act of 2017 (FOSTA) and relates to certain sex trafficking
offenses.323 Specifically, Section 230(e)(5) provides that Section 230 will not bar (1) federal324
civil actions “brought under” 18 U.S.C. § 1595 “if the conduct underlying the charge would
constitute a violation of” 18 U.S.C. § 1591, which prohibits knowingly engaging in sex
trafficking of minors or in sex trafficking that involves force, fraud, or coercion, or knowingly
benefiting from participation in a venture that engaged in such an act;325 (2) state criminal

315 Cisneros v. Sanchez, 403 F. Supp. 2d 588, 592 (S.D. Tex. 2005) (emphasis added).
316 47 U.S.C. § 230(e)(4). See also, e.g., Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 421 (1st Cir. 2007)
(“We note that liability under the ECPA is specifically exempted from Section 230 immunity.”).
317 See generally CRS Report R41733, Privacy: An Overview of the Electronic Communications Privacy Act, by
Charles Doyle.
318 See 47 U.S.C. § 230(e)(1).
319 See id. § 230(e)(4).
320 18 U.S.C. §§ 2511 (outlining the prohibitions), 2520 (authorizing civil suits).
321 Doe v. GTE Corp., 347 F.3d 655, 662 (7th Cir. 2003).
322 Id. at 658–59.
323 Allow States and Victims to Fight Online Sex Trafficking Act of 2017, Pub. L. No. 115-164, § 4, 132 Stat. 1253,
1254 (2018). One state court described Section 230(e)(5) as a “rule of construction” going beyond a mere exception
applicable to the specified claims. In re Facebook, Inc., 625 S.W.3d 80, 100 (Tex. 2021). A number of federal courts
have described it as an “exception.” E.g., Does v. Reddit, Inc., 51 F.4th 1137, 1140 (9th Cir. 2022); Doe #1 v. MG
Freesites, Ltd., No. 7:21-cv-00220-LSC, 2022 WL 407147, at *10 (N.D. Ala. Feb. 9, 2022).
324 While one state court concluded Section 230(e)(5) also allows state civil suits that are “materially indistinguishable”
from liability under 18 U.S.C. § 1595, a number of federal courts have disagreed with this view, concluding Section
230(e)(5) does not allow state civil lawsuits. Compare In re Facebook, Inc., 625 S.W.3d at 100, with, e.g., Doe v.
Reddit, Inc., No. SACV 21-768 JVS(KESx), 2021 WL 4348731, at *5 (C.D. Cal. July 12, 2021), and J.B. v. G6 Hosp.,
LLC, No. 19-cv-07848-HSG, 2020 WL 4901196, at *7 (N.D. Cal. Aug. 20, 2020).
325 “Participation in a venture” is in turn defined as “knowingly assisting, supporting, or facilitating a violation” of
these provisions. 18 U.S.C. § 1591(e)(4). The D.C. Circuit concluded that this provision prohibits “aiding and abetting
sex trafficking.” Woodhull Freedom Found. v. United States, 72 F.4th 1286, 1298–99 (D.C. Cir. 2023).

prosecutions where the underlying conduct would violate 18 U.S.C. § 1591; and (3) state criminal
prosecutions where the underlying conduct would violate 18 U.S.C. § 2421A, which prohibits
“operat[ing] an interactive computer service . . . with the intent to promote or facilitate the
prostitution of another person” in jurisdictions where such conduct is illegal.326 There has been
significant litigation over the scope of this exception.
The FOSTA exception will apply only if a private plaintiff or state prosecutor can demonstrate
that the service provider or user violated the specified federal laws.327 A number of early cases considering the scope of this exception disagreed over whether the plaintiff had to prove a violation of the criminal law, 18 U.S.C. § 1591, or merely a violation of 18 U.S.C. § 1595, the provision allowing civil actions.328 This distinction was significant because the two provisions have different elements, most notably different mens rea, or mental state, requirements. As stated by one trial court, while both statutes authorize liability against
“beneficiaries” of a sex trafficking venture, “the criminal statute requires that the defendant have
actual knowledge of sex trafficking at issue while the civil statute allows a plaintiff to plead that
the defendant merely had constructive knowledge.”329
The Ninth Circuit and a number of trial courts have adopted the former reading, requiring
plaintiffs to prove the defendant violated the criminal law in order to avoid Section 230
immunity.330 In Does 1–6 v. Reddit, Inc., for example, the Ninth Circuit considered claims
alleging that the social media platform Reddit had “done little to remove” or prevent sexually
explicit images and videos of minors because such images and videos “drive[] user traffic” and
contribute to “substantial advertising revenue.”331 The plaintiffs alleged the FOSTA exception
applied and Reddit was liable “as a beneficiary of child sex trafficking.”332 To qualify for the
exception, the Ninth Circuit held that the “defendant-website’s own conduct must ‘underl[ie]’ the
claim” and violate the criminal provisions in 18 U.S.C. § 1591.333 Thus, the plaintiff had to prove
the website violated that law “by directly sex trafficking or, with actual knowledge, ‘assisting,
supporting, or facilitating’ trafficking.”334 In addition to requiring beneficiaries to possess actual
knowledge, the Ninth Circuit held that “mere association with sex traffickers” was insufficient to
violate the criminal law “absent some knowing ‘participation’ in the form of assistance, support,
or facilitation.”335 In the case before it, the Ninth Circuit concluded that the plaintiffs had alleged

326 47 U.S.C. § 230(e)(5).
327 See id.
328 See, e.g., Doe #1 v. MG Freesites, Ltd., No. 7:21-cv-00220-LSC, 2022 WL 407147, at *11–15 (N.D. Ala. Feb. 9,
2022) (acknowledging the disagreement and differing judicial resolutions but avoiding the question).
329 Id. at *11. 18 U.S.C. § 1595 provides: “An individual who is a victim of a violation of this chapter may bring a civil
action against the perpetrator (or whoever knowingly benefits, financially or by receiving anything of value from
participation in a venture which that person knew or should have known has engaged in an act in violation of this
chapter) . . . .” (emphasis added).
330 Does 1–6 v. Reddit, Inc., 51 F.4th 1137, 1141 (9th Cir. 2022); see also, e.g., M.H. & J. v. Omegle.com, LLC, No.
8:21-cv-814-VMC-TGW, 2022 WL 93575, at *6 (M.D. Fla. Jan. 10, 2022), appeal filed, No. 22-10338 (11th Cir. Jan.
31, 2022); Doe v. Kik Interactive, Inc., 482 F. Supp. 3d 1242, 1251 (S.D. Fla. 2020).
331 Does 1–6, 51 F.4th at 1139–40.
332 Id. at 1140.
333 Id. at 1143 (quoting 47 U.S.C. § 230(e)(5)(A)).
334 Id. at 1145 (quoting 18 U.S.C. § 1591(e)(4)).
335 Id. In another case, the D.C. Circuit concluded that 18 U.S.C. § 1591(e)(4) prohibits “aiding and abetting sex
trafficking,” or more precisely, “prohibits aiding and abetting a venture that one knows to be engaged in sex trafficking
while knowingly benefiting from that venture.” Woodhull Freedom Found. v. United States, 72 F.4th 1286, 1298–99
(D.C. Cir. 2023).

“only that Reddit ‘turned a blind eye’ to the unlawful content posted on its platform, not that it
actively participated in sex trafficking.”336
In contrast, the Seventh Circuit allowed broader liability in G.G. v. Salesforce.com, Inc., although its ruling did not weigh in on Section 230(e)(5).337 The minor plaintiff in that case was
trafficked on Backpage.com, and she alleged that Salesforce, which generally provides customer
relationship management software, “knowingly benefited from its participation in what it knew or
should have known was Backpage’s sex-trafficking venture.”338 The plaintiff alleged Salesforce
provided Backpage.com with specialized software and support that created a “close business
relationship.”339 The Seventh Circuit held that the allegations involving constructive knowledge
stated a claim under 18 U.S.C. § 1595 and avoided dismissal under Section 230.340 However, the
Seventh Circuit did not consider the scope of the FOSTA exception, instead concluding that
Section 230(c) did not apply on its own terms.341
Reform Proposals and Considerations for Congress
This section of the report discusses various proposals to reform Section 230, as well as some of
the legal considerations implicated by those proposals, including relevant First Amendment
issues. The report focuses primarily on examples of bills introduced in the 116th and 117th
Congresses that would have amended Section 230.342 The report does not discuss bills that would
have directly regulated content moderation practices such as by restricting the use of certain
algorithms or other design features, or imposing disclosure or transparency requirements outside
the Section 230 framework.
Overview of Reform Proposals and Select Legal Considerations
Following the enactment of FOSTA in 2018, Members of Congress introduced a variety of
proposals to further reform Section 230. There have also been a number of reform proposals from
outside commentators and the executive branch.343 While over 25 bills to amend Section 230
were introduced in each of the 116th and 117th Congresses, and Members of the 118th Congress
continued to introduce such proposals, no further amendments have been enacted.344 Although
there have been many proposals to reform Section 230’s immunity shield, some argue either that

336 Does 1–6, 51 F.4th at 1145. See also J.B. v. Craigslist, Inc., No. 22-15290, 2023 WL 3220913, at *1 (9th Cir. May
3, 2023) (mem.) (affirming trial court’s conclusion that Section 230(e)(5) did not allow lawsuit under 18 U.S.C. § 1595
where defendant lacked actual knowledge).
337 G.G. v. Salesforce.com, Inc., 76 F.4th 544, 548 (7th Cir. 2023).
338 Id. The suit was brought by the minor and her mother, but this report refers to a singular plaintiff for convenience.
339 Id.
340 Id. at 548, 555.
341 Id. at 567–68. This aspect of the case is discussed supra note 134 and accompanying text.
342 Some of the bills referenced in this discussion have been introduced in multiple Congresses, but for brevity, the
report generally only discusses one version.
343 See, e.g., CRS Legal Sidebar LSB10484, UPDATE: Section 230 and the Executive Order on Preventing Online
Censorship, by Valerie C. Brannon et al.; Joe Biden, Republicans and Democrats, Unite Against Big Tech Abuses,
WALL ST. J. (Jan. 11, 2023), https://www.wsj.com/articles/unite-against-big-tech-abuses-social-media-privacy-
competition-antitrust-children-algorithm-11673439411.
344 See, e.g., The Telecommunication Act’s “Good Samaritan” Protection: Section 230, DISRUPTIVE COMPETITION
PROJECT, https://www.project-disco.org/featured/section-230/ (last visited Jan. 4, 2024); see also CRS Report R46662,
Social Media: Misinformation and Content Moderation Issues for Congress, by Jason A. Gallo and Clare Y. Cho,
Appendix B (listing bills from the 116th Congress).

Section 230 should not be changed or that reforms should be modest and carefully considered.345
As commentators have observed, some of the reform proposals may conflict with others and
pursue divergent goals, making it more difficult to predict which of these reform efforts, if any,
may garner sufficient support to be enacted.346
While some Members of Congress have proposed to repeal Section 230 entirely,347 others suggest
more incremental rollbacks, removing immunity only for certain types of claims or certain
providers. Broadly, many proposals to reform Section 230 have pursued one of two distinct goals:
encouraging sites to take down more harmful content, and encouraging sites to leave up more lawful content.348 Some proposals include provisions aimed at both goals, although the goals can
sometimes be in tension depending on what type of content is being targeted. Content seen as
harmful by some may be seen as beneficial by others. The bills that have been introduced have
sought to achieve these goals in a variety of ways, including by creating new exceptions
to Section 230 or conditioning Section 230 immunity on certain prerequisite actions. Other bills
have focused on procedural aspects of Section 230, such as clarifying that the provision operates
as an “affirmative defense” or does not bar injunctive relief.349
One initial consideration is that proposals to remove Section 230 immunity for either hosting or
taking down certain types of content will not necessarily result in a provider or user being held
liable for that activity. Instead, these proposals would mean only that if a provider would be liable under some other law, Section 230 would not bar that liability. Nonetheless, because of
this threat of liability, a service provider may respond to the removal of Section 230 immunity for
specific types of actions by ceasing that action: removing content or no longer engaging in certain
content moderation practices. For example, Craigslist took down its personal ads section in
response to FOSTA, reportedly out of concern that it might face lawsuits based on some of the
activity in that section of its site.350
However, the way any given provider responds to an amendment of Section 230 would depend on
a variety of factors. For instance, if Congress added an exception removing immunity for hosting
certain content, providers might continue to host that content if they believe the benefits of

345 See, e.g., Sen. Ron Wyden, Wyden Remarks at Section 230 Briefing Hosted by EFF (Mar. 8, 2023),
https://www.wyden.senate.gov/news/press-releases/wyden-remarks-at-section-230-briefing-hosted-by-eff; Clyde
Wayne Crews, The Case Against Social Media Content Regulation, COMPETITIVE ENTER. INST. (June 1, 2020),
https://cei.org/issue_analysis/the-case-against-social-media-content-regulation; Eric Goldman, Want to Learn More
About Section 230? A Guide to My Work, TECH. & MKTG. L. BLOG (July 1, 2020),
https://blog.ericgoldman.org/archives/2020/07/want-to-learn-more-about-section-230-a-guide-to-my-work.htm;
Jennifer Huddleston, Does Content Moderation Need Changes to Section 230?, AM. ACTION FORUM (June 18, 2020),
https://www.americanactionforum.org/insight/does-content-moderation-need-changes-to-section-230/.
346 See, e.g., Rebecca Kern, White House Renews Call to ‘Remove’ Section 230 Liability Shield, POLITICO (Sept. 9,
2022), https://www.politico.com/news/2022/09/08/white-house-renews-call-to-remove-section-230-liability-shield-
00055771; Dean DeChiaro, OK to Change Section 230, Tech CEOs Say, But How Remains Elusive, CONG. Q (Nov. 17,
2020), https://plus.cq.com/doc/news-6052674.
347 E.g., S. 2972, 117th Cong. (2021); H.R. 8896, 116th Cong. (2020). Cf. 21st Century FREE Speech Act, S. 1384,
117th Cong. (2021) (proposing to repeal Section 230 and replace with an altered intermediary liability scheme).
348 Cf. Shaun B. Spencer, The First Amendment and the Regulation of Speech Intermediaries, 106 MAR. L. REV. 1, 8
(2022) (distinguishing “proxy-censor regulations” from “must-carry regulations”).
349 E.g., SAFE TECH Act, S. 299, 117th Cong. § 2(1) (2021).
350 See Brian Feldman, Craigslist’s Legendary Personals Section Shuts Down, N.Y. MAG. (Mar. 23, 2018),
https://nymag.com/intelligencer/2018/03/craigslist-shuts-down-personals-section-because-of-fosta.html. Craigslist
expressly cited FOSTA as the motive for its decision; others speculated that Reddit and Tumblr, among other sites,
made changes to their content policies in response to FOSTA. See, e.g., Paris Martineau, Tumblr’s Porn Ban Reveals
Who Controls What We See Online, WIRED (Dec. 4, 2018), https://www.wired.com/story/tumblrs-porn-ban-reveals-controls-we-see-online.

hosting the content outweigh potential litigation costs, particularly if providers believe they are
likely to prevail in any lawsuits or that lawsuits are unlikely. Some have pointed to the outcome in
Stratton Oakmont, Inc., discussed above, to suggest that absent Section 230, sites might stop all
content moderation to attempt to avoid liability.351 Economic and social considerations may also
factor into a provider’s decision about how to respond to Section 230 amendments.352 For
instance, if Congress limited immunity for restricting certain types of content, providers might
continue to restrict it if they believe advertisers or users disfavor that content.
Another general consideration is what type of defendants would be subject to liability under any
given proposal to amend Section 230. As previously discussed, Section 230 is currently available
to any provider or user of an interactive computer service, a broad term referring to any service
that enables multiple users to access a computer server.353 A wholesale exception to Section 230
would remove that federal immunity for all interactive computer service providers and users.
Some proposals to amend Section 230 would instead only limit Section 230 immunity for certain
providers, such as larger providers or providers of certain types of services such as social
media.354 For a discussion of policy considerations in defining “online platforms,” see CRS
Report R47662, Defining and Regulating Online Platforms, coordinated by Clare Y. Cho.
Finally, as discussed in more detail in a later section, proposals to reform Section 230 may also
raise First Amendment considerations.355
Liability for Hosting Content
A number of proposals would have exposed service providers or users to greater liability for
hosting another’s content, with the apparent goal of incentivizing content removal or restriction.
Several bills would have created new exceptions to Section 230 by amending subsection (e) to
carve out certain categories of claims, similar to FOSTA.356 Other bills would have created
exceptions only to Section 230(c)(1) for certain types of claims,357 content,358 or defendants.359
The type of claim or content carved out from Section 230 protections varied. For instance,
multiple bills focused on child sexual exploitation, allowing claims premised on conduct that
violates new or existing laws related to distributing child sexual abuse material.360 Other bills

351 Supra Stratton Oakmont, Inc. v. Prodigy Services Co. In Stratton Oakmont, Inc. v. Prodigy Servs. Co., No.
31063/94, 1995 WL 323710, at *4 (N.Y. Sup. Ct. May 24, 1995), a state court held that a message board host could be
subject to liability as the publisher of allegedly defamatory statements in part because it removed other messages.
352 For a discussion of the content recommendation and moderation policies of social media sites, see CRS Report
R46662, Social Media: Misinformation and Content Moderation Issues for Congress, by Jason A. Gallo and Clare Y.
Cho; and CRS In Focus IF12462, Social Media Algorithms: Content Recommendation, Moderation, and Congressional
Considerations, by Kristen E. Busch.
353 47 U.S.C. § 230(f)(2); supra “Text and Legislative History.”
354 E.g., Justice Against Malicious Algorithms Act of 2021, H.R. 5596, 117th Cong. § 2 (2021); Limiting Section 230
Immunity to Good Samaritans Act, H.R. 277, 117th Cong. § 2 (2021). See generally, e.g., Eric Goldman & Jess Miers,
Regulating Internet Services by Size, 2 COMPETITION POL’Y INT’L ANTITRUST CHRON. 24 (2021) (discussing
considerations related to distinguishing internet services based on size).
355 Infra “Free Speech Considerations.”
356 E.g., Civil Rights Modernization Act of 2021, H.R. 3184, 117th Cong. § 2(a) (2021); SAFE TECH Act, S. 299,
117th Cong. § 2(2) (2021); PLAN Act, H.R. 4232, 116th Cong. § 2(b) (2019).
357 E.g., Protecting Americans from Dangerous Algorithms Act, H.R. 8636, 116th Cong. (2020).
358 E.g., Health Misinformation Act of 2021, S. 2448, 117th Cong. (2021).
359 E.g., Accountability for Online Firearms Marketplaces Act, S. 2725, 117th Cong. (2021).
360 E.g., Holding Sexual Predators and Online Enablers Accountable Act, S. 5012, 116th Cong. § 5 (2020); EARN IT
Act of 2020, S. 3398, 116th Cong. § 5 (2020). See also, e.g., STOP CSAM Act of 2023, S. 1199, 118th Cong. § 5(e)
(2023); END CSAM Act, S. 823, 118th Cong. § 5(g) (2023).

would have created exceptions for certain lawsuits brought under state law, including breach of
contract claims361 or claims relating to property rentals.362 Still other bills would have exempted
claims under federal and state civil rights laws that prohibit discrimination on the basis of a
protected class.363 Some of these bills would have authorized liability only if a service provider
used certain types of algorithms to deliver the specified type of content.364
In addition to proposals that created exceptions for harmful or illegal content, some proposals would have effectively conditioned immunity on a service provider’s efforts to restrict harmful or
illegal content. For example, the CASE-IT Act, as introduced in the 116th Congress, stated that a
service provider or user would lose Section 230(c)(1) immunity for a year if they engaged in
activities such as permitting harmful content to be distributed to minors, if the harmful content
was “made readily accessible to minors by the failure of such provider or user to implement a
system designed to effectively screen users who are minors from accessing such content.”365
Apart from these bills focusing on specific claims or content, some bills would have more broadly
limited Section 230(c)(1) immunity.366 For example, some bills would have created new
exceptions for the enforcement of all federal civil laws.367 Other proposals would have exposed
providers to liability for hosting unlawful content if the provider was aware of that content.368 For
example, the Platform Accountability and Consumer Transparency Act (PACT Act), as introduced
in the 117th Congress, would have amended Section 230 so that some providers would lose
immunity under subsection (c)(1) if they were notified about certain illegal content or activity
occurring on their service and did not “remove the illegal content or stop the illegal activity”
within certain time periods.369 The PACT Act would have required written notice that contained,
among other elements, a copy of a court order finding the content or activity illegal.370

361 Limiting Section 230 Immunity to Good Samaritans Act, S. 3983, 116th Cong. § 2 (2020).
362 PLAN Act, H.R. 1107, 117th Cong. § 2(a) (2021).
363 E.g., Civil Rights Modernization Act of 2021, H.R. 3184, 117th Cong. § 2(a) (2021); SAFE TECH Act, S. 299,
117th Cong. § 2(2) (2021).
364 E.g., Health Misinformation Act of 2021, S. 2448, 117th Cong. § 3(a) (2021) (providing that a service provider
“shall be treated as the publisher or speaker of health misinformation” if it uses certain algorithms to promote that
content); Protecting Americans from Dangerous Algorithms Act, H.R. 8636, 116th Cong. § 2 (2020) (providing that
“an interactive computer service shall be considered to be an information content provider” and will not receive Section
230(c)(1) immunity in civil actions brought under 42 U.S.C. §§ 1985, 1986, or 18 U.S.C. § 2333, if the claim involves
the use of certain types of algorithms to deliver the relevant content, with exemptions for certain services).
365 CASE-IT Act, H.R. 8719, 116th Cong. § 2 (2020).
366 See, e.g., Stopping Big Tech’s Censorship Act, S. 4062, 116th Cong. § 2 (2020) (providing that both service
providers and users may only claim immunity under Section 230(c)(1) if a service “takes reasonable steps to prevent or
address the unlawful use” of the service “or the unlawful publication of information on” the service).
367 E.g., PACT Act, S. 4066, 116th Cong. § 7 (2020) (providing that Section 230 does not apply to the enforcement of
federal civil statutes or regulations); Stopping Big Tech’s Censorship Act, S. 4062, 116th Cong. § 2 (2020) (creating a
new exception for civil enforcement actions brought by the federal government arising out of violations of federal law).
368 E.g., See Something, Say Something Online Act of 2020, S. 4758, 116th Cong. § 5 (2020) (providing that a provider
“that fails to report a known suspicious transmission may be held liable as a publisher for the . . . transmission”). Cf.
Stop Shielding Culpable Platforms Act, H.R. 2000, 117th Cong. (2021) (stating that Section 230(c)(1) does not prevent
a provider or user “from being treated as the distributor of information”). The bill’s sponsor explained in a press release
that this was intended to allow liability if an entity “knowingly shares” information. Press Release, Rep. Jim Banks,
Chairman, Republican Study Committee, Banks Statement on the Stop Shielding Culpable Platforms Act (Mar. 22,
2021), https://banks.house.gov/uploadedfiles/stop_shielding_culpable_platforms_act_-_one-pager.pdf.
369 PACT Act, S. 797, 117th Cong. § 6(a) (2021). The bill defines “illegal activity” as content provider activity “that
has been determined by a” court “to violate Federal criminal or civil law.” Id. § 6(b). It defines “illegal content” as
information that a court has determined violates “(A) Federal criminal or civil law; or (B) State defamation law.” Id.
370 Id. § 6(a).

The notice-and-takedown liability regime of the PACT Act may be compared to the notice-and-
takedown procedures of the Digital Millennium Copyright Act (DMCA), enacted in 1998.371 The
DMCA provides a safe harbor to covered providers who remove content after being notified that
it may violate federal copyright law.372 The law protects them from lawsuits premised on hosting
potentially infringing content. While the PACT Act would have required the specified notice to
contain a court order adjudicating the challenged content as illegal, the DMCA essentially leaves
the initial determination of whether content is illegal to private parties. Under the DMCA, the
person notifying a service provider of copyright infringement must provide a good-faith assertion
under penalty of perjury that the use of the allegedly infringing material is unlawful.373 The
notifier thus has the initial burden of determining whether the material violates copyright laws.374
The provider hosting the allegedly infringing content then must decide whether to accept the
notice and expeditiously take down the material, or instead to ignore the notice and risk liability.
The DMCA can therefore incentivize removals by granting immunity to providers that remove
allegedly infringing material, creating the potential that providers will take down lawful material
rather than risk litigation.375 The PACT Act, in contrast, would have incentivized removal after a
court already determined the material violated the law.376 If a proposal to convert Section 230 into
a notice-and-takedown liability scheme instead left it to providers to determine in the first
instance whether activity on their site violated the law, then such a hypothetical proposal could,
like the DMCA, incentivize the removal of at least some lawful content.
Other bills would have effectively conditioned Section 230 immunity on the provider’s content
recommendation and moderation practices. For instance, some bills would have caused providers
to lose Section 230 immunity if they used certain algorithms to distribute content to users or
display behavioral advertising, regardless of whether the algorithmically distributed content or
behavioral advertising formed the basis of the suit.377 One bill would have required certain

371 17 U.S.C. § 512(c). It could also be compared to the notice-based liability imposed on distributors of defamatory
content. See supra notes 64 to 66 and accompanying text; cf., e.g., Barrett v. Rosenthal, 146 P.3d 510, 520 (Cal. 2006)
(comparing the DMCA’s “limited liability” scheme to Section 230, and concluding “that Congress did not intend to
permit notice liability under the CDA”).
372 See generally CRS In Focus IF11478, Digital Millennium Copyright Act (DMCA) Safe Harbor Provisions for
Online Service Providers: A Legal Overview, by Kevin J. Hickey; U.S. COPYRIGHT OFFICE, SECTION 512 OF TITLE 17
(2020), https://www.copyright.gov/policy/section512/section-512-full-report.pdf.
373 17 U.S.C. § 512(c)(3).
374 See, e.g., Lenz v. Univ. Music Corp., 815 F.3d 1145, 1151 (9th Cir. 2016) (acknowledging the copyright holder’s
obligation to state that the use is unauthorized and holding that this provision requires the holder to consider whether
the potentially infringing material is authorized as “fair use” of a copyright).
375 17 U.S.C. § 512(g)(1); see also, e.g., Wendy Seltzer, Free Speech Unmoored in Copyright’s Safe Harbor: Chilling
Effects of the DMCA on the First Amendment, 24 HARV. J. LAW & TECH. 171, 175 (2010) (discussing the incentive
structure and arguing that the DMCA results in removal of constitutionally protected speech). The DMCA also
provides a process for the user who posted the allegedly infringing material to challenge the initial notice. 17 U.S.C.
§ 512(g)(2)–(3). If there is such a “counter notification,” the provider may be able to replace the initial post and retain
immunity. Id. § 512(g)(2), (4).
376 See PACT Act, S. 797, 117th Cong. § 6 (2021).
377 E.g., Break Up Big Tech Act of 2020, H.R. 8922, 116th Cong. § 2 (2020) (providing that Section 230 will not apply
if a provider sells targeted advertising and displays the advertising to users who have not opted in, among other
provisions); Don’t Push My Buttons Act, S. 4756, 116th Cong. § 2 (2020) (providing that a provider generally may not
claim Section 230 immunity if the provider uses automated functions to deliver content to users based on information it
has collected about the user’s habits, preferences, or beliefs, with certain exceptions); BAD ADS Act, S. 4337, 116th
Cong. § 2 (2020) (preventing certain providers from claiming Section 230 immunity for 30 days after displaying
“behavioral advertising” to a user or providing user information to a person knowing the information will be used to
“create or display behavioral advertising”).

providers to “publicly disclose accurate information” about their “content moderation activity”
before they could claim Section 230(c)(1) immunity.378
Still other bills would have allowed liability if a provider or user promoted the content at issue in
a particular lawsuit,379 sometimes focusing specifically on providers’ use of algorithms.380 For
instance, the Justice Against Malicious Algorithms Act of 2021 would have provided that Section
230(c)(1) would not apply to certain providers that knew, should have known, or recklessly made
a personalized recommendation that materially contributed to a physical or severe emotional
injury.381 Taking a different approach, the DISCOURSE Act would have provided that certain
service providers would be deemed the “information content provider” for information targeted to
a user through an algorithm.382 This approach seemingly drew on cases ruling that Section 230 immunity does not apply if a provider creates or develops the challenged information, that is, if it acts as the information content provider.383 Both of these bills contained exceptions if the recommendations responded to a user search. At a general level, these proposals would have discouraged the use of recommendation algorithms. For more information on algorithms, see CRS
In Focus IF12462, Social Media Algorithms: Content Recommendation, Moderation, and Congressional Considerations, by Kristen E. Busch; and CRS Report R47753, Liability for Algorithmic Recommendations, by Eric N. Holmes.
Some bills would have targeted similar concerns by further amending the term “information
content provider” to encompass other types of activity. Multiple proposals would have treated a
person as an information content provider if the person “affirmatively and substantively”
modified another’s content,384 or solicited or funded information.385
Liability for Restricting Content
Some proposals would have limited providers’ and users’ immunity for restricting access to
another’s content, with the apparent goal of incentivizing the hosting of content. One preliminary
consideration in proposals seeking to limit immunity for restricting access to content is the
respective scope of Section 230(c)(1) and (c)(2), an issue discussed above.386 Because courts have
held that both provisions may immunize decisions to take down or otherwise restrict content, a
bill that seeks to limit such immunity may need to amend both provisions. Otherwise, if a
proposal amends only Section 230(c)(2) and does not address the scope of Section 230(c)(1),

378 DISCOURSE Act, S. 2228, 117th Cong. § 2(d)(2) (2021).
379 E.g., Platform Integrity Act, H.R. 9695, 117th Cong. § 2 (2022) (providing that Section 230(c)(1) will not apply if
the “provider or user has promoted, suggested, amplified, or otherwise recommended such information”).
380 E.g., Biased Algorithm Deterrence Act of 2019, H.R. 492, 116th Cong. § 2 (2019) (stating that if “an owner or
operator of a social media service . . . displays user-generated content in an order other than chronological order, delays
the display of such content relative to other content, or otherwise hinders the display of such content relative to other
content, if for a reason other than to restrict access to or availability of material described in [Section 230(c)(2)(A)] or
to carry out the direction of the user that generated such content,” that social media service “shall be treated as a
publisher or speaker of such content”).
381 Justice Against Malicious Algorithms Act of 2021, H.R. 5596, 117th Cong. § 2(a) (2021).
382 E.g., DISCOURSE Act, S. 2228, 117th Cong. § 2(a)(2)(A) (2021).
383 See supra “Information Provided by Another Information Content Provider.”
384 E.g., 21st Century FREE Speech Act, S. 1384, 117th Cong. § 2(a) (2021); Protect Speech Act, H.R. 8517, 116th
Cong. § 2 (2020); Online Freedom and Viewpoint Diversity Act, S. 4534, 116th Cong. § 2 (2020).
385 E.g., DISCOURSE Act, S. 2228, 117th Cong. § 2(a)(2)(A) (2021); Protect Speech Act, H.R. 8517, 116th Cong. § 2
(2020). Cf., e.g., SAFE TECH Act, S. 299, 117th Cong. § 2(1) (2021) (creating an exception to Section 230(c)(1) if the
provider or user accepted payment to make the speech available or funded the creation of speech).
386 Supra text accompanying notes 81 to 83.

courts might continue to apply (c)(1) to some takedown decisions even if the more limited immunity in (c)(2) no longer protected those decisions. In response to this issue, some
bills would have provided that Section 230(c)(1) does not apply to decisions to restrict content, so
that Section 230(c)(2) is the sole provision that immunizes takedown decisions.387
Some proposals would have amended the specific categories of material mentioned in Section
230(c)(2), changing the types of content covered by this provision.388 For example, a few
proposals would have deleted the catch-all term granting providers and users immunity for
restricting “otherwise objectionable material” and added new, more limited categories of material
such as material promoting “terrorism,” “violence,” or “self-harm.”389 One bill would have added
definitions for some of the existing categories of material: for example, defining “harassing”
material in part as material provided “with the intent to abuse, threaten, or harass any specific
person” and “lacking in any serious literary, artistic, political, or scientific value.”390 A bill
without such definitions may face interpretive questions about whether, for instance, specific
material promotes terrorism or self-harm. A number of these types of proposals would also have
granted immunity for removing “unlawful” material.391 Using the phrase “unlawful” makes these
proposals subject to the same questions discussed above regarding who determines whether the
content is unlawful and how.392
If a proposal retains the language in Section 230(c)(2) providing immunity for restricting material
“that the provider or user considers to” fall within the listed categories,393 it is likely courts would
continue interpreting this provision as embodying a subjective standard.394 Some proposals,
though, would have amended Section 230(c)(2) to state that it applies only if the provider or user
has an “objectively reasonable” belief that the content falls within one of the listed categories,395
seemingly inviting courts to engage in a more rigorous review of this belief.396
Other proposals would have limited immunity for takedown decisions in ways that depart more
substantially from the current Section 230 framework. Some proposals sought to condition
Section 230(c)(2) immunity on certain procedural requirements, such as requiring providers and
users to explain their decisions to restrict access to content.397 Other bills would have required

387 E.g., Protect Speech Act, H.R. 3827, 117th Cong. § 2 (2021); Online Freedom and Viewpoint Diversity Act, S.
4534, 116th Cong. § 2 (2020).
388 See, e.g., Stop the Censorship Act, H.R. 4027, 116th Cong. § 2 (2019) (replacing entire list of adjectives in Section
230(c)(2) with “unlawful”).
389 E.g., Preserving Political Speech Online Act, S. 2338, 117th Cong. § 4(2) (2021) (deleting “filthy” and “otherwise
objectionable” and adding “threatening, or promoting illegal activity”); Protect Speech Act, H.R. 8517, 116th Cong. § 2
(2020) (replacing “harassing, or otherwise objectionable” with “promoting terrorism or violent extremism, harassing,
promoting self-harm, or unlawful”); Stop the Censorship Act of 2020, H.R. 7808, 116th Cong. § 2 (2020) (replacing
“otherwise objectionable” with “unlawful, or that promotes violence or terrorism”).
390 21st Century FREE Speech Act, S. 1384, 117th Cong. § 2(a)(2) (2021).
391 Supra notes 388 and 389.
392 See supra text accompanying notes 368 to 376.
393 47 U.S.C. § 230(c)(2)(A).
394 See supra text accompanying notes 243 to 246.
395 E.g., DISCOURSE Act, S. 2228, 117th Cong. § 2(b)(1)(A) (2021); Protect Speech Act, H.R. 8517, 116th Cong. § 2
(2020); Online Freedom and Viewpoint Diversity Act, S. 4534, 116th Cong. § 2 (2020).
396 Cf. Holomaxx Techs. v. Microsoft Corp., 783 F. Supp. 2d 1097, 1104 (N.D. Cal. 2011) (“No court has articulated
specific, objective criteria to be used in assessing . . . a provider’s subjective determination of what is ‘objectionable’
. . . . Here, however, it is clear . . . that Microsoft reasonably could conclude that Holomaxx’s emails were ‘harassing’
and thus ‘otherwise objectionable.’” (emphasis added)).
397 E.g., Stopping Big Tech’s Censorship Act, S. 4062, 116th Cong. § 2 (2020).

providers (and sometimes users) to adopt public terms of service for restricting access to content
and then consistently apply those terms in order to benefit from Section 230 immunity.398
Some proposals would have made broader changes to limit the types of content restriction
decisions receiving immunity, focusing on the substance of those decisions. A number of bills
would have granted immunity for takedown decisions only if the provider or user acted in a
viewpoint-neutral manner.399 Another proposal would have stated that providers do not act in
“good faith” for purposes of Section 230(c)(2)(A) if they restrict material on the basis of certain
protected classes, including race, religion, sex, or “political affiliation or speech.”400
Perhaps departing most significantly from the current Section 230 framework, one bill would
have added a new provision to Section 230 making it unlawful for “any internet platform” to
restrict access to legally protected material and creating a private right of action to enforce this
provision.401 This bill would have raised questions similar to those posed by the proposals using the phrase
“unlawful” discussed above, regarding who determines whether material is legally protected. Under this
bill, another question might have been when material qualifies as “protected” under the Constitution or
other laws: whether a law is explicit enough to create protections, for instance, and whether any
protections must be absolute rather than qualified.
Free Speech Considerations
The Free Speech Clause of the First Amendment to the U.S. Constitution limits the government’s
ability to regulate speech.402 There are at least two distinct types of First Amendment issues that

398 E.g., 21st Century FREE Speech Act, S. 1384, 117th Cong. § 2(a)(2) (2021) (defining “in good faith” to mean,
among other requirements, that a provider or user (1) restricts access to material consistent with publicly available
terms of service; (2) does not restrict access to material on deceptive grounds; (3) does not restrict material that is
similarly situated to material the provider or user intentionally declines to restrict; and (4) gives the content provider
notice of the restriction and an opportunity to respond); Protect Speech Act, H.R. 8517, 116th Cong. § 2 (2020)
(providing that Section 230(c)(1) and Section 230(c)(2)(A) apply only if the provider or user (1) makes publicly
available terms of service for content moderation; (2) restricts content consistently with those terms of service; (3) does
not restrict content “on deceptive grounds or apply terms of service or use to restrict access to or availability of material
that is similarly situated to material that the service intentionally declines to restrict”; and (4) gives the content provider
“timely notice” of the basis for restricting access to the content and “a meaningful opportunity to respond”); Limiting
Section 230 Immunity to Good Samaritans Act, S. 3983, 116th Cong. § 2(1) (2020) (amending Section 230(c)(1) so
that it applies to a covered “edge provider” only if it adopts written terms of service for restricting material that
“promise” that the provider will (1) “design and operate” the service in “good faith,” a term defined as excluding
“intentionally selective enforcement” of the service’s terms of service, among other provisions, and (2) pay certain
damages and costs if the provider is found to have breached that promise).
399 E.g., DISCOURSE Act, S. 2228, 117th Cong. § 2(a)(2) (2021) (stating that dominant service providers will be
deemed information content providers if they “suppress a discernible viewpoint” of the information); Stopping Big
Tech’s Censorship Act, S. 4062, 116th Cong. § 2(1)(B)(iii) (2020) (providing that Section 230(c)(2)(A) will apply only
if a provider or user, among other requirements, acts “in a viewpoint-neutral manner”); Ending Support for Internet
Censorship Act, S. 1914, 116th Cong. § 2(a)(1) (2019) (providing that Section 230(c) will apply to larger providers
only if the FTC has certified that “the company does not moderate information . . . in a manner that is biased against a
political party, political candidate, or political viewpoint”).
400 Preserving Political Speech Online Act, S. 2338, 117th Cong. § 4(5) (2021).
401 Protecting Constitutional Rights from Online Platform Censorship Act, H.R. 83, 117th Cong. § 2(a)(2) (2021).
402 U.S. CONST. amend. I (“Congress shall make no law . . . abridging the freedom of speech.”). Although the text of the
First Amendment refers to “Congress,” it has long been understood to restrict action by the executive branch as well.
See, e.g., Columbia Broad. Sys., Inc. v. Democratic Nat’l Comm., 412 U.S. 94, 160 (1973) (Douglas, J., concurring)
(describing First Amendment as restricting Congress, whether “acting directly or through any of its agencies such as
the FCC”); see generally, e.g., Daniel J. Hemel, Executive Action and the First Amendment’s First Word, 40 PEPP. L.
REV. 601 (2013). The First Amendment applies to the states through the Fourteenth Amendment. 44 Liquormart v.
Rhode Island, 517 U.S. 484, 489 n.1 (1996); U.S. CONST. amend. XIV.
may be raised by proposals to amend Section 230. The first issue is whether any given proposal
unconstitutionally infringes the constitutionally protected speech of either providers or users. The
second is whether, if Section 230 is repealed in whole or in part, the First Amendment may
nonetheless prevent private parties or the government from holding providers liable for hosting
content. This section of the report first explains background principles on legal protections for
online speech, and then provides some initial considerations for evaluating these two issues.
Background Principles
First Amendment Protections for Online Speech
The Supreme Court has recognized that the First Amendment protects speech transmitted over the
internet, saying in one case that “cyberspace,” and in particular, “social media,” is today the most
important place for “the exchange of views” and other core speech activities.403 Accordingly, the
Court has invalidated a number of laws that regulate online speech, particularly if they target
speech based on its content.404 For example, in 1997, the Supreme Court struck down two
provisions of the Communications Decency Act of 1996 that prohibited sending or displaying
certain “indecent” or “patently offensive” material to minors.405
In addition to protecting website users when they post or read speech online, the First
Amendment protects website operators when they engage in speech activities.406 The Supreme
Court has concluded that a website designer engages in protected speech when designing a
website, even when the website incorporates third-party material.407 The Court has also
recognized that businesses engaged in speech activities generally have the right to refuse to host
customers’ speech, saying that the government may violate the First Amendment if it compels “a
private corporation to provide a forum for views other than its own.”408 This concern is
heightened if the business is providing a forum for speech409 and if there is a risk the user’s
speech will be attributed to the business hosting it, such that the business’s decision to host the

403 Packingham v. North Carolina, 582 U.S. 98, 104 (2017); see also id. at 107 (ruling unconstitutional a state law that
prohibited convicted sex offenders from using social media, barring “access to what for many are the principal sources
for knowing current events, checking ads for employment, speaking and listening in the modern public square, and
otherwise exploring the vast realms of human thought and knowledge”).
404 See, e.g., Reno v. ACLU, 521 U.S. 844, 870 (1997) (“[O]ur cases provide no basis for qualifying the level of First
Amendment scrutiny that should be applied to this medium.”). See generally Cong. Rsch. Serv., Overview of Content-Based and Content-Neutral Regulation of Speech, CONSTITUTION ANNOTATED, https://constitution.congress.gov/browse/essay/amdt1-7-3-1/ALDE_00013695/ (last visited Jan. 4, 2024).
405 Reno, 521 U.S. at 859–60. For more discussion of the constitutional tiers of scrutiny used to evaluate speech regulations, see CRS In Focus IF12308, Free Speech: When and Why Content-Based Laws Are Presumptively Unconstitutional, by Victoria L. Killion.
406 See generally CRS Report R45650, Free Speech and the Regulation of Social Media Content, by Valerie C.
Brannon.
407 303 Creative LLC v. Elenis, 600 U.S. 570, 587 (2023).
408 Pac. Gas & Elec. Co. v. Pub. Utils. Comm’n, 475 U.S. 1, 9 (1986) (plurality opinion); see also id. at 19–20 (holding
that a state regulatory commission could not require a utility company to publish content in its monthly newsletter from
entities who disagreed with the utility’s views); id. at 24 (Marshall, J., concurring) (“While the interference with
appellant’s speech is, concededly, very slight, the State’s justification—the subsidization of another speaker chosen by
the State—is insufficient to sustain even that minor burden.”); see also, e.g., Hurley v. Irish-Am. Gay, Lesbian &
Bisexual Grp. of Boston, 515 U.S. 557, 576 (1995) (“[W]hen dissemination of a view contrary to one’s own is forced
upon a speaker intimately connected with the communication advanced, the speaker’s right to autonomy over the
message is compromised.”).
409 Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1930 (2019).
speech can be seen as an expressive choice to be associated with that speech.410 For instance, the
Court has said that newspapers are engaged in constitutionally protected speech when they
“exercise . . . editorial control and judgment” in choosing what “material [will] go into a
newspaper,” and has further held that the government generally may not interfere with those
editorial judgments.411
Some lower courts have extended this line of Supreme Court cases to websites that host or
present third-party content, dismissing lawsuits premised on the sites’ editorial decisions about
what content to publish.412 The Supreme Court agreed to consider this issue in its October 2023
term, in two cases involving conflicting appeals court rulings.413 Both cases involve state laws
restricting online platforms’ ability to moderate user content.414 The Eleventh Circuit held that
social media companies making content moderation decisions are likely engaged in “protected
exercises of editorial judgment,”415 while the Fifth Circuit said the covered online platforms
“exercise virtually no editorial control or judgment.”416 Contrary to the conclusion of the
Eleventh Circuit, the Fifth Circuit said that the platforms screen out obscenity and spam but allow
the posting of “virtually everything else.”417
A 2016 decision by the D.C. Circuit somewhat similarly looked at the degree of editorial
judgment that online service providers actually exercised. In U.S. Telecom Association v. FCC,
the D.C. Circuit rejected a First Amendment challenge to the FCC’s 2015 net neutrality order.418
The 2015 order classified broadband internet access service providers as common carriers,
subjecting them to heightened regulation, including prohibiting these providers from blocking
lawful content.419 A broadband service provider argued that the rules violated its First Amendment
rights by forcing providers “to transmit speech with which they might disagree.”420 The D.C.
Circuit rejected this argument, concluding that there was no First Amendment issue because the

410 Compare Hurley, 515 U.S. at 575 (ruling that a state could not force a parade organizer to host a specific group
where the group’s “participation would likely be perceived as having resulted from the [organizer’s] . . . determination
. . . that its message was worthy of presentation and quite possibly of support as well”), with Rumsfeld v. Forum for
Acad. & Institutional Rights, Inc., 547 U.S. 47, 64–65 (2006) (rejecting challenge to federal funding condition
requiring law schools to host military recruiters, saying the hosting decision was not “inherently expressive” and
“[n]othing about recruiting suggests that law schools agree with any speech by recruiters”), and PruneYard Shopping
Ctr. v. Robins, 447 U.S. 74, 87 (1980) (concluding a shopping center did not have a First Amendment right to eject
students distributing pamphlets, acknowledging the pamphleteers’ views would “not likely be identified with . . . the
owner”). At least one scholar has argued this Supreme Court precedent suggests First Amendment protections apply if
an online platform is creating a “coherent speech product.” Eugene Volokh, Treating Social Media Platforms Like Common Carriers?, 1 J. FREE SPEECH L. 377, 427 (2021).
411 Miami Herald Publ’g Co. v. Tornillo, 418 U.S. 241, 258 (1974) (ruling unconstitutional a state law requiring
newspapers, in certain circumstances, to publish replies to criticisms of political candidates).
412 See, e.g., Isaac v. Twitter, Inc., 557 F. Supp. 3d 1251, 1261 (S.D. Fla. 2021); La’Tiejira v. Facebook, Inc., 272 F.
Supp. 3d 981, 992 (S.D. Tex. 2017); Publius v. Boyer-Vine, 237 F. Supp. 3d 997, 1008 (E.D. Cal. 2017); Zhang v.
Baidu.com, Inc., 10 F. Supp. 3d 433, 443 (S.D.N.Y. 2014).
413 NetChoice, LLC v. Att’y Gen., 34 F.4th 1196 (11th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023);
NetChoice, L.L.C. v. Paxton, 49 F.4th 439 (5th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023).
414 The cases, including the state laws, are discussed in more detail in CRS Legal Sidebar LSB10748, Free Speech Challenges to Florida and Texas Social Media Laws, by Valerie C. Brannon.
415 NetChoice, LLC, 34 F.4th at 1203.
416 NetChoice, L.L.C., 49 F.4th at 459.
417 Id.
418 U.S. Telecom Ass’n v. FCC, 825 F.3d 674, 740 (D.C. Cir. 2016).
419 Id. at 696; see also CRS Report R40616, The Federal Net Neutrality Debate: Access to Broadband Networks, by
Patricia Moloney Figliola; CRS Report R46973, Net Neutrality Law: An Overview, by Chris D. Linebaugh.
420 U.S. Telecom Ass’n, 825 F.3d at 740.
FCC’s rules “affect[ed] a common carrier’s neutral transmission of others’ speech, not a carrier’s
communication of its own message.”421 One critical basis for the D.C. Circuit’s conclusion was
the FCC’s finding that broadband providers did not, in fact, exercise control over the content they
transmitted, and instead acted “as ‘mere conduits for the messages of others, not as agents
exercising editorial discretion subject to First Amendment protections.’”422 The Eleventh Circuit,
in contrast, said “social-media platforms aren’t ‘dumb pipes’” that “reflexively transmit[] data
from point A to point B.”423
Justice Kavanaugh, then a judge on the D.C. Circuit, wrote an opinion disagreeing with the D.C.
Circuit’s approach to the First Amendment analysis.424 He argued internet service providers
“enjoy First Amendment protection of their rights to speak and exercise editorial discretion”
regardless of whether the providers actually choose to exercise much editorial discretion.425 In his
view, First Amendment protections should attach because internet service providers deliver
content to consumers, performing the same kinds of functions as more traditional media.426 It
remains to be seen whether Justice Kavanaugh will adhere to these views when considering the
cases appealed to the Supreme Court in the October 2023 term, or what approach the rest of the
Court will take to this issue. As discussed, lower court caselaw suggests that whether any given
lawsuit or regulation implicates the First Amendment by interfering with a provider’s editorial
discretion will likely depend in part on the factual circumstances and the nuances of the lawsuit or
regulation.
Section 230 Protections for Online Speech
Compared with this still-developing First Amendment caselaw, there is more precedent clarifying Section
230’s protections for promotion and moderation activities, and courts have described the law as protecting
the speech of both users and providers.
Section 230 arguably protects user speech by allowing providers to host user-generated content
without fear of incurring liability.427 The Fourth Circuit said in Zeran that in enacting Section
230, Congress was, in part, responding to concerns that online providers facing potential tort
liability would simply prohibit or remove user content rather than litigate its legality.428 By
shielding providers from that liability, Congress removed that incentive for providers to restrict
user speech.429 Further, in immunizing decisions both to host and not to host user content, Section
230 can also be seen as protecting possible First Amendment rights of editorial discretion.430
Significantly, the way courts have interpreted Section 230(c)(1) to grant immunity for “publisher”

421 Id.
422 Id. at 741 (quoting In re Protecting and Promoting the Open Internet, 30 FCC Rcd. 5601, 5870 (2015)).
423 NetChoice, LLC, 34 F.4th at 1204. The court said a social media platform likely exercised editorial judgment in two
ways: by removing posts that violate its terms of service and by arranging available content in certain ways. Id.
424 U.S. Telecom Ass’n v. FCC, 855 F.3d 381, 418 (D.C. Cir. 2017) (Kavanaugh, J., dissenting from denial of rehearing
en banc).
425 Id. at 428–29.
426 Id. at 428.
427 See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 330–31 (4th Cir. 1997); Ardia, supra note 78, at 386–87.
428 Zeran, 129 F.3d at 330–31.
429 Id. at 331.
430 Cf., e.g., Langdon v. Google, Inc., 474 F. Supp. 2d 622, 629–31 (D. Del. 2007) (concluding plaintiff’s claims are
barred by both the First Amendment and 47 U.S.C. § 230(c)(2)(A)).
activities seems consistent with the Supreme Court’s description of constitutionally protected
“editorial” functions.431
According to the Zeran court, Congress also intended to “encourage service providers to self-
regulate the dissemination of offensive material”—that is, to remove some user content.432
Granting providers immunity for their decisions to remove or restrict access to user content could
operate in some tension with the goal of encouraging providers to host user speech.433 But both
aspects of Section 230—providing providers with immunity for hosting user content and for
restricting content—were arguably intended to ensure that the government generally would not be
the entity striking the proper balance between these two goals,434 and that private parties would
instead be the ones deciding whether content belonged online.435 In this sense, then, both aspects
of Section 230 serve the First Amendment by shielding speech from government intervention.
Section 230 accordingly overlaps somewhat with the First Amendment. However, while Section
230 may protect some speech activities, Section 230 is not coextensive with the First
Amendment’s protections,436 as discussed in more detail below.437
First Amendment Issues with Reform Proposals
Any legislative proposal that regulates online content may implicate the First Amendment to the
extent that it burdens protected speech activity. As currently written, Section 230 does not itself
make any speech unlawful. Instead, it governs whether interactive computer service providers and
users may be subject to liability under other laws for their interactions with others’ content.438
Further, although Section 230 can be seen as speech-protective, the removal of Section 230’s
statutory speech protections would not affect the scope of any constitutional speech protections.

431 Compare, e.g., Zeran, 129 F.3d at 330 (“[L]awsuits seeking to hold a service provider liable for its exercise of a
publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—
are barred [by Section 230].”), with, e.g., Miami Herald Publ’g Co. v. Tornillo, 418 U.S. 241, 258 (1974) (“The choice
of material to go into a newspaper, and the decisions made as to limitations on the size and content of the paper, and
treatment of public issues and public officials . . . constitute the exercise of editorial control and judgment.”).
432 Zeran, 129 F.3d at 331.
433 See, e.g., Batzel v. Smith, 333 F.3d 1018, 1028 (9th Cir. 2003) (“We recognize that there is an apparent tension
between Congress’s goals of promoting free speech while at the same time giving parents the tools to limit the material
their children can access over the Internet. . . . [L]aws often have more than one goal in mind, and . . . it is not
uncommon for these purposes to look in opposite directions. . . . Tension within statutes is often not a defect but an
indication that the legislature was doing its job.”).
434 See 47 U.S.C. § 230(a)(4) (finding that the internet has “flourished, to the benefit of all Americans, with a minimum
of government regulation”); id. § 230(b)(2) (stating that it is the policy of the United States “to preserve the vibrant and
competitive free market that presently exists for the Internet and other interactive computer services, unfettered by
Federal or State regulation”).
435 See, e.g., 141 CONG. REC. H8470 (daily ed. Aug. 4, 1995) (statement of Rep. Christopher Cox) (“[W]e do not wish
to have a Federal Computer Commission with an army of bureaucrats regulating the Internet . . . .”); id. at H8470
(statement of Rep. Joe Barton) (arguing Section 230 provides “a reasonable way to . . . help [service providers] self-
regulate . . . without penalty of law”); id. at H8471 (statement of Rep. Rick White) (arguing the responsibility for
“protect[ing children] from the wrong influences on the Internet” should lie with parents instead of federal
government); id. at H8471 (statement of Rep. Bob Goodlatte) (“The Cox-Wyden amendment is a thoughtful approach
to keep smut off the net without government censorship.”).
436 See, e.g., Eric Goldman, Why Section 230 Is Better than the First Amendment, 95 NOTRE DAME L. REV. ONLINE 33,
34 (2019) (“Section 230 provides significant and irreplaceable substantive and procedural benefits beyond the First
Amendment’s free speech protections.”).
437 Infra “Comparing the Operation of First Amendment and Section 230 Protections.”
438 See 47 U.S.C. § 230.
Section 230 is not constitutionally required, and Congress could repeal it without violating the
First Amendment.439
Section 230 nonetheless affects constitutionally protected speech by creating government
incentives for certain speech activities, and accordingly, amendments to Section 230 could
implicate constitutional free speech concerns.440 A law is not necessarily unconstitutional merely
because it affects protected speech, however. Courts apply a variety of different tests to determine
whether government regulations implicating First Amendment interests are constitutional.441
Which analysis a court adopts depends on a variety of factors, including whether the regulation is
focused primarily on speech or on conduct,442 and whether the regulation targets only certain
types of speech.443
In general, laws that regulate speech based on its content or viewpoint will be considered
“presumptively unconstitutional and may be justified only if the government proves that they are
narrowly tailored to serve compelling state interests.”444 Content-neutral speech regulations, in
contrast, are generally evaluated under a more lenient standard and are more likely (but not
guaranteed) to be upheld against a First Amendment challenge.445 Specifically, content-neutral
laws that regulate speech are subject to intermediate scrutiny, which asks whether the restriction
is “narrowly tailored to serve a significant governmental interest” and “leave[s] open ample
alternative channels for communication of the information.”446 Further, Congress may be able to
target certain limited categories of speech that the Supreme Court has historically recognized can
be regulated more freely, such as obscenity or fraud, without triggering heightened scrutiny.447
As previously discussed, jurisprudence regarding First Amendment protections for editorial
discretion is still developing.448 Some Supreme Court cases suggest the protection for editorial
discretion may be absolute, while others suggest First Amendment protections in this realm may
track ordinary constitutional standards.449 Both the Fifth and Eleventh Circuits have taken the

439 See, e.g., Gucci Am., Inc. v. Hall & Assocs., 135 F. Supp. 2d 409, 422 (S.D.N.Y. 2001) (“Section 230 reflects a
‘policy choice,’ not a First Amendment imperative, to immunize ISPs from defamation and other ‘tort-based lawsuits,’
driven, in part, by free speech concerns.” (quoting Zeran v. Am. Online, Inc., 129 F.3d 327, 330–31 (4th Cir. 1997))).
440 See, e.g., Ashutosh Bhagwat, Do Platforms Have Editorial Rights, 1 J. FREE SPEECH L. 1, 135 (2021); Daphne
Keller, Who Do You Sue?, HOOVER INST., Aegis Series Paper No. 1902, at 3 (2019).
441 See generally, e.g., CRS Report R45650, Free Speech and the Regulation of Social Media Content, by Valerie C.
Brannon.
442 See, e.g., Ashcroft v. Free Speech Coal., 535 U.S. 234, 253 (2002) (“[T]he Court’s First Amendment cases draw
vital distinctions between words and deeds, between ideas and conduct.”); Rumsfeld v. Forum for Acad. & Institutional
Rights, Inc., 547 U.S. 47, 62 (2006) (upholding law where “the compelled speech . . . is plainly incidental to the [law’s]
regulation of conduct”).
443 See generally CRS In Focus IF12308, Free Speech: When and Why Content-Based Laws Are Presumptively Unconstitutional, by Victoria L. Killion.
444 Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015).
445 See, e.g., Universal City Studios, Inc. v. Corley, 273 F.3d 429, 451, 454 (2d Cir. 2001) (rejecting a First Amendment
challenge to a court order enforcing the Digital Millennium Copyright Act because the restriction targeted the
“functional,” “nonspeech” aspects of computer code, and was accordingly content neutral). In contrast, for example, the
Eleventh Circuit struck down certain aspects of a state law regulating online content moderation that it believed did not
survive intermediate scrutiny. NetChoice, LLC v. Att’y Gen., 34 F.4th 1196, 1227 (11th Cir. 2022), cert. granted, 216
L. Ed. 2d 1313 (2023).
446 Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989) (quoting Clark v. Cmty. for Creative Non-Violence, 468
U.S. 288, 293 (1984)).
447 See generally CRS In Focus IF11072, The First Amendment: Categories of Speech, by Victoria L. Killion.
448 Supra “First Amendment Protections for Online Speech.”
449 Compare Miami Herald Publ’g Co. v. Tornillo, 418 U.S. 241, 258 (1974) (“It has yet to be demonstrated how
(continued...)
latter approach, and in particular, suggested content-based laws regulating editorial discretion are
more likely to be ruled unconstitutional than content-neutral laws.450 If a Section 230 reform
proposal imposes direct requirements for providers or users to distribute or restrict content,451 it
may raise these First Amendment concerns.
The fact that Section 230 currently does not directly require or prohibit certain types of speech,
but merely creates incentives for moderating speech, is potentially another complicating factor in
determining the appropriate First Amendment analysis for reform proposals. Some have argued
that because Congress was not required to grant this immunity, it can restrict or condition Section
230 immunity without raising any constitutional concerns.452 Other commentators have argued
that speech-based limits on Section 230 immunity would run afoul of Supreme Court precedent
prohibiting unconstitutional conditions on government benefits.453
In other contexts, the Supreme Court has recognized that denying a benefit “to claimants who
engage in certain forms of speech is in effect to penalize them for such speech” and can have the
same “deterrent effect” as a more direct speech restriction.454 Under the unconstitutional
conditions doctrine, which has largely—but not solely455—been developed in the context of grant
programs, the government “may not deny a benefit to a person on a basis that infringes his
constitutionally protected interests.”456 Thus, the government might violate the First Amendment
if it uses a grant program to impose restrictions on private speech.457 At the same time, the

governmental regulation of this crucial process can be exercised consistent with First Amendment guarantees of a free
press . . . .”), with Turner Broad. Sys. v. FCC, 512 U.S. 622, 643–44 (1994) (applying an intermediate level of scrutiny
to regulations that “interfere with cable operators’ editorial discretion,” but where “the extent of the interference does
not depend upon the content of the cable operators’ programming”).
450 NetChoice, LLC, 34 F.4th at 1226; NetChoice, L.L.C. v. Paxton, 49 F.4th 439, 448, 457 (5th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023).
451 See, e.g., See Something, Say Something Online Act of 2020, S. 4758, 116th Cong. § 5 (2020) (amending Section
230 to include an affirmative requirement for providers to “take reasonable steps to prevent or address unlawful users
of the service through the reporting of suspicious transmissions”); CASE-IT Act, H.R. 8719, 116th Cong. § 2 (2020)
(creating a new private right of action allowing content providers to sue service providers that fail “to make content
moderation decisions pursuant to policies or practices that are reasonably consistent with the First Amendment”).
452 See, e.g., Craig Parshall, Big Tech and the Whole First Amendment, FEDERALIST SOC’Y (Aug. 14, 2020),
https://fedsoc.org/commentary/fedsoc-blog/big-tech-and-the-whole-first-amendment.
453 See, e.g., Bhagwat, supra note 440, at 135; Edwin Lee, Conditioning Section 230 Immunity on Unbiased Content Moderation Practices as an Unconstitutional Condition, 2020 U. ILL. J.L. TECH. & POL’Y 457, 467 (2020). For a discussion of the unconstitutional conditions doctrine, see Overview of Unconstitutional Conditions Doctrine, CONSTITUTION ANNOTATED, https://constitution.congress.gov/browse/essay/amdt1-7-13-1/ALDE_00000771/ (last visited Jan. 4, 2024).
454 Speiser v. Randall, 357 U.S. 513, 518 (1958); see also id. at 529 (concluding that a California provision requiring
veterans seeking a property tax exemption to swear a loyalty oath was unconstitutional because it placed the burden of
proof on the claimants).
455 See, e.g., Frost & Frost Trucking Co. v. R.R. Comm’n, 271 U.S. 583, 593–94 (1926) (holding that a state could not
place conditions on permits that would “require the relinquishment of constitutional rights”). Cf. FCC v. League of
Women Voters, 468 U.S. 364, 381 (1984) (ruling unconstitutional regulations that unduly interfered with broadcast
licensees’ ability to express their own “editorial opinion”). League of Women Voters involved a condition on a grant
program administered by the Corporation for Public Broadcasting, but the condition was analyzed under the
constitutional rubric that applies to broadcast licenses. See id. at 377–78.
456 Perry v. Sindermann, 408 U.S. 593, 597 (1972) (holding that a public university could not place a condition on
employment that violated a person’s free speech rights). See generally CRS Report R46827, Funding Conditions: Constitutional Limits on Congress’s Spending Power, by Victoria L. Killion.
457 See Agency for Int’l Dev. v. All. for Open Soc’y Int’l, Inc., 570 U.S. 205, 208, 218 (2013) (holding that a federal
condition requiring funding recipients to have “a policy explicitly opposing prostitution and sex trafficking” was
unconstitutional because it limited the recipients’ speech outside the bounds of the federal program); Legal Servs. Corp.
(continued...)
government can impose conditions that ensure funds “will be used only to further the purposes of
a grant.”458 Additionally, when the government uses a grant program to recruit “private entities to
convey a governmental message,” it may impose content- and viewpoint-based restrictions on
funded speech.459
Section 230 grants a legal benefit, in the form of immunity. To the extent that reform proposals
would impose conditions on that benefit related to editorial choices about distributing or
restricting others’ content, the unconstitutional conditions doctrine may be seen as an appropriate
framework to analyze the law’s constitutionality.460 This would mean conditions on Section 230
trigger First Amendment analysis—but if a court followed the cases analyzing grant programs,
the doctrine might allow certain content- and viewpoint-based restrictions on immunity.
It is unclear, however, how some aspects of this doctrine might apply outside the context of grant
programs and other forms of monetary benefits. In 2017’s Matal v. Tam, four members of the
Supreme Court concluded that unconstitutional conditions cases involving “cash subsidies or
their equivalent” were not relevant in analyzing speech restrictions in the context of trademark
registration, a nonmonetary government benefit.461 It may be difficult, for instance, to apply cases
asking whether a speech restriction serves “the purposes of a grant”462 to review conditions on
nonmonetary benefits that do not seem to exist to convey a clear “governmental message.”463 The
Court’s ruling in Tam further suggested, though, that viewpoint-based conditions on nonmonetary
government benefits sometimes violate the Constitution.464 Tam held that a federal law
prohibiting the registration of disparaging trademarks was unconstitutional under the First
Amendment, saying that “[s]peech may not be banned on the ground that it expresses ideas that
offend.”465 Like Section 230, the federal trademark law did not directly prohibit disparaging
speech; it merely limited the benefits of trademark registration.466 Section 230 also provides legal
protections for private speech, and Tam could thus suggest that any viewpoint-based conditions
on Section 230 immunity are unconstitutional.467 Tam did not consider content-based conditions
or other types of speech restrictions on nonmonetary benefits.468

v. Velazquez, 531 U.S. 533, 542, 548–49 (2001) (holding that a federal condition prohibiting funds from being used for
legal representation involving an effort to amend welfare law was unconstitutional, where the program “was designed
to facilitate private speech, not to promote a governmental message”).
458 Rust v. Sullivan, 500 U.S. 173, 198 (1991) (upholding a federal grant condition prohibiting health programs
receiving federal funding from encouraging the use of abortion).
459 Rosenberger v. Rector & Visitors of the Univ. of Va., 515 U.S. 819, 833–34 (1995).
460 E.g., Lee, supra note 453, at 466. Cf. Cathy Gellis, Section 230 Isn’t a Subsidy; It’s a Rule of Civil Procedure, TECHDIRT (Dec. 29, 2020), https://www.techdirt.com/articles/20201229/12003745970/section-230-isnt-subsidy-rule-civil-procedure.shtml (arguing that Section 230 is more similar to a rule of civil procedure than “some sort of tangible prize the government hands out selectively”).
461 Matal v. Tam, 582 U.S. 218, 240 (2017) (plurality opinion).
462 Rust, 500 U.S. at 198.
463 Rosenberger, 515 U.S. at 833.
464 See Tam, 582 U.S. at 243–44; id. at 247 (Kennedy, J., concurring).
465 Id. at 223 (majority opinion).
466 See id. at 226–27 (discussing the legal rights and benefits conferred by registration).
467 See also Legal Servs. Corp. v. Velazquez, 531 U.S. 533, 548–49 (2001) (“Where private speech is involved, even
Congress’ antecedent funding decision cannot be aimed at the suppression of ideas thought inimical to the
Government’s own interest.”).
468 In its October 2023 term, the Supreme Court is considering another First Amendment challenge in which the federal
government has argued a restriction on trademark registration is constitutional because it is a condition on a benefit
rather than a direct restriction on speech. E.g., Transcript of Oral Argument at 7–9, Vidal v. Elster, No. 22-704 (U.S.
Nov. 1, 2023).
If a court did not apply Supreme Court precedent on grant programs to conditions on Section 230
immunity, content- and viewpoint-based conditions on Section 230 could trigger heightened First
Amendment scrutiny under prevailing Supreme Court precedent.469 Section 230 already contains
content-based distinctions: Section 230(c)(2) extends immunity only to those providers and users
restricting access to specific types of “objectionable” content,470 arguably regulating speech on
the basis of its content and viewpoint.471 Courts have not weighed in on the constitutionality of
Section 230’s current content-based distinctions.472 Reform proposals that would add to the
current list of types of content in Section 230(c)(2) could create additional content- or viewpoint-
based distinctions.473 Other proposals would have created content-based exceptions to Section
230, allowing liability for hosting certain types of content.474 Some bills that would have created
new content-based exceptions seemed to target historically “unprotected”475 categories of speech
such as child sexual abuse material,476 and therefore might not trigger heightened constitutional
scrutiny on that basis.477
Other bills from prior Congresses would have limited providers’ editorial discretion by extending
immunity only to providers that moderate content in specific ways.478 It might be argued that
some of these proposals are content- or viewpoint-based, while others might be considered
content-neutral. Open doctrinal questions may complicate this analysis. The Eleventh Circuit held
that a state law prohibiting a platform from making content moderation decisions based on the
content of certain users’ posts was content-based because it applied based on the message
conveyed by the platform’s decision.479 Under this reasoning, proposals that extend Section 230
immunity only to providers or users who moderate content in a viewpoint-neutral manner could

469 See Reed v. Town of Gilbert, 576 U.S. 155, 163–64 (2015).
470 47 U.S.C. § 230(c)(2) (providing immunity to providers and users for certain decisions to restrict access to “material
that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise
objectionable, whether or not such material is constitutionally protected”).
471 See, e.g., Tam, 582 U.S. at 243 (plurality opinion) (“Giving offense is a viewpoint.”); Brown v. Ent. Merchs. Ass’n, 564 U.S. 786, 794 (2011) (holding that a law restricting the sale of “violent” works to children was content-based).
472 Cf., e.g., Woodhull Freedom Found. v. United States, 72 F.4th 1286, 1306 (D.C. Cir. 2023) (rejecting an argument
that FOSTA’s “Section 230(e)(5) selectively withdraws immunity on the basis of speech’s content or viewpoint,”
because that exception denies immunity only for unprotected speech integral to criminal conduct); Lewis v. Google,
851 Fed. Appx. 723, 724 n.2 (9th Cir. 2021) (rejecting a First Amendment overbreadth challenge to Section 230 in part
because the law “does not prohibit any speech”); Green v. Am. Online (AOL), 318 F.3d 465, 470 (3d Cir. 2003)
(rejecting plaintiff’s claim that Section 230(c)(2) violates the First Amendment by allowing providers to restrict
constitutionally protected material, noting that the provision did “not require” the provider to “restrict speech”).
473 E.g., Online Freedom and Viewpoint Diversity Act, S. 4534, 116th Cong. § 2 (2020) (replacing “otherwise
objectionable” in Section 230(c)(2) with “promoting self-harm, promoting terrorism, or unlawful”); supra note 389.
474 E.g., Public Servant Anti-Intimidation Act of 2022, H.R. 8962, 117th Cong. (2022) (providing that Section 230 will
not bar liability for the publication of the personal information of a public servant); Health Misinformation Act of 2021,
S. 2448, 117th Cong. (2021) (providing that a service provider “shall be treated as the publisher or speaker of health
misinformation” if it uses certain algorithms to promote that content); Ending Support for Internet Censorship Act, S.
1914, 116th Cong. (2019) (providing that Section 230(c) will not apply to larger providers unless the FTC has certified
that “the company does not moderate information . . . in a manner that is biased against a political party, political
candidate, or political viewpoint”).
475 For a discussion of the so-called unprotected categories, see CRS In Focus IF11072, The First Amendment: Categories of Speech, by Victoria L. Killion.
476 E.g., EARN IT Act of 2022, S. 3538, 117th Cong. § 5 (2022).
477 See Woodhull Freedom Found., 72 F.4th at 1306.
478 See supra notes 377 to 385 and accompanying text, and notes 397 to 401 and accompanying text.
479 NetChoice, LLC v. Att’y Gen., 34 F.4th 1196, 1226 (11th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023). In
Reed v. Town of Gilbert, 576 U.S. 155, 163 (2015), the Supreme Court said a law is content-based if it “draws
distinctions based on the message a speaker conveys.”
be considered content-based if they are also viewed as applying based on the message
conveyed.480 The Fifth Circuit and others have claimed, however, that these types of proposals
should be considered content-neutral because requiring viewpoint-neutrality does not itself single
out any particular viewpoint or subject matter.481 This debate implicates open questions in the
Supreme Court’s definition of what qualifies as a content-based law.482 Similar questions could be
raised by bills that would have restricted the availability of Section 230 immunity when content is
recommended or restricted by an algorithm, to the extent these decisions about what content to
transmit could be seen as conveying a message.483
Proposals that condition Section 230 immunity on adopting certain types of procedures to
promote or restrict content, across all types of content, may be more likely to be considered
content-neutral.484 This might include, for example, bills that would have conditioned immunity
for takedown decisions on providing notice of the restriction and an opportunity for the content
provider to respond.485 The Eleventh Circuit characterized state law provisions requiring the
disclosure of content moderation standards as content-neutral.486 The federal bills that would have
conditioned Section 230 immunity on providing publicly available content moderation practices
could be viewed in the same light.487 At the same time, the Supreme Court has said disclosure
requirements are a type of compelled speech and has applied a variety of First Amendment tests to them,
depending on the type of speech being compelled.488
Ultimately, given the open questions surrounding conditions on nonmonetary benefits, it is
difficult to say definitively how a court would analyze a First Amendment challenge to a limit or
condition on Section 230 immunity. As discussed, some Supreme Court precedent suggests that
laws that draw distinctions based on the content or viewpoint of speech may be subject to
heightened scrutiny, even in the context of a law that merely disfavors, rather than prohibits,
certain speech.489 However, the fact that any given Section 230 reform proposal does not directly
prohibit or compel speech would likely be a relevant factor in the First Amendment analysis.

480 See, e.g., Stopping Big Tech’s Censorship Act, S. 4062, 116th Cong. § 2 (2020).
481 See, e.g., NetChoice, L.L.C. v. Paxton, 49 F.4th 439, 480 (5th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023);
Spencer, supra note 348, at 59–60.
482 See CRS Legal Sidebar LSB10739, Refining Reed: City of Austin Updates Test for Content-Based Speech Restrictions, by Victoria L. Killion.
483 See, e.g., Platform Integrity Act, H.R. 9695, 117th Cong. § 2 (2022); Biased Algorithm Deterrence Act of 2019,
H.R. 492, 116th Cong. § 2 (2019). Cf., e.g., United States v. Rundo, 990 F.3d 709, 717 (9th Cir. 2021) (per curiam)
(holding a federal law prohibiting speech tending to “promote” a riot swept in constitutionally protected speech).
484 See Turner Broad. Sys. v. FCC, 512 U.S. 622, 643–44 (1994) (characterizing as content-neutral regulations that
“interfere with cable operators’ editorial discretion,” where “the extent of the interference does not depend upon the
content of the cable operators’ programming”).
485 E.g., Protect Speech Act, H.R. 3827, 117th Cong. § 2 (2021).
486 NetChoice, LLC v. Att’y Gen., 34 F.4th 1196, 1227 (11th Cir. 2022), cert. granted, 216 L. Ed. 2d 1313 (2023).
487 E.g., 21st Century FREE Speech Act, S. 1384, 117th Cong. (2021); Limiting Section 230 Immunity to Good
Samaritans Act, H.R. 277, 117th Cong. (2021).
488 See generally CRS In Focus IF12388, First Amendment Limitations on Disclosure Requirements, by Valerie C.
Brannon et al. Both the Fifth and Eleventh Circuits, for example, applied a relatively lenient standard known as
Zauderer review to evaluate state law provisions requiring notice and appeal of content moderation decisions—
although they disagreed on the outcome of that constitutional analysis. See NetChoice, L.L.C. v. Paxton, 49 F.4th 439,
485, 487 (5th Cir. 2022) (ruling the provision “easily passes muster under Zauderer”), cert. granted, 216 L. Ed. 2d
1313 (2023); NetChoice, LLC, 34 F.4th at 1230–31 (ruling the provision was likely unduly burdensome on speech).
489 See, e.g., Matal v. Tam, 582 U.S. 218, 224 (2017) (majority opinion).
Comparing the Operation of First Amendment and Section 230 Protections
Besides the constitutionality of Section 230’s immunity provisions and proposed reforms, another
relevant issue is the extent to which the First Amendment might prevent liability for hosting
content. The scope of First Amendment protections is important to understand the potential
consequences of Section 230 reforms. For example, FOSTA both created a new federal criminal
offense and added new exceptions to Section 230 immunity.490
prohibits operating an interactive computer service “with the intent to promote or facilitate the
prostitution of another person,”491 was challenged on constitutional grounds.492 Courts ultimately
rejected those challenges, reading the criminal law narrowly to avoid sweeping in protected
advocacy.493 Nonetheless, those cases could have affected not only the government’s ability to
enforce this federal criminal law but also courts’ determinations of whether providers and users
can face liability under the FOSTA exceptions to Section 230 immunity.
track this new criminal offense,494 courts might have concluded that the First Amendment
prevented prosecution.495
In a variety of legal contexts, courts have suggested that the First Amendment imposes a
heightened standard of liability, such as requiring proof of a higher level of intent, before speech
“distributors” such as bookstores and newsstands can be punished for circulating unlawful
content.496 And even in the context of lawsuits against publishers such as newspapers or
magazines, courts have sometimes imposed heightened standards where the liability is premised
on speech.497 Consequently, some commentators have argued that even if Section 230 were
repealed, the First Amendment would continue to prevent liability premised on hosting or
distributing speech.498 Although the Constitution likely would preclude civil or criminal liability

490 Allow States and Victims to Fight Online Sex Trafficking Act of 2017, Pub. L. No. 115-164, §§ 3–4, 132 Stat. 1253,
1253–54 (2018).
491 18 U.S.C. § 2421A.
492 Woodhull Freedom Found. v. United States, 72 F.4th 1286, 1296 (D.C. Cir. 2023); United States v. Martono, No.
3:20-CR-00274-N-1, 2021 WL 39584, at *1 (N.D. Tex. Jan. 5, 2021).
493 Woodhull Freedom Found., 72 F.4th at 1299; Martono, 2021 WL 39584, at *1.
494 47 U.S.C. § 230(e)(5)(C) (providing that Section 230 will not “impair or limit . . . any charge in a criminal
prosecution brought under State law if the conduct underlying the charge would constitute a violation of [18 U.S.C.
§ 2421A] and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or
facilitation of prostitution was targeted”).
495 Cf., e.g., United States v. Rundo, 990 F.3d 709, 717 (9th Cir. 2021) (per curiam) (holding a federal law prohibiting
speech tending to “promote” a riot impermissibly swept in constitutionally protected speech).
496 See, e.g., Smith v. California, 361 U.S. 147, 155 (1959) (holding that a law imposing criminal penalties on
bookstores that possess obscene material was unconstitutional under the First Amendment because it did not include
any element of scienter, or knowledge); Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 139 (S.D.N.Y. 1991)
(requiring proof of knowledge before a distributor may be held liable for defamation). See also, e.g., Bantam Books,
Inc. v. Sullivan, 372 U.S. 58, 70 (1963) (holding that a state commission violated the First Amendment by sending
book publishers notices threatening punishment under state obscenity laws, characterizing the scheme as a system of
prior administrative restraints that was impermissible because it lacked sufficient procedural safeguards).
497 See, e.g., N.Y. Times Co. v. Sullivan, 376 U.S. 254, 279–80 (1964) (requiring a showing of “actual malice” before a
“public official” may recover damages from a newspaper for a defamatory statement “relating to his official conduct”);
Braun v. Soldier of Fortune Magazine, Inc., 968 F.2d 1110, 1114 (11th Cir. 1992) (ruling that a magazine could be held
liable for negligently publishing an advertisement “only if the advertisement on its face would have alerted a
reasonably prudent publisher to the clearly identifiable unreasonable risk of harm”).
498 See, e.g., Note, Section 230 as First Amendment Rule, 131 HARV. L. REV. 2027, 2028 (2018); cf. Brent Skorup & Jennifer Huddleston, The Erosion of Publisher Liability in American Law, Section 230, and the Future of Online Curation, 72 OKLA. L. REV. 635, 637 (2020) (arguing that in the area of defamation law, “First Amendment considerations would likely lead courts to a § 230-like liability protection,” but noting differences in the two regimes).
in some circumstances, the protections of the First Amendment are likely not coextensive with
Section 230 immunity.499 Generally, this stems from the fact that Section 230 provides complete
immunity for covered activities absent an inquiry into whether the underlying content is
constitutionally protected, meaning that Section 230 likely protects at least some speech that the
First Amendment does not protect.
First, while Section 230 provides a complete bar to liability for covered activities, the First
Amendment may merely impose a heightened standard of liability if a lawsuit implicates
protected speech.500 One illustration comes from the pre-Section 230 rulings described above that
considered whether early online platforms hosting message boards could be held liable for
defamatory statements posted by users.501 In Cubby, the federal trial court concluded that
CompuServe should be treated as a distributor for purposes of analyzing the defamation claim.502
Accordingly, the court ruled that the plaintiff had to meet a heightened standard and prove that
CompuServe “knew or had reason to know of the allegedly defamatory . . . statements.”503 While
the trial court ultimately concluded that the plaintiff had not met this standard and dismissed the
defamation claim,504 it was theoretically possible for the plaintiff to prove the claim by submitting
sufficient evidence of CompuServe’s knowledge. By contrast, courts have ruled that Section 230
will bar a claim against a provider that merely publishes a defamatory statement regardless of
whether the provider actually knew about the statement.505 Accordingly, while heightened First
Amendment standards likely would lead courts to dismiss some lawsuits premised on speech,
plaintiffs with sufficient proof may be able to overcome those standards in circumstances where
Section 230 would have barred the suit. However, a few trial courts have concluded that the First
Amendment completely immunizes websites from certain civil claims without suggesting that
some heightened standard applies—similar to the current regime under Section 230.506
More generally, Section 230’s complete immunity for “publisher” activities has procedural
advantages for providers and users engaged in protected activity.507 As discussed above, the
inquiry into whether a service provider or user has engaged in “publisher” activities may overlap
with constitutional protections for “editorial” activity,508 but Section 230 nonetheless does not
require a court to investigate whether First Amendment activity has occurred. Accordingly,
Section 230 provides greater certainty for service providers and users that distributing or
restricting others’ speech will be protected from liability, without having to consider whether a
court would conclude the speech is constitutionally protected.509 In at least some cases, courts

499 See generally, e.g., Goldman, Why Section 230 Is Better than the First Amendment, supra note 436 (discussing ways
Section 230 offers more protection, both substantive and procedural, than the First Amendment).
500 See, e.g., id. at 38–39 (noting that “sufficient scienter can override” First Amendment protections in defamation
cases, but Section 230 “moot[s] inquiries into defendants’ scienter”).
501 See supra Stratton Oakmont, Inc. v. Prodigy Services Co.
502 Cubby, Inc. v. CompuServe, Inc., 776 F. Supp. 135, 140 (S.D.N.Y. 1991).
503 Id. at 140–41.
504 Id. at 141.
505 See, e.g., Zeran v. Am. Online, Inc., 129 F.3d 327, 334 (4th Cir. 1997) (concluding Section 230 barred claim that
provider could be held liable for defamation as a distributor with knowledge of the statement).
506 E.g., Zhang v. Baidu.com, Inc., 10 F. Supp. 3d 433, 443 (S.D.N.Y. 2014); Langdon v. Google, Inc., 474 F. Supp. 2d
622, 629–30 (D. Del. 2007); Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, 2003 WL 21464568, at *3–
4 (W.D. Okla. May 27, 2003).
507 See, e.g., 47 U.S.C. § 230(c)(1) (providing that a service provider or user may not be treated as a “publisher” of
another’s content); id. § 230(c)(2) (extending immunity for decisions to restrict certain material “whether or not such
material is constitutionally protected”).
508 See supra note 431.
509 See, e.g., Goldman, Why Section 230 Is Better than the First Amendment, supra note 436, at 42–43.
may dismiss a lawsuit against a provider on Section 230 grounds at an early stage in the litigation
based on the allegations alone.510 Whether early dismissal is warranted, however, will depend on
the elements of the claim, the factual circumstances, and the particulars of any Section 230 or
First Amendment defense. For example, as discussed above, allegations that a provider acted in
bad faith have prevented providers from obtaining early dismissal under Section 230(c)(2)(A).511
Nonetheless, some commentators believe that in most cases, Section 230 will allow a quicker
dismissal than the First Amendment.512
Accordingly, while the First Amendment might prevent some claims premised on decisions to
host or restrict others’ speech, its protections are likely less extensive than the current scope of
Section 230 immunity.

Author Information

Valerie C. Brannon, Legislative Attorney
Eric N. Holmes, Attorney-Adviser (Constitution Annotated)



510 See id. at 39–40.
511 See supra note 227.
512 E.g., Gellis, supra note 460.



Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan
shared staff to congressional committees and Members of Congress. It operates solely at the behest of and
under the direction of Congress. Information in a CRS Report should not be relied upon for purposes other
than public understanding of information that has been provided by CRS to Members of Congress in
connection with CRS’s institutional role. CRS Reports, as a work of the United States Government, are not
subject to copyright protection in the United States. Any CRS Report may be reproduced and distributed in
its entirety without permission from CRS. However, as a CRS Report may include copyrighted images or
material from a third party, you may need to obtain the permission of the copyright holder if you wish to
copy or otherwise use copyrighted material.
