Legal Sidebar
State Social Media Laws at the Supreme Court
February 20, 2024
On February 26, 2024, the Supreme Court is scheduled to hear oral argument on the constitutionality of Florida and Texas laws that restrict online platforms’ ability to moderate user content. The U.S. Courts of Appeals opinions in these cases present two different views of the First Amendment rights at issue. The Eleventh Circuit concluded that parts of the Florida law were likely unconstitutional because they unduly restricted the editorial judgment of the covered platforms. This decision was consistent with the way a number of trial courts have characterized the First Amendment rights of websites that host user speech. In contrast, the Fifth Circuit upheld the Texas law as constitutional, saying the covered platforms were engaged in “censorship,” not protected speech. A Supreme Court ruling in these cases could have significant implications for Congress as it considers whether and how to regulate online platforms. This Legal Sidebar first discusses the relevant First Amendment principles at stake, then explains the background of the two cases and the parties’ arguments at the Supreme Court.
Free Speech Protections for Speech Hosts
The First Amendment prevents the government from unduly infringing speech, including the speech of private companies. The Supreme Court has long recognized that companies may be engaged in protected speech both when they create their own speech—which can include activities like designing a website for customers—and when they provide a forum for others’ speech. When a private entity hosts speech, it may “exercise editorial discretion over the speech and speakers in the forum.” For instance, in Miami Herald Publishing Co. v. Tornillo, the Supreme Court struck down a state law requiring newspapers to publish certain pieces from political candidates. The Court reasoned that newspapers “exercise editorial control and judgment” over what material to print and how to present the content, and ruled that the First Amendment prevented the government from regulating “this crucial process.”
The Court has recognized these protections for editorial judgment outside the context of traditional media. To take one example, in Hurley v. Irish American Gay, Lesbian & Bisexual Group of Boston (discussed in more detail in a CRS podcast), the Court held that a parade organizer’s decisions about who could march were expressive even though the parade as a whole did not communicate one coherent, particularized message. The Court said the organizer’s decision to include a parade unit suggested the organizer would be celebrating that group’s message, and accordingly, the First Amendment protected the decision to exclude a certain group. Federal trial courts have applied these principles to online speech, citing the First Amendment to dismiss private lawsuits that have challenged websites’ editorial decisions about what content to publish.
Congressional Research Service
https://crsreports.congress.gov
LSB11116
CRS Legal Sidebar
Prepared for Members and
Committees of Congress
In some cases, however, the Supreme Court has concluded the government can force a private entity to host others’ speech. The distinction between these two lines of cases has not always been clear, but factors that may be relevant are the type of media, whether the business is providing a forum for speech, whether the host in fact exercises discretion over the speech it hosts, and whether there is a risk the third party’s speech will be attributed to the host. For instance, in U.S. Telecom Ass’n v. FCC, the D.C. Circuit rejected a First Amendment challenge to a 2015 net neutrality order that classified broadband internet access service providers as common carriers and prevented them from blocking lawful content. Critically, the court concluded those providers did not actually make editorial decisions picking and choosing which speech to transmit. Then-Judge Kavanaugh, however, disagreed with this “use it or lose it” analysis. He argued the providers were entitled to First Amendment protection because they transmit internet content, regardless of whether they actually choose to exercise their editorial discretion.
If a court concludes that a host exercises protected editorial discretion, it may then ask what government infringements on that speech activity may be permissible. Similarly to Tornillo and Hurley, some federal trial courts have seemed to take an absolute approach, concluding that no infringement on editorial discretion is permissible. In other contexts, however, courts have applied different levels of constitutional scrutiny that allow the government to justify infringing protected activity. Broadly, if a law compels specific messages or targets speech based on its content, courts will usually apply a demanding standard known as “strict scrutiny.” Under strict scrutiny, a law is presumptively unconstitutional and must be narrowly tailored to serve compelling state interests. If a law is content-neutral, or if it primarily targets nonexpressive conduct and only incidentally regulates speech, a court may apply “intermediate scrutiny.” This standard requires the restriction on speech to be no greater than essential to further an important or substantial government interest. For example, the Court applied intermediate scrutiny to evaluate a law requiring cable operators to carry certain local broadcast stations, after concluding the law’s application did not depend on the content of the operators’ programming and was instead based on special characteristics of the medium. The Court also evaluates commercial speech regulations using intermediate scrutiny; however, it has reviewed certain commercial disclosure requirements under an even more lenient standard prescribed in Zauderer v. Office of Disciplinary Counsel. Zauderer upheld a state law requiring attorneys to include certain statements in their advertisements after ruling the disclosures were “reasonably related” to the state’s interest in preventing consumer deception.
State Laws and Procedural History
The details of Florida’s and Texas’s content moderation laws and the litigation in the lower federal courts are discussed in this prior Legal Sidebar. An abbreviated discussion is below.
Florida: NetChoice v. Moody
In May 2021, Florida enacted a
law limiting computer services’ ability to restrict user content. The law
applied to any service that meets certain size thresholds and “provides or enables computer access by
multiple users to a computer server.” Thus, the law included not only social media sites but also, for
instance, internet service providers and offline entities that provide computer access. (For ease of
reference, this discussion uses the term “platforms” to refer to the entities covered by both states’ laws.)
Florida’s law required platforms to apply their moderation standards consistently and limited how often
platforms could change their moderation rules. It also required platforms to give notice and explanation
before the platform could restrict users’ content. Further, the law completely prohibited platforms from
removing or restricting the content of political candidates or “journalistic enterprises.” The law also
contained other disclosure provisions, such as requiring platforms to share terms of service and provide
data about how many people viewed a user’s posts.
Two trade groups, NetChoice and the Computer & Communications Industry Association (CCIA), sued to enjoin this law, claiming it violated the First Amendment rights of their members. In May 2022, the Eleventh Circuit partially affirmed a preliminary injunction preventing the state from enforcing this law. The court held that the provisions limiting platforms’ ability to engage in content moderation were likely unconstitutional, but rejected the constitutional challenges to most of the disclosure requirements.
The Eleventh Circuit first concluded that the law triggered First Amendment scrutiny by restricting the platforms’ exercise of editorial judgment and imposing disclosure requirements. More specifically, the court ruled that the regulated platforms “exercise editorial judgment that is inherently expressive”: they express a message of disagreement or disapproval when they “choose to remove users or posts, deprioritize content in viewers’ feeds or search results, or sanction breaches of their community standards.” Accordingly, the law’s content moderation provisions—those that would prohibit restricting content or control how platforms apply or change their moderation standards—triggered either strict or intermediate First Amendment scrutiny. The court said it did not need to decide exactly what standard applied to each of these provisions because they could not withstand even intermediate scrutiny. The court rejected a hypothetical state interest in preventing private “censorship” or promoting a variety of views by citing Supreme Court precedent establishing “there’s no legitimate—let alone substantial—governmental interest in leveling the expressive playing field.”
The court applied Zauderer to review the disclosure provisions, both the notice-and-explanation requirement and the other provisions. However, while the court ruled that most of the disclosure provisions satisfied Zauderer’s lenient constitutional standard, it held that the notice-and-explanation requirement did not. The court said this provision was unduly burdensome; it was “practically impossible to satisfy” and so would be likely to chill the platforms’ exercise of editorial judgment.
Texas: NetChoice v. Paxton
Texas’s law, enacted in September 2021, also imposed content moderation and disclosure requirements on platforms, but had a slightly different scope than Florida’s law. Texas’s law applied more narrowly to large public sites that allow users to create accounts and communicate with others “for the primary purpose” of sharing information. The law also excluded certain services such as internet service providers, email, and certain news sites. Texas’s law prohibited covered platforms from “censor[ing]” users based on viewpoint or location in Texas, although it contained exceptions allowing providers to remove certain types of unlawful or otherwise harmful content. The law required platforms to provide users with notice and an opportunity to appeal when their content is removed. Finally, the law also imposed other disclosure requirements relating to the platforms’ content moderation standards.
NetChoice and CCIA also challenged the Texas law under the First Amendment. The Fifth Circuit, however, reversed a preliminary injunction against the law, concluding the trade groups were unlikely to succeed on their constitutional claims. (One judge dissented in part.) The Fifth Circuit recognized that its opinion disagreed with the Eleventh Circuit’s reasoning, including how to interpret Supreme Court precedent discussing editorial discretion.
Throughout its opinion, the Fifth Circuit rejected the First Amendment arguments by characterizing the plaintiffs as asserting a “right to censor,” not a protected speech right. The court said the platforms regulated by Texas’s law were “nothing like” the newspapers in Tornillo. The court concluded the platforms “exercise virtually no editorial control or judgment,” describing them as using algorithms to screen out obscenity and spam but posting “virtually everything else.” The Fifth Circuit believed cases like Hurley apply only when a host is “intimately connected” with the third-party speech, and said the platforms are not so connected, in part because they do not curate an overall message. In addition to declaring Tornillo and Hurley inapposite, the Fifth Circuit also stated that the Supreme Court has not recognized editorial discretion as a type of protected speech. Instead, the court held that the platforms
could be treated as “common carriers subject to nondiscrimination regulation.” (While the D.C. Circuit’s U.S. Telecom Association case involved common carrier classification under the Communications Act, the Fifth Circuit looked to a historical, common-law definition of common carriers as a special class of “communication and transportation providers” that must serve all comers.) In the alternative, the court ruled that even if the law did implicate the platforms’ First Amendment rights, it would trigger at most intermediate scrutiny, which the state could satisfy. In contrast to the Eleventh Circuit, the Fifth Circuit said protecting the free exchange of a variety of ideas is an important government interest.
Like the Eleventh Circuit, the Fifth Circuit concluded the disclosure provisions were subject to review under Zauderer. Unlike the Eleventh Circuit, however, it held that none of the disclosure provisions, including the notice-and-explanation requirements, was overly burdensome; all satisfied this level of constitutional review.
Party Arguments at the Supreme Court
Both Florida and the trade groups appealed the Eleventh Circuit’s ruling to the Supreme Court, and the trade groups appealed the Fifth Circuit’s ruling. The Supreme Court agreed to hear Moody v. NetChoice (Florida’s appeal) and NetChoice v. Paxton (the Texas case), limited to the questions of whether the laws’ “content-moderation restrictions” and “individualized-explanation requirements” comply with the First Amendment. Thus, the Court did not agree to consider all the laws’ disclosure provisions, only the provisions requiring platforms to explain their moderation decisions.
In their briefs, the trade groups claim the content moderation restrictions and explanation requirements in both laws violate the First Amendment by forcing private parties to host speech with which they disagree. The trade groups cite Tornillo and Hurley as recognizing constitutional protections for private parties’ editorial judgments. The groups contend these principles extend online, including to platforms’ post-publication review of user content. Responding to the Fifth Circuit opinion, they point out that Hurley found the parade organizer to be “intimately connected” to the speech it compiled even though the parade failed to convey “an exact message.” The covered platforms are not “common carriers,” they claim, because the platforms “constantly engage in editorial filtering . . . pursuant to policies they publish and enforce.” In their view, the fact that platforms make “individualized determinations about which speech to disseminate and how”—unlike a common carrier—was the very reason Florida and Texas enacted laws limiting this discretion. The United States filed a brief in support of the trade groups, defending the Eleventh Circuit’s approach to editorial discretion—although that brief also suggested the trade groups’ view of speech hosts’ rights sweeps too broadly at times.
The trade groups argue the Florida and Texas laws should be subject to strict scrutiny because the laws compel speech and contain other content-based distinctions. The Florida law’s focus on journalistic enterprises and speech by or about political candidates arguably singles out specific subject matters for different treatment. The Texas law contains exceptions that allow sites to censor specific types of content, and completely excludes news, sports, and entertainment sites apparently based on the content of the speech they carry. Finally, more specifically addressing the notice-and-explanation requirements, the trade groups observed that the Supreme Court has never applied Zauderer to uphold a disclosure requirement outside the context of correcting misleading advertising, and argued that Zauderer’s lenient review should not apply to requirements that “have nothing to do with advertising.”
Both Florida and Texas portray their laws as permissible nondiscrimination regulations, arguing the laws regulate nonexpressive conduct rather than speech. Florida, for instance, claims that the platforms’ hosting of third-party speech is “inherently nonexpressive conduct” because the platforms “are generally open to all users and content” and most often do not make individualized decisions about whether to allow specific content. Florida contrasts this to the “deliberate selection and expression” at issue in cases like Hurley. Further, Florida asserts the platforms’ decisions about how to arrange content are not made
with an intent to convey a message or promote certain content. It further claims that the platforms do not have an expressive interest in censoring journalistic enterprises or political candidates.
Texas’s arguments in favor of its law are somewhat different. Texas argues that its law regulates conduct because the platforms’ “‘dominant market shares’ allow them to exercise ‘unprecedented’ and ‘concentrated control’ over the world’s speech”—including by “favoring certain viewpoints.” Texas focuses on historical regulation of certain mediums of communication, asserting the covered platforms “are today’s descendants” of common carriers. A group of almost 20 states filed an amicus brief in support of Florida and Texas that raised somewhat similar arguments, focusing largely on the historical precedent for regulating platforms for mass communications. Like the Fifth Circuit, Texas claims there is no First Amendment right of “editorial discretion.” Instead, Texas argues (among other factors) that, to avoid being treated as common carriers, platforms must provide their services according to individualized contracts that vary by customer, rather than on general terms to all customers.
In the alternative, both Florida and Texas argue that even if their content moderation provisions implicate speech, they are content-neutral laws that should survive intermediate scrutiny. Both states also claim their notice-and-explanation provisions should be upheld under Zauderer, saying that the Supreme Court has never expressly limited that case to the advertising context and arguing that any compliance burdens on platforms will be minimal.
Considerations for Congress
These two cases involve the constitutionality of state laws, but some Members of the 118th Congress have also expressed interest in regulating online platforms, and more specifically in regulating online content moderation. For instance, one bill would make it unlawful for a social media service to “de-platform” citizens based on their “social, political, or religious status.” Some bills would limit the scope of a federal immunity provision known as Section 230, with the goal of disincentivizing sites from restricting user content—for example, allowing liability if certain sites engage in content moderation activity that promotes “a discernible viewpoint.” A number of other bills would impose various transparency requirements on platforms, such as requiring platforms to disclose their terms of service, including their content moderation practices. Some of these bills would specifically require an explanation of platforms’ decisions to restrict content and require platforms to provide a complaint process to appeal the decision. If the Supreme Court clarifies the scope of constitutional protections for editorial judgment and, more specifically, weighs in on the constitutionality of Florida’s and Texas’s content moderation and disclosure provisions, its opinion could affect Congress’s ability to enact similar proposals.
Author Information
Valerie C. Brannon
Legislative Attorney
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff
to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of
Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of
information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role.
CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United
States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However,
as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the
permission of the copyright holder if you wish to copy or otherwise use copyrighted material.
LSB11116 · VERSION 1 · NEW