Legal Sidebar
Free Speech Challenges to Florida and Texas
Social Media Laws
Updated September 22, 2022
Two U.S. Courts of Appeals have recently taken different positions on the validity of state laws restricting internet services’ ability to moderate user content. In May, the Eleventh Circuit largely upheld a preliminary injunction ruling Florida’s Senate Bill 7072 likely unconstitutional, preventing the law from taking effect. This ruling contrasts with a September ruling from the Fifth Circuit rejecting a challenge to a somewhat similar Texas law, H.B. 20. As explained in more detail in this Legal Sidebar, the two opinions take different views of whether these laws likely violate the constitutional free speech rights of online platforms. This Legal Sidebar begins by reviewing the relevant constitutional background, then explains both states’ laws and the First Amendment aspects of the legal challenges to those laws.
First Amendment and Editorial Control
As explored in this CRS Report, the Supreme Court has recognized that private entities may exercise constitutionally protected “editorial control” when they choose what speech to publish or how to present it. For example, in one case, the Court held that a state could not force newspapers to publish political candidates’ responses to editorials criticizing their character. The Court ruled that the newspaper’s “choice of material” constituted “editorial control and judgment” that could not be regulated “consistent with First Amendment guarantees.” Newspapers and cable operators are classic examples of companies that exercise editorial control, and the Court has recognized that other private businesses, including public utilities and parade organizers, may also have constitutionally protected rights to exclude speech in certain circumstances. In one case, the Court stated the principle as follows: “when dissemination of a view contrary to one’s own is forced upon a speaker intimately connected with the communication advanced, the speaker’s right to autonomy over the message is compromised.”
In other decisions, however, the Supreme Court has held that private entities may not assert a constitutional right to exclude third parties if the hosting decision is not “inherently expressive” and does not implicate concerns about editorial control. Applying these principles, one federal court of appeals concluded that the First Amendment did not bar net neutrality regulations requiring broadband providers to host lawful content. This ruling was based on the premise that these providers did not exercise protected editorial discretion but instead neutrally transmitted all third-party speech in the same way a common carrier would. (Historically, common carriers were companies such as railroads or
telecommunications services that held themselves out to the public as carrying passengers, goods, or communications for a fee.) The appeals court cautioned, however, that it might have resolved the case differently if the providers instead “engage[d] in editorial discretion” by “selecting which speech to transmit.” Justice Kavanaugh, then a judge on the federal appeals court, would have held that the net neutrality rule was unconstitutional, saying that internet service providers perform the same kinds of functions as cable operators and thus exercise constitutionally protected editorial discretion.
Accordingly, one critical question for lower courts evaluating laws or lawsuits that would require a
website to host unwanted speech has been whether a site’s hosting decision is expressive. A related
question is whether the website exercises editorial discretion. If courts find that online platforms are
exercising protected editorial discretion when they moderate user-generated content, then the First
Amendment will limit the government’s ability to regulate platforms’ content moderation decisions.
A number of trial courts facing this issue have concluded that the First Amendment barred lawsuits seeking to hold websites, search engines, and social media companies liable for their decisions not to host certain content. For example, one trial court concluded that the First Amendment barred a lawsuit brought under federal and state civil rights laws when the plaintiffs tried to hold a search engine liable for designing “its search-engine algorithms to favor certain expression on core political subjects.” The court ruled that the plaintiffs’ theory of liability depended on the premise that the search engine “exercise[d] editorial control” protected by the First Amendment by favoring certain expression. The court believed that allowing the lawsuits to proceed would violate the principle, stated by the Supreme Court, “that a speaker has the autonomy to choose the content of his own message.”
Florida Senate Bill 7072
Florida’s social media law, signed into law on May 24, 2021, restricts internet services’ ability to moderate content and imposes certain disclosure obligations on those services. The law primarily applies to “social media platforms,” defined broadly to include any service that “[p]rovides or enables computer access by multiple users to a computer server,” operates as a “legal entity,” and does business in the state. Partially tracking the federal definition of “interactive computer service,” this term could therefore include services such as search engines or internet service providers. Further, the definition includes only larger companies that meet certain revenue or user thresholds.
The content moderation provisions of the law limit platforms’ ability to engage in deplatforming, censorship, shadow-banning, or post prioritization—all terms defined in the law. The law requires platforms to apply their moderation standards “in a consistent manner” and provides that platforms may only change their “user rules, terms, and agreements” once every 30 days. It also requires platforms to allow users to opt out of certain content-moderation practices. Additional restrictions prohibit platforms from deplatforming or restricting the content of political candidates or “journalistic enterprises.”
The law also contains several disclosure provisions, including requirements to publish standards for moderating content, inform users about changes to terms of service, and provide data about how many people viewed a user’s posts. In addition, platforms must give users notice and explanations before they may censor, deplatform, or shadow ban users’ content.
NetChoice v. Moody
As discussed in
a prior Legal Sidebar, a federal trial court granted a preliminary injunction temporarily
staying enforcement of Florida’s law on June 30, 2021. The trial court held that the law was likely
unconstitutional under the First Amendment after concluding that it discriminated based on the content
and viewpoint of speech. Florida appealed that decision to the Eleventh Circuit.
On May 23, 2022, the Eleventh Circuit partially affirmed this preliminary injunction, agreeing that many aspects of the law were likely unconstitutional but upholding some of the disclosure provisions. The court first held that platforms engaged in content moderation are exercising protected “editorial judgment that is inherently expressive.” The court stated that “when a platform removes or deprioritizes a user or post, it makes a judgment about whether and to what extent it will publish information to its users—a judgment rooted in the platform’s own views about the sorts of content and viewpoints that are valuable and appropriate for dissemination on its site.” Citing a variety of platforms’ moderation policies, the court noted that by removing certain users or types of content, platforms “cultivate different types of communities” and sometimes “promote explicitly political agendas.” This, in the court’s view, was protected editorial activity. The state had argued that the covered platforms should be treated as common carriers, which can be held to equal access obligations. The court disagreed, stating that unlike telecommunications providers such as telegraph companies, social media platforms had never acted as common carriers but had instead always restricted the use of their platforms. The court further concluded that the state could not designate the platforms as common carriers if doing so would abrogate the platforms’ First Amendment rights.
Accordingly, the court ruled that the law triggered First Amendment scrutiny by restricting platforms’ “ability to speak through content moderation.” The content moderation provisions limited the platforms’ editorial judgment, and the disclosure provisions (with one exception) indirectly burdened that judgment. Although the court held that both the content moderation provisions and the rest of the disclosure requirements affected the platforms’ editorial judgment, it treated those two types of provisions differently in its First Amendment analysis. The court held that the content moderation provisions were subject to some form of heightened constitutional scrutiny and likely could not survive that review. Reasoning that the state had no substantial interest in “leveling the playing field” for speech, the court found the law did not further any substantial government interest. Neither did the state show that the burden on speech was no greater than necessary, given how broadly the law restricted platforms’ editorial discretion.
A more lenient standard of review applied to most of the disclosure provisions, and the court mainly upheld those provisions. Specifically, the court applied a relaxed standard applicable to commercial disclosure requirements. The court said that most of the transparency requirements likely permissibly served an interest “in ensuring that users—consumers who engage in commercial transactions with platforms by providing them with a user and data for advertising in exchange for access to a forum—are fully informed about the terms of that transaction and aren’t misled about platforms’ content-moderation policies.” The provision requiring platforms to provide notice and justification for all content moderation actions, though, was deemed “unduly burdensome and likely to chill platforms’ protected speech.”
The Eleventh Circuit’s opinion therefore allowed portions of Florida’s law to go into effect but otherwise affirmed the trial court’s preliminary injunction, preventing the rest of the law from taking effect. This judgment stands in contrast to the Fifth Circuit’s recent ruling on Texas’s somewhat similar social media law.
Texas H.B. 20
Texas enacted H.B. 20 on September 9, 2021, months after Florida’s law was adopted—and preliminarily enjoined by a trial court. H.B. 20 defines “social media platform” more narrowly than Florida’s law does, applying the term only to a “website or application that is open to the public, allows a user to create an account, and enables users to communicate with other users for the primary purpose of posting
information, comments, messages, or images.” Thus, unlike the Florida law, which broadly sweeps in a
variety of internet service providers, the Texas law focuses on sites with the primary purpose of enabling
user communication. H.B. 20 further applies only to platforms with “more than 50 million active users in
the United States in a calendar month.” The definition expressly
excludes certain services such as internet
service providers, email, or certain news sites. (Some
provisions of the law, however, impose separate
restrictions on email providers.)
Like the Florida law, the Texas law imposes both content moderation restrictions and disclosure requirements on covered platforms. The Texas law prohibits social media platforms from “censor[ing]” users or content based on viewpoint or the user’s geographic location in the state. However, the law does not prevent platforms from censoring some specific types of content, including unlawful expression or specific discriminatory threats of violence. The law also says social media platforms can continue to censor content when “specifically authorized ... by federal law,” a provision that may preserve platforms’ federal immunity under the Communications Act’s Section 230 for removing certain “objectionable” content.
The law also imposes operational restrictions on platforms, requiring them to “provide an easily accessible” system for users to submit complaints about illegal content or content removals. Platforms must generally act on these complaints within 48 hours. Further, platforms must notify users when the platforms remove their content and provide users with the opportunity to appeal such a decision under statutorily specified procedures.
The law additionally requires platforms to “disclose accurate information” about their content and data management and “business practices,” including publishing an acceptable use policy explaining their content moderation policies. It further requires the biannual publication of a transparency report with information about takedowns of illegal or policy-violating content.
NetChoice v. Paxton
On December 1, 2021, a federal trial court ruled H.B. 20 likely unconstitutional and entered a preliminary injunction preventing the state from enforcing the restrictions on social media platforms discussed above. This decision was appealed to the Fifth Circuit, which on May 11, 2022, entered a stay of the preliminary injunction pending appeal, allowing the Texas law to go into effect. However, the Supreme Court vacated the Fifth Circuit’s stay on May 31, 2022, putting the preliminary injunction back into effect.
On September 16, 2022, the Fifth Circuit issued an opinion rejecting the constitutional challenge and vacating the trial court’s preliminary injunction. One judge dissented in part, concluding that H.B. 20’s content moderation provisions violated the First Amendment.
Starting with the content moderation provisions, the majority opinion first held that the “history and original understanding” of the First Amendment did not support the platforms’ view. The court suggested the Founders were primarily concerned with prior restraints on speech and protections for “good-faith opinions on matters of public concern.” In the court’s view, H.B. 20 did not operate as a prior restraint or restrict platforms’ ability to express their own views because it restricted only the platforms’ ability to restrict others’ speech. The court “reject[ed] the Platforms’ efforts to reframe their censorship as speech.” It ruled that the covered platforms “exercise virtually no editorial control or judgment,” because they “use algorithms to screen out certain obscene and spam-related content” but post “virtually everything else.” The court further concluded that the platforms’ content moderation decisions were not inherently expressive, and that any expressive communication would occur only when a platform explains why it made a moderation decision and how that decision expresses the platform’s views.
The Fifth Circuit suggested that Section 230, the federal immunity provision mentioned above, supported its ruling. Section 230 states that providers and users of “interactive computer services” may not “be
treated as the publisher or speaker” of another’s content. The Fifth Circuit believed that this immunity “reflects Congress’s judgment that the Platforms do not operate like traditional publishers and are not ‘speak[ing]’ when they host user-submitted content.” The court characterized this statutory immunity provision as a “factual determination” relevant to its constitutional analysis.
The court also ruled that H.B. 20 correctly classified the covered platforms as common carriers and that the law’s nondiscrimination requirement was consistent with historical regulation of common carriers. This was again based in part on the court’s conclusion that the platforms hold themselves out to serve the public equally, even though they do business only with users who agree to their terms of service and they censor certain types of content.
As an alternative basis for its holding, the Fifth Circuit said that even if the platforms engage in protected speech when they moderate content, the law would be subject only to an intermediate level of scrutiny because it was content-neutral. As discussed in this prior Legal Sidebar, content-based laws ordinarily trigger strict constitutional scrutiny, while content-neutral laws receive intermediate scrutiny. The Supreme Court has explained that a law is content-based if it applies to speech because of the substantive message being expressed. The Fifth Circuit ruled that H.B. 20 is content-neutral because it does not discriminate based on the content or viewpoint of the platforms’ moderation decisions. The court said that even though the law allows platforms to restrict certain types of content such as specific threats of violence, these exceptions did not make the law content-based because they were not based on hostility toward the message expressed by moderation. Somewhat in contrast to the Eleventh Circuit’s conclusion that the government does not have a substantial interest in ensuring a level playing field for speech, the Fifth Circuit held that Texas’s interest in promoting the free exchange of ideas from a variety of sources was a sufficiently important interest and that H.B. 20 was the least speech-restrictive way to achieve this goal.
Turning to the disclosure requirements, the Fifth Circuit
concluded, like the Eleventh Circuit, that these
provisions were constitutional under the lower level of constitutional scrutiny applicable to commercial
disclosure requirements. However, unlike the Eleventh Circuit, the Fifth Circuit
held that the operational
provisions requiring explanation and appeals of content removal decisions were also constitutional,
dismissing the platforms’ complaints about the burdens on their moderation activity.
This ruling likely creates a circuit split on the constitutional protections that should be afforded to private
companies that host others’ speech. The Fifth Circuit acknowledged that—although the Florida and Texas
laws differed in some ways—
it disagreed with the Eleventh Circuit on key legal issues relating to
constitutional protections for editorial discretion and the common carrier doctrine. Although the Fifth
Circuit acknowledged that the Supreme Court has said certain businesses exercise editorial discretion, the
appeals court rejected the idea that the First Amendment protects editorial control in and of itself. Instead,
it said that regulation of editorial decisions implicates the First Amendment only when the regulation
“either coerces” the host “to speak or interferes with their speech.” The dissenting judge, however,
agreed
with the Eleventh Circuit that the Supreme Court has clearly stated “that protected expression lies not
merely in the message or messages transmitted but in the process of collecting and presenting speech.”
Considerations for Congress
The Fifth Circuit’s views of social media platforms’ First Amendment rights stand in contrast to the
Eleventh Circuit’s ruling and opinions from a number of trial courts. Decisions weighing in on online
platforms’ constitutional rights to freely moderate content could be significant not only for Florida and
Texas, but also for other states that have indicated that they are considering similar legislation limiting
viewpoint discrimination or encouraging moderation
of harmful content. This circuit split creates some
ambiguity for states seeking to assess possible legal challenges. It also means that a state’s ability to enact
similar laws may depend on the federal judicial circuit in which it is located. Online platforms’
obligations may also depend on where they operate.
The scope of online platforms’ First Amendment rights is also relevant to Congress as it considers bills proposing to regulate online content moderation. Some federal proposals would, in ways somewhat distinct from the Texas law, seek to penalize online services that restrict content based on viewpoint or would otherwise require platforms to host lawful content. Other federal bills would institute transparency requirements with some similarities to certain portions of the Florida and Texas laws. Further decisions on the constitutionality of state laws may suggest how courts are likely to analyze federal laws regulating social media platforms.
The Fifth Circuit ruling also raises questions about the relevance of Section 230 to these discussions.
Because the Fifth Circuit relied on Section 230’s “publisher or speaker” immunity in assessing whether
online platforms should be considered speakers for First Amendment purposes, amendments that alter or
restrict Section 230 immunity could suggest a new
“determination” by Congress that these moderation
decisions represent the platform’s speech. Even without any change to Section 230, though, other courts
might question whether the Fifth Circuit correctly interpreted Section 230 as expressing a clear
congressional view on whether online platforms engage in constitutionally protected speech by hosting
third-party speech.
NetChoice, one of the trade associations challenging both the Florida and Texas laws, has said it believes that the Supreme Court will eventually vindicate platforms’ First Amendment rights. Justice Alito, in an opinion joined by Justices Thomas and Gorsuch, agreed that the issues involved in the Fifth Circuit dispute “will plainly merit this Court’s review.” Florida has appealed the Eleventh Circuit’s ruling to the Supreme Court. NetChoice’s statement suggests it is likely to appeal the Fifth Circuit’s ruling, presenting the Supreme Court with another opportunity to weigh in on this issue. Further, the rulings in these cases are somewhat related to another case the Supreme Court is set to hear in its upcoming October 2022 term. In 303 Creative LLC v. Elenis, the Court agreed to consider whether a state would violate the Free Speech Clause by applying its nondiscrimination laws to a website designer who does not want to create a website for a same-sex wedding. Although the free speech claim in 303 Creative is not phrased in terms of editorial discretion, it presents related questions regarding when website design qualifies as speech, particularly when the message conveyed by the website could be attributed to a third party.
Author Information
Valerie C. Brannon
Legislative Attorney
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff
to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of
Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of
information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role.
CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United
States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However,
as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the
permission of the copyright holder if you wish to copy or otherwise use copyrighted material.
LSB10748 · VERSION 6 · UPDATED