Legal Sidebar

Trial Court Rules State Social Media Law Likely Unconstitutional

July 2, 2021
In late May, Florida enacted a new law regulating internet services and online content moderation.
Interest groups representing internet companies quickly challenged the law in court, arguing that it
violates the First Amendment’s Free Speech Clause and is preempted by Section 230 of the
Communications Act of 1934. On June 30, 2021, in NetChoice v. Moody, a federal trial court granted a
preliminary injunction temporarily staying enforcement of the state law, which was set to go into effect on
July 1, 2021. This Legal Sidebar explains the Florida law as well as the lawsuit challenging it. The
Sidebar also discusses possible implications for Congress as it considers whether to regulate online
content moderation.
Florida’s Social Media Law
Florida’s governor signed Senate Bill 7072 into law on May 24, 2021. The law contains a number of
provisions limiting online companies’ ability to engage in deplatforming, censorship, shadow-banning, or
post prioritization. Each of these terms is defined in the law. The law focuses mostly on “social media
platforms,” defined broadly to include any service that “[p]rovides or enables computer access by
multiple users to a computer server,” operates as a “legal entity,” and does business in the state. Further,
the definition only includes larger companies that meet certain revenue or user thresholds. Partially
tracking Section 230’s definition of “interactive computer service,” this term could therefore include
services such as search engines or internet service providers—although the state law excludes services
owned by companies that also operate a theme park or entertainment complex.
Several provisions in the law address speech by or about political candidates. Section 2 provides that a
“social media platform” may be fined if it “willfully deplatform[s] a candidate for office.” Section 4,
discussed in more detail below, prohibits platforms from using “post-prioritization and shadow banning
algorithms” for content posted by or about candidates. Section 2 also requires a social media platform that “willfully provide[] free advertising” for a candidate to notify the candidate that the platform has made an “in-kind contribution.”
Section 4 places more general requirements on covered “social media platforms,” enforced through state
laws regulating unfair or deceptive practices and through private civil actions. Generally, this section
limits platforms’ ability to “censor, deplatform, and shadow ban” users or engage in “post-prioritization”
by imposing certain transparency, procedural, and opt-out provisions. More specifically, Section 4:
• Requires a platform to inform users about any changes to its terms of service prior to implementing the changes, and provides that platforms may only change their terms once every 30 days;
• Requires a platform to publish the standards “it uses or has used for determining how to censor, deplatform, and shadow ban,” and requires platforms to apply those standards “in a consistent manner among its users on the platform”;
• Prohibits platforms from censoring, deplatforming, or shadow banning users without prior written notice explaining the rationale for the decision (unless the material is “obscene”), and requires platforms to allow deplatformed users to retrieve their data for at least 60 days after receiving this notice; and
• Requires platforms to allow users to request data on how many users were shown their posts, and “to opt out of post-prioritization and shadow banning algorithm categories to allow sequential or chronological posts and content.”
Finally, Section 4 also prohibits platforms from censoring, deplatforming, or shadow banning “a
journalistic enterprise based on the content of its publication or broadcast.” It defines “journalistic
enterprise”
to include any business operating in Florida that: (1) publishes a certain quantity of words or
video online and meets certain quotas for users or viewers; (2) operates a cable channel and meets
specific quotas for content hours and subscribers; or (3) operates under a broadcast license issued by the
Federal Communications Commission.
In addition to addressing “social media platform” activity specifically, the Florida law contains a number
of broader antitrust provisions. Section 3 of the law limits the state’s ability to work with any person who
has been put on an “antitrust violator vendor list.” The state will place people on this list after receiving
notice (and verifying) that a person “was convicted or held civilly liable for an antitrust violation.” The
Attorney General also has some authority to place people on this list if they have been charged with
violating antitrust laws and the Attorney General believes “there is probable cause” that they “likely
violated” antitrust laws. The law provides notice and an opportunity for a hearing for any person placed
on this list.
NetChoice v. Moody
Two trade associations representing online businesses, NetChoice and the Computer & Communications
Industry Association, sued Florida on May 27, 2021, seeking to enjoin enforcement of Senate Bill 7072.
They raised two different legal arguments against the law. First, they argued that the state law is expressly
preempted by Section 230, a federal law that, among other things, prevents states from imposing liability
that treats an “interactive computer service” provider as the publisher of another person’s content.
Second, they argued that the law is unconstitutional under the First Amendment because it infringes their
members’ rights “to exercise editorial judgment over speech on their private property.” The trial court
held that the challengers are likely to prevail on both grounds and entered a preliminary injunction
temporarily preventing the state from enforcing the law.
Section 230
The court first considered the plaintiffs’ statutory arguments and held that parts of the state law are likely
preempted by Section 230. Generally, Section 230 (discussed in more detail in this CRS Report) creates a
federal immunity shield that protects providers and users of “interactive computer services” from legal
liability for content provided by another person. Among other provisions, Section 230(c)(2)(A) says
service providers and users may not be held liable for acting “in good faith to restrict access to”
objectionable material. In addition, Section 230(e)(3) expressly preempts state laws that are inconsistent
with Section 230.
According to the trial court, those portions of the Florida law “that purport to impose liability for . . .
decisions to remove or restrict access to content” are preempted by Section 230. These portions of law
include those that allow liability for covered platforms’ censorship, deplatforming, and shadow banning
decisions. For example, the court explained that a service that deplatforms a candidate in violation of the
state law could be acting under the protections of Section 230, if the deplatforming qualifies as a good
faith decision to restrict access to material that the platform considers objectionable. Because Section 230
only preempts certain aspects of the law, though, the court proceeded to consider the merits of the
plaintiffs’ constitutional claims.
First Amendment
As discussed in more detail in this CRS Report, the Supreme Court has recognized that private entities
hosting others’ speech may sometimes exercise “editorial control” protected by the First Amendment’s
Free Speech Clause when they choose what speech to host or how to present it. For example, in one case,
the Court held that a state violated the First Amendment when it attempted to force newspapers to host
certain speech. The Supreme Court has said that, like other content-based laws, regulations that compel
speech
are generally subject to strict scrutiny. (However, in a decision issued on July 1, a plurality of the
Court suggested that compelled disclosure requirements are subject to a slightly less demanding “exacting
scrutiny” analysis.) Under a strict scrutiny analysis, laws are “presumptively unconstitutional and may be
justified only if the government proves that they are narrowly tailored to serve compelling state interests.”
In other decisions, however, the Court has held that private entities may not be able to assert a
constitutional right to exclude third parties if it is unlikely that anyone would attribute the speech of those
third parties to the host, or if the hosting decision is not “inherently expressive.” For instance, the Court
rejected a free speech claim by a group of law schools that were attempting to exclude military recruiters
from their campuses. The schools objected to the military’s policy excluding gay people from the military,
and argued that a federal law penalizing the schools for barring military recruiters unconstitutionally
required them to accommodate the military’s messages. In contrast to the cases recognizing a right of
editorial control, the Supreme Court explained that “[n]othing about recruiting suggests that law schools
agree with any speech by recruiters, and nothing in the [federal law] restricts what the law schools may
say about the military’s policies.” That is, the schools could disassociate themselves from the military’s
views and effectively express their own views. Accordingly, the Court held that the federal law focused
primarily on nonexpressive conduct, rather than protected speech, and was not subject to heightened
constitutional scrutiny.
In NetChoice, the trial court concluded that the social media platforms covered by the law do not “use
editorial judgment in quite the same way” as newspapers because, in contrast to newspapers, social media
providers cannot read and select all the content that they host. However, the court also believed that the
platforms do not engage only in conduct, and said that the “ideologically sensitive” moderation decisions
targeted by the state law are “the very cases on which the platforms are most likely to exercise editorial
judgment.” Ultimately, the court ruled that the law is subject to strict scrutiny because it is “about as
content-based as it gets,” noting as an example the portions of the law that apply only to material posted
by or about a candidate, thus discriminating on the basis of the material’s content. Among other concerns,
the court also concluded that strict scrutiny is appropriate because the law applies “to only a small subset
of social-media entities”—the largest companies—citing Supreme Court precedent suggesting that
“discrimination between speakers is often a tell for content discrimination.” In addition, the court pointed
to statements from legislators and the Governor suggesting “that the actual motivation for this legislation
was hostility to the social media platforms’ perceived liberal viewpoint,” triggering strict scrutiny based
on viewpoint discrimination.
The court held that the state is unlikely to satisfy the strict scrutiny standard. First, the court wrote that,
under prevailing Supreme Court precedent, “leveling the playing field—promoting speech on one side of
an issue or restricting speech on the other—is not a legitimate state interest.” Second, the court
determined that the law is not narrowly tailored. In the alternative, the court also ruled that the law could
not even satisfy the intermediate scrutiny standard that applies to content-neutral laws. Accordingly, the
court granted a preliminary injunction temporarily preventing the state from enforcing the law.
Considerations for Congress
Although preliminary, the court’s ruling in NetChoice could be of interest to Congress as it
considers whether to amend Section 230 or otherwise regulate online content moderation. Legislators in
other states have introduced similar bills that would create liability for certain deplatforming decisions,
and some commentators have opined that these other state laws are similarly unconstitutional. If Congress
does not want to preempt state laws regulating content moderation, it could amend Section 230, though
the state laws could still be susceptible to First Amendment challenges. In some ways, NetChoice could
be seen as a test case for how laws regulating online platforms’ content moderation activities might fare in
court. In addition to possible state laws, Congress has also introduced a number of proposals that would
regulate online content moderation, although most of these proposals would not directly prohibit
platforms from restricting certain types of content in exactly the same way as this Florida law does. In
particular, Congress has introduced a number of proposals to amend Section 230. Some have argued that,
as an indirect regulation of speech, Section 230 should be subject to a distinct constitutional analysis—an
issue discussed in more detail in this report.
Although other courts would not be bound to follow NetChoice when they consider constitutional
challenges to other laws, they may nonetheless look to the ruling as persuasive. There have been
relatively few court decisions specifically considering when online content moderation activities are
protected under the First Amendment, but a few other trial courts have similarly concluded that websites
and search engines may not be held liable for specific decisions to publish or remove certain content.
Accordingly, NetChoice adds to the lower court rulings suggesting that government regulations may be
unconstitutional if they interfere with sites’ editorial discretion. In particular, the NetChoice decision
could suggest that courts may be skeptical of laws that entail content-based distinctions or discriminate
between different speakers, and that governments seeking to regulate content moderation should claim a
compelling government interest other than “leveling the playing field” for speech. However, to the extent
that other laws are focused on conduct or are content-neutral, courts could apply a lower level of scrutiny
to any constitutional challenges.
Florida intends to appeal the trial court’s ruling, and given the preliminary nature of the decision, further
proceedings are likely in this case.

Author Information

Valerie C. Brannon

Legislative Attorney

Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff
to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of
Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of
information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role.
CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United
States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However,
as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the
permission of the copyright holder if you wish to copy or otherwise use copyrighted material.

LSB10618 · VERSION 1 · NEW