Legal Sidebar
NetChoice v. Bonta and First Amendment Limits on Protecting Children Online
November 1, 2023
On September 18, 2023, in NetChoice v. Bonta, the U.S. District Court for the Northern District of California preliminarily enjoined enforcement of the California Age-Appropriate Design Code Act (CAADCA). California enacted the CAADCA to “protect[] children when they are online.” The court acknowledged that the law’s purpose “clearly is important,” but nonetheless held that it “likely violates the First Amendment.” This Sidebar explains the court’s decision and discusses some potential implications for similar legislation.
Online Child Protection and Free Speech
Since at least the Communications Decency Act of 1996, Congress has been enacting laws intended to make “[t]he information superhighway . . . safe for families and children.” The resulting legislation includes privacy protections for children online and laws conditioning certain federal assistance for schools and libraries on the adoption of policies that protect children from exposure to harmful or obscene websites.
Over the same time period, the Supreme Court has issued multiple decisions holding that the internet is an important forum for speech. In 1997, the Court wrote that chat rooms allow “any person with a phone line [to] become a town crier.” By 2017, the Court had described social media websites as “the modern public square.” In decisions that address the intersection between the internet’s role in facilitating speech and legislative efforts to protect children online, the Court has “repeatedly recognized the governmental interest in protecting children from harmful materials,” but it has also held that some statutes aimed at increasing those protections online exceed the First Amendment’s bounds.
Recent years have seen renewed calls to increase protections for children using the internet. The Children’s Online Privacy Protection Act (COPPA), the primary federal law that creates privacy protections for children online, was enacted in 1998. Some lawmakers have argued that it is time for an update. In addition, states—which are also prohibited from abridging the freedom of speech—have added several new laws to the regulatory landscape in the past few years.
The CAADCA
In 2022, California enacted the CAADCA after finding that “more needs to be done to create a safer online space for children to learn, explore, and play.” California’s law reaches beyond COPPA’s federal protections in several ways. The CAADCA defines “child” as anyone under the age of eighteen, while COPPA defines “child” as anyone under the age of thirteen. The CAADCA applies to services “likely to be accessed by children,” while COPPA regulates only services “directed to children” and operators who know they are collecting personal information from children. The CAADCA also includes prohibitions and mandates involving reports, risk assessments, and design limitations that do not appear in COPPA—while omitting any express safe harbor provision like the one incorporated into COPPA.
Specifically, businesses covered by the CAADCA may not
• use the personal information of any child in a way that the business knows, or has reason
to know, is materially detrimental to the physical health, mental health, or well-being of a
child;
• profile a child or collect geolocation information from a child by default unless certain
criteria are met;
• collect, sell, share, or retain any personal information that is not necessary to provide an
online service, product, or feature with which a child is actively and knowingly engaged;
• use personal information of a child—if collection was necessary to provide an online
service, product, or feature, or to comply with other provisions of the CAADCA—for any
reason other than a reason for which that personal information was collected; or
• use dark patterns to lead or encourage children to provide additional personal information, forgo privacy protections, or take an action detrimental to the child.
Covered businesses are required to
• complete a Data Protection Impact Assessment before any new online services, products,
or features are offered to the public, which must identify the purpose of the online
service, product, or feature, how it uses children’s personal information, and the risks of
material detriment to children that arise from the data management practices of the
business, in addition to addressing eight specifically enumerated issues;
• provide certain information about completed Data Protection Impact Assessments to the
California Attorney General upon request;
• either estimate the age of child users with a reasonable level of certainty appropriate to
the risks that arise from the data management practices of the business, or apply the
privacy and data protections afforded to children to all consumers;
• configure all default privacy settings provided to children by the online service, product,
or feature to settings that offer a high level of privacy;
• provide privacy information, terms of service, policies, and community standards
concisely, prominently, and using clear language suited to the age of children likely to
access that online service, product, or feature;
• provide an obvious signal to the child if and when the online service, product, or feature
allows the child’s parent, guardian, or any other consumer to monitor the child’s online
activity or track the child’s location;
• enforce the published terms, policies, and community standards established by the
business; and
• provide prominent, accessible, and responsive tools to help children, or if applicable their
parents or guardians, exercise their privacy rights and report concerns.
NetChoice’s Challenge to the CAADCA
NetChoice is a national trade association of online businesses and the plaintiff in several recent lawsuits that challenge state regulations of internet companies. In December 2022, NetChoice sued the Attorney General of California seeking to enjoin enforcement of the CAADCA on several grounds. The complaint included allegations that the law violates the First Amendment because it is a content-based regulation of speech that does not survive heightened scrutiny. NetChoice also filed a motion for a preliminary injunction that raised the same argument.
Content-based regulations of expressive activity are ordinarily subject to a legal standard known as strict scrutiny if challenged in court. The Supreme Court has said that, when strict scrutiny applies, a law will be “presumptively unconstitutional and may be justified only if the government proves” the law is “narrowly tailored to serve compelling state interests.” There is precedent for applying this level of scrutiny to regulations of internet content. The Supreme Court explained in Reno v. ACLU, for example, that a law “prohibit[ing] the knowing sending or displaying of patently offensive messages in a manner that is available to a person under 18 years of age” is “a content-based blanket restriction on speech.” After going on to hold that the statutory provisions challenged in the case were “not narrowly tailored,” the Court concluded that the law abridged the freedom of speech protected by the First Amendment.
In its response to NetChoice’s motion for a preliminary injunction, California contended that the CAADCA regulates only nonexpressive business practices related to the collection and use of data, so the law should not trigger strict scrutiny or any other level of First Amendment scrutiny. California also claimed that the CAADCA is content-, viewpoint-, and speaker-neutral, and strict scrutiny is therefore inappropriate even if the court thought the law regulates speech. Instead, California argued, the CAADCA affects, at most, commercial speech in content-neutral ways and should be subject to the more lenient intermediate scrutiny standard. As the Supreme Court has described intermediate scrutiny, regulations are permissible if they directly advance a substantial governmental interest and are not more extensive than is necessary to serve that interest. Although more lenient than strict scrutiny, intermediate scrutiny still requires relatively robust judicial review, which laws do not always survive.
The Court’s Decision
The Northern District of California decided that the CAADCA regulates protected expression and that, regardless of whether the Act is subject to strict scrutiny or intermediate scrutiny, NetChoice can likely show that it violates the First Amendment. (Because the court was deciding a motion for a preliminary injunction, the applicable legal standard was whether NetChoice is “likely to succeed on” its First Amendment claim.)
The court first explained why the CAADCA regulates expression. It reasoned that the Act’s prohibitions restrict certain entities’ collection, sale, and sharing of data. Pointing to the Supreme Court’s decision in Sorrell v. IMS Health—which held that “the creation and dissemination of information are speech”—the district court concluded that “restrict[ing] the ‘availability and use’ of information by some speakers but not others, and for some purposes but not others, is a regulation of protected expression.” With respect to the CAADCA’s mandates, the court determined that the provisions requiring reports or disclosures would require the creation and dissemination of speech, while the provisions requiring certain default rules and policies would effectively require that companies censor or block access to certain expressive content. In either case, the court held, the mandates regulate expression.
The court next took up the question of what level of scrutiny should apply to the CAADCA. Rather than resolve the parties’ disagreement on this point, it assumed for purposes of its analysis that the law reaches only commercial speech and is subject to intermediate scrutiny. Even analyzing the CAADCA under that more lenient standard, the court explained, the law should be preliminarily enjoined.
To survive intermediate scrutiny, California would have had to show that the CAADCA directly advanced a substantial governmental interest and was not more extensive than necessary. The court agreed with California that the state has a substantial interest in protecting children online. California presented evidence that children using the internet are harmed by lax data and privacy protections. Based on that evidence and on Supreme Court cases recognizing a compelling state interest in protecting the well-being of minors, the court concluded that California “satisfied its burden of showing a substantial interest.”
The court nonetheless held that every challenged provision of the CAADCA failed intermediate scrutiny. According to the court, California did not establish that many of the CAADCA’s provisions—for example, requiring that businesses prepare Data Protection Impact Assessments and estimate users’ ages, requiring that policies be in age-appropriate language, and requiring that children’s personal information be used only for the reason for which it was collected—would in fact alleviate harm to children. Creating reports about risks, the court explained, does not necessarily reduce any harm from the risks. Requiring that businesses estimate users’ ages could, in the court’s view, “actually . . . exacerbate the problem” by requiring businesses to increase collection of personal information. Other provisions chilled too much speech: the court found that the restriction on profiling children prohibits practices that may benefit children and that requiring heightened default privacy settings for children could result in businesses prohibiting children from accessing services altogether. For similar reasons, the court held that the CAADCA’s prohibitions on using information or dark patterns in ways that are materially detrimental to children were not sufficiently tailored. Here, the court credited evidence that, if required to evaluate whether a wide range of content is detrimental to anyone from an infant to a person just shy of eighteen, websites would not be certain what content might expose them to liability and might therefore bar children from accessing more content than necessary to prevent harm.
After deciding that the remainder of the CAADCA could not be severed from the provisions that likely violate the First Amendment, the court preliminarily enjoined the law’s enforcement.
Considerations for Congress
The CAADCA was modeled on the United Kingdom’s Age-Appropriate Design Code (UK AADC). Supporters of that law have touted its role in prompting changes to the product designs, default settings, and data practices used by some major internet-based services. As the NetChoice v. Bonta ruling illustrates, internet regulations imported from the UK, which has no equivalent to the First Amendment, will not necessarily survive scrutiny from United States courts. Still, state legislatures in Maryland and Minnesota have, like California, introduced bills that track the UK AADC.
Websites’ designs, defaults, and data use are often targets of proposed federal legislative reforms. For example, the Clean Slate for Kids Online Act of 2023 (S. 395), the Children and Teens’ Online Privacy Protection Act (S. 1418), the Kids Online Safety Act (S. 1409), the Protecting Kids on Social Media Act (S. 1291), the Social Media Child Protection Act (H.R. 821), and Sammy’s Law of 2023 (H.R. 5778) have all been introduced in the 118th Congress. Like the CAADCA, all seek to protect children from harm online. All would regulate how websites and online services use data and information or design and offer products accessed by children. Other recent bills, such as the Digital Platform Commission Act of 2023 (S. 1671), the Algorithmic Justice and Online Platform Transparency Act (S. 2325/H.R. 4624), the DETOUR Act (S. 2708), and the Online Privacy Act of 2023 (H.R. 2701), would regulate websites and online services regardless of users’ ages.
NetChoice v. Bonta shows that restrictions on websites’ data use, default settings, and designs can raise First Amendment concerns. The NetChoice court held that restrictions on collecting, selling, sharing, or retaining personal information “limit the ‘availability and use’ of information . . . and thus regulate protected speech.” The court said that other CAADCA provisions “require businesses to affirmatively provide information to users, and by requiring speech necessarily regulate it.” Much of what online services do can be characterized as collecting, providing, using, or making available data and information. As a result, many regulations directed at online services may be subject to the types of First Amendment arguments NetChoice raised against the CAADCA.
Regulations are least vulnerable to First Amendment scrutiny when they are narrowly tailored to serve a compelling governmental interest. When applying this standard, courts have looked at whether a legislature had evidence of and “specifically identif[ied] an ‘actual problem’ in need of solving.” To evaluate how well tailored a law is, courts have reviewed the extent to which the law sweeps too broadly, silencing protected, non-harmful speech, and whether there are “plausible, less restrictive alternatives” to accomplish the legislative purposes.
This is, however, an evolving area of law. Legislatures continue to reform the online legal landscape, and courts continue to decide challenges to new laws. The NetChoice v. Bonta case itself may see further developments: California is appealing the district court’s decision to the U.S. Court of Appeals for the Ninth Circuit. The Supreme Court, meanwhile, took up cases this term from the Fifth and Eleventh Circuits, also initiated by NetChoice, that raise questions about First Amendment protections for websites’ editorial choices. More judicial decisions that provide guidance on the First Amendment’s application to laws regulating online services are likely forthcoming.
Author Information
Peter J. Benson
Legislative Attorney
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff
to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of
Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of
information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role.
CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United
States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However,
as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the
permission of the copyright holder if you wish to copy or otherwise use copyrighted material.
LSB11071 · VERSION 1 · NEW