https://crsreports.congress.gov

February 11, 2025

Social Media: Regulatory, Legal, and Policy Considerations for the 119th Congress

Social media platforms enable users to create and share content and interact with other users’ content. These platforms disseminate information to billions of people and are used by most American adults. Social media operators may moderate the content on their sites, promoting some posts and disallowing others. Some of their business decisions may be governed by existing federal laws, but social media platforms are not comprehensively regulated in the United States. Some lawmakers have expressed concerns about issues related to social media use, including the spread and promotion of content believed to be harmful; platforms’ restriction of lawful speech; and the lack of privacy protections. Members of the 119th Congress have already introduced bills to address some of these topics, including a bill restricting children’s use of social media. Meanwhile, state-level regulation has faced legal challenges, as the Free Speech Clause of the U.S. Constitution’s First Amendment imposes some limits on certain regulations of social media platforms.

Existing Federal Regulation of Social Media Platforms and Certain Content

Distribution of Sexually Explicit Material

By statute, Congress has prohibited the knowing distribution of certain material in interstate or foreign commerce, including over the internet. Federal law has long criminalized the distribution of “obscene” material, a subset of pornographic content. Because sexual expression is generally protected under the First Amendment, the Supreme Court has adopted a definition of obscenity that exempts material with serious literary, artistic, political, or scientific value. Federal law also prohibits accessing or distributing child sexual abuse material (CSAM), referred to in statute as “child pornography.” Material that qualifies as obscenity or child pornography is considered “unprotected speech,” meaning the government can prohibit it, subject to certain First Amendment limits. In 2002, the Supreme Court invalidated on free speech grounds an amendment to the CSAM statute prohibiting material that “appears to” depict a minor engaged in sexual conduct, because it would have prohibited even non-obscene movies with adult actors. The case may have implications for images generated or altered with artificial intelligence.

A 2022 federal law authorizes individuals whose intimate images were disclosed without their consent to sue the disclosing party in federal court. Many cases involving these claims are in the early stages, with no reported rulings on free speech defenses as of the date of this writing. Some courts have rejected First Amendment challenges to similar state laws. Those courts ruled that while the laws restricted protected expression, they served compelling government interests without burdening too much protected speech.

Data Protection

Congress has enacted statutes that regulate data collected by certain industries or data that fall within certain categories. For example, the Gramm-Leach-Bliley Act imposes data protection obligations on financial institutions, and the Children’s Online Privacy Protection Act regulates the online collection and use of information about children younger than 13. In addition, the Federal Trade Commission sometimes brings enforcement actions alleging that companies’ data protection practices constitute “unfair or deceptive acts or practices.” Congress has not enacted a comprehensive data protection law.

Legal Protections for Hosting or Restricting Speech

The First Amendment protects the right to create, circulate, or receive content online by constraining the government’s ability to regulate this activity. The Supreme Court has also recognized a right of editorial control when private platforms choose whether or how to publish others’ speech. In addition, courts have interpreted Section 230 of the Communications Act of 1934 to bar liability for publishing, promoting, restricting, and sometimes even editing third-party content. Section 230 does not bar liability if a social media platform helps develop content, and it contains exceptions allowing certain types of lawsuits.

State Regulation of Social Media

Some states have adopted laws regulating social media platforms and online content. As discussed below, courts have enjoined (i.e., barred) enforcement of some of these laws while legal challenges to them are litigated.

Some laws have attempted to address the content hosted online. For instance, the California Age-Appropriate Design Code Act (CAADCA) requires covered sites to assess and mitigate the risk their product will expose children to harmful content. Florida and Texas have enacted laws restricting online platforms’ ability to moderate user content. Texas’s law, for example, prohibits covered platforms from censoring users based on viewpoint.

Other state laws have focused not on specific content moderation decisions but on broader questions of who can access websites and how content is delivered to users. Many of these laws are aimed at protecting children. Some states have adopted laws requiring social media sites to verify a user’s age and obtain parental consent. Other state laws require age verification only for sites with a certain amount of sexually explicit content, or limit the use of features that may be addictive or otherwise harmful.


Some states have enacted data privacy laws that apply broadly to the online collection or processing of personal data. These laws often create individual rights to limit how companies use personal data, such as a right to opt out of the use of personal data for targeted advertising.

Considerations for Congress

Past policy discussions have centered on whether and how to regulate social media platforms and the user-generated content they host and distribute. Bills in the 118th Congress would have amended Section 230, regulated platforms’ content moderation procedures, created transparency requirements, and supported third-party research on social media platforms. For example, the Kids Online Safety Act—versions of which were passed by the Senate as part of the Kids Online Safety and Privacy Act in July 2024 (S. 2073) and ordered to be reported to the House in September 2024 (H.R. 7891)—would have imposed a “duty of care” and other regulations on certain online platforms reasonably likely to be used by minors.

First Amendment Litigation

Courts have enjoined some state laws on First Amendment grounds, preventing them from going into effect. The Supreme Court weighed in on the Florida and Texas content moderation laws in Moody v. NetChoice, LLC, 144 S. Ct. 2383 (2024), holding that some applications of the laws affect platforms’ protected rights to make editorial decisions about the content they display. The Court opined that when Facebook and YouTube decide which third-party content to display and how to organize that content, they are making constitutionally protected expressive choices. Other laws limiting platforms’ ability to host or exclude speech could infringe this right of editorial control.

Apart from editorial control concerns, courts may apply heightened constitutional scrutiny to laws that target specific types of online content. This heightened scrutiny makes it more difficult for the government to establish that a challenged law is constitutional. Specifically, courts usually consider a content-based law—one that applies to speech based on its subject matter, topic, or viewpoint—to be presumptively unconstitutional. As mentioned, however, the government generally can prohibit so-called “unprotected” categories of speech such as obscenity. In January 2025, the Supreme Court heard arguments in Free Speech Coalition v. Paxton, a case involving a Texas age-verification requirement for certain websites. Because the law is aimed at protecting minors from sexually explicit content, a lower court held that it is not subject to heightened scrutiny and is constitutional. The parties challenging that ruling argue that the law unconstitutionally burdens adults’ right to access non-obscene sexual expression online.

Disclosure requirements may be subject to a different constitutional analysis. Federal appeals courts largely upheld disclosure provisions in Texas’s and Florida’s laws after evaluating them under a lower level of constitutional scrutiny that applies to commercial speech. In contrast, a different federal appeals court concluded that California’s CAADCA violated the First Amendment by requiring covered businesses to report on the risk that their services will expose children to harmful content. The court held that this requirement reached beyond commercial speech.

Laws regulating content moderation procedures without focusing on the subject matter or ideas in that content might trigger a lower standard of constitutional review. Laws that are content neutral—that do not turn on a particular topic or viewpoint—are usually subject to a less demanding First Amendment test that is easier for the government to satisfy.

Policy Considerations

In addition to constitutional considerations, policy issues for Congress may include (1) addressing concerns regarding social media platforms and content, such as the spread of harmful content and misinformation and the lack of data privacy protections; (2) ensuring a viable consumer-focused tech sector driven by innovation and competitiveness; and (3) addressing the question of federal regulatory authority over social media platforms.

Congress may weigh a range of options to address these concerns. For example, Congress may continue to support the current mix of federal and state regulation and industry self-regulation. Congress may also exercise oversight of existing regulatory frameworks, conducting investigations and hearings on industry practices and agency enforcement. Congress might incentivize social media companies to establish voluntary or collaborative rules and standards in response to pressure from stakeholders, the public, or potential litigation. Congress may assess court opinions in litigation related to social media and determine whether to provide legislative solutions. Lastly, Congress may enact legislation that would provide specific regulatory authority to federal agencies. If Congress chooses to legislate, considerations may include the following:

Covered Entities. Whether to cover entities operating large social media platforms (e.g., those with a certain number of active users or specific revenue thresholds), some other subset of platforms, all social media platforms, or all online platforms.

Content Moderation. Whether to prohibit content moderation, require moderation of defined harmful content, or provide flexibility regarding the choice of moderated content. Congress might consider whether to amend Section 230, for example, by reforming liability protections for social media platforms’ content moderation practices. Congress might consider imposing transparency and accountability requirements, such as requiring disclosure of social media algorithms and content moderation practices. Congress might also address users’ rights regarding what content they see.

Enforcement. Whether an existing agency (e.g., the Federal Trade Commission or Federal Communications Commission) or a new agency would enforce new requirements established in law. Congress might also consider whether to include a private right of action allowing lawsuits for violations of the law.

Peter J. Benson, Legislative Attorney
Valerie C. Brannon, Legislative Attorney
Victoria L. Killion, Legislative Attorney
Ling Zhu, Analyst in Telecommunications Policy


IF12904

Disclaimer

This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you wish to copy or otherwise use copyrighted material.