Kids Online Safety Act
August 5, 2024
On July 30, 2024, the Senate passed the Kids Online Safety and Privacy Act (S. 2073). Title I of the act includes provisions similar to the Kids Online Safety Act (KOSA; S. 1409), and Title II includes provisions similar to the Children and Teens’ Online Privacy Act (S. 1418, commonly referred to as COPPA 2.0).
On May 23, 2024, the House Energy and Commerce Subcommittee on Innovation, Data, and Commerce forwarded to the full committee a different version of KOSA (H.R. 7891) and a discussion draft of the American Privacy Rights Act (APRA; H.R. 8818), which includes provisions similar to the House’s COPPA 2.0 bill (H.R. 7890). The House Committee on Energy and Commerce canceled a markup scheduled for June 27, 2024, that would have included both KOSA and APRA.
This In Focus provides a summary of Title I of S. 2073, compares it with H.R. 7891, and offers some considerations for Congress. For more information on COPPA 2.0, see CRS Legal Sidebar LSB11161, The American Privacy Rights Act, by Chris D. Linebaugh et al.
Title I of S. 2073 would create requirements for covered platforms that are used by, or reasonably likely to be used by, minors. It defines covered platforms as online platforms, online video games, messaging applications (apps), and video streaming services that connect to the internet, with some exceptions (e.g., email providers and certain news and sports websites and apps). An online platform is defined as “any public-facing website, online service, online app, or mobile app that predominantly provides a community forum for user-generated content.” The bill defines minor as an individual under the age of 17 and child as an individual under the age of 13.
The requirements for covered platforms would include the following:
• Duty of Care. Covered platforms would be required to “exercise reasonable care in the creation and implementation of any design feature” to prevent and mitigate certain harms to minors, including (1) certain mental health disorders (specifically anxiety, depression, eating disorders, substance use disorders, and suicidal behavior); (2) patterns of use that indicate or encourage compulsive usage by minors; (3) physical violence, cyberbullying, and discriminatory harassment of a minor; (4) sexual exploitation and abuse of minors; and (5) promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol. The platforms would not be required to prevent a minor from “deliberately and independently searching for, or specifically requesting, content.”
• Safeguards for Minors. Covered platforms would be required to provide safeguards for a user that the platform knows is a minor that (1) limit the ability for others to communicate with the minor; (2) prevent others from viewing a minor’s personal data that are collected or shared by the platform; (3) limit design features that encourage or increase the frequency, time spent, or activity of minors; (4) control personalized recommendation systems, including the ability to display content in chronological order and limit certain types of recommendations; and (5) restrict the sharing of a minor’s geolocation and provide notice when the minor’s geolocation is tracked. The platforms would be required to set the default setting for these safeguards at the most protective level and provide minors the option to limit the amount of time spent on the platform. The platforms would be prohibited from facilitating advertisements of narcotic drugs, tobacco products, gambling, or alcohol to a minor.
• Parental Tools. Covered platforms would be required to provide tools for parents of a user that the platform knows is a minor that allow the parent to (1) view a minor’s privacy and account settings, including the safeguards mentioned above; (2) change and control a child’s privacy and account settings; (3) restrict a minor’s purchases and financial transactions; and (4) view metrics and restrict time spent on the platform by the minor. The platforms would be required to provide users notice of which tools have been enabled.
• Reporting Mechanism. Covered platforms would be required to provide a means for parents, minors, and schools to submit reports about harms to minors. The platforms would need to substantively respond within 10 days if they average more than 10 million U.S. monthly active users, within 21 days if they average fewer than 10 million U.S. monthly active users, and as promptly as needed if the report involves an imminent safety threat.
• Disclosure. Covered platforms would be required to provide notice about safeguards and parental tools prior to registration or purchase if the platform knows that a user is a minor and obtain verifiable parental consent if the platform knows the user is a child.
• Transparency. A covered platform with more than 10 million U.S. monthly active users that “predominantly provides a community forum for user-generated content and discussion” would be required to issue a public report at least once a year describing “reasonably foreseeable risks of harms to minors and assessing the prevention and mitigation measures taken” based on an independent, third-party audit. The report would be required to include certain information.
Title I would direct the Federal Trade Commission (FTC) to enter into a contract with the National Academy of Sciences to conduct comprehensive studies on “the risk of harms to minors by use of social media and other online platforms.” It would direct the Secretary of Commerce, in coordination with the FTC and Federal Communications Commission, to conduct a study on potential options to verify a user’s age at the device or operating system level. It would also direct the FTC to issue guidance for covered platforms seeking to conduct market- and product-focused research on minors and guidance related to several provisions in the bill, such as design features that encourage or increase time spent on covered platforms. Finally, it would direct the Secretary of Commerce to establish and convene the Kids Online Safety Council—consisting of representatives from certain federal agencies, academic experts, and other stakeholders—that would provide advice related to some of the provisions.
Title I also includes a Filter Bubble Transparency subtitle that would create additional requirements for online platforms that use an opaque algorithm, defined as “an algorithmic ranking system that determines the selection, order, relative prioritization, or relative prominence of information,” except when used to provide age-appropriate content. These platforms would be required to notify users that an opaque algorithm is used and when changes are made. Platforms would also be required to enable users to switch between the opaque algorithm and an input-transparent algorithm, defined as an algorithmic ranking system that does not use user-specific data unless it is expressly provided by the user, such as search terms or saved preferences.
The FTC would enforce Title I. State attorneys general would also enforce the requirements for covered platforms listed above, except for the duty of care provision.
Some differences between S. 2073 and H.R. 7891 include the following:
• The duty of care requirement in S. 2073 would apply to covered online platforms, whereas in H.R. 7891, the requirement would apply only to high impact online companies. H.R. 7891 defines high impact online companies as an online platform or online video game company that generates $2.5 billion in annual revenue or has 150 million global monthly active users and is primarily used to access or share user-generated content.
• S. 2073 defines know or knows as “actual knowledge or knowledge fairly implied on the basis of objective circumstances.” H.R. 7891 creates different knowledge standards based on the size of the platform: “knew or should have known” for a high impact online company; “knew or acted in willful disregard” for a covered platform with an annual gross revenue of at least $200 million that collects personal information from at least 200,000 individuals and that does not meet the definition of a high impact online company; and “actual knowledge” for other covered platforms. S. 2073 would also direct the FTC to issue guidance on the knowledge standard, including best practices and examples. H.R. 7891 does not include a similar provision.
• S. 2073 would require covered platforms to provide labels and information about advertisements to minors and indicate when content is an advertisement or marketing material, including disclosures of endorsements made by other users of the platform. H.R. 7891 does not include a similar provision.
• H.R. 7891 would direct the Secretary of Education, in consultation with the FTC and Kids Online Safety Council, to issue guidance to assist elementary and secondary schools in using the notice, safeguards, and tools required from covered platforms. S. 2073 does not include a similar provision.
There are some differences within certain definitions and provisions as well. For example, the definition of personalized recommendation system in S. 2073 lists fewer exceptions than in H.R. 7891.
Some platforms have implemented safeguards for minors that might meet some of the requirements in Title I of S. 2073, potentially in response to congressional concerns. For example, Meta has implemented teen privacy and safety settings, such as reminders to take a break. TikTok offers safeguards for minors, including setting accounts to private by default. Snapchat’s Family Center offers parental controls, such as allowing parents to view their teen’s privacy and safety settings. No federal law currently requires platforms to offer these safeguards; some platforms might stop offering safeguards, and others might never offer them.
If S. 2073 were enacted, some of the requirements might be subject to legal challenges. Some groups have argued that certain provisions might violate rights protected by the Free Speech Clause of the First Amendment. Recent state laws enacted to protect children online have been similarly subject to First Amendment challenges.
Some operators might use different age verification methods to identify minors on their platforms, while others might implement changes for all users. For example, a platform might provide notice about its safeguards and parental tools to all users, particularly if it does not have a way to determine whether an individual is a minor before the individual registers with or purchases the platform.
Some of the requirements might make it more costly to operate platforms. For example, implementing a reporting mechanism and hiring a third-party auditor to evaluate the risk prevention and mitigation efforts taken by the platform might be costly, particularly for platforms with limited resources. The requirements might also encourage developers and companies to design their platforms to include or exclude certain features. For example, a platform might limit the ability for users to communicate with others on the platform if it does not wish to implement safeguards. If the requirements reduce harms to minors, that benefit might outweigh the potential associated costs.

Clare Y. Cho, Specialist in Industrial Organization and Business Policy
IF12730
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you wish to copy or otherwise use copyrighted material.