Kids Online Safety Act
Updated September 27, 2024
On July 30, 2024, the Senate passed an amended version of S. 2073, titled the Kids Online Safety and Privacy Act. Title I of the act includes provisions similar to those in the Kids Online Safety Act (KOSA; S. 1409). On September 18, 2024, the House Committee on Energy and Commerce ordered an amended House version of KOSA to be reported (H.R. 7891). This In Focus summarizes Title I of S. 2073, compares it with the amended version of H.R. 7891, and provides considerations for Congress.
Title I of S. 2073 would create requirements for covered platforms that are used by, or reasonably likely to be used by, minors. It defines covered platforms as online platforms, online video games, messaging applications (apps), and video streaming services that connect to the internet, with some exceptions (e.g., email providers and certain news and sports websites and apps). An online platform is defined as “any public-facing website, online service, online [app], or mobile [app] that predominantly provides a community forum for user-generated content.” The bill defines minor as an individual under the age of 17 and child as an individual under the age of 13.
The requirements for covered platforms would include the following:
• Duty of Care. Covered platforms would be required to
“exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate” harms to minors, including (1) certain mental health disorders (anxiety, depression, eating disorders, substance use disorders, and suicidal behavior); (2) patterns of use that indicate or encourage addiction-like behaviors by minors; (3) physical violence, online bullying, and harassment of the minor; (4) sexual exploitation and abuse of minors; (5) promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol; and (6) predatory, unfair, or deceptive marketing practices or other financial harms. The platforms would not be required to prevent a minor from “deliberately and independently searching for, or specifically requesting, content.”
• Safeguards for Minors. Covered platforms would be
required to provide safeguards for a user whom they know is a minor that (1) limit the ability for others to communicate with the minor; (2) prevent others from viewing a minor’s personal data that are collected or shared by the platform; (3) limit design features that encourage or increase the frequency, time spent, or activity of minors; (4) control personalized recommendation systems, including the ability to display content in chronological order and limit certain
types of recommendations; and (5) restrict the sharing of a minor’s geolocation and provide notice when the minor’s geolocation is tracked. The platforms would be required to set the default for these safeguards at the most protective level and provide minors the option to limit the amount of time spent on the platform. The platforms would be prohibited from facilitating advertisements of narcotic drugs, tobacco products, gambling, or alcohol to a minor.
• Parental Tools. Covered platforms would be required
to provide tools for parents of a user whom the platform knows is a minor that allow the parent to (1) view a minor’s privacy and account settings, including the safeguards mentioned above; (2) change and control a child’s privacy and account settings; (3) restrict a minor’s purchases and financial transactions; and (4) view metrics and restrict time spent on the platform by the minor. The platforms would be required to provide users with notice of which tools have been enabled.
• Reporting Mechanism. Covered platforms would be
required to provide a means for parents, minors, and schools to submit reports about harms to minors. The platforms would need to substantively respond within 10 days if they average more than 10 million U.S. monthly active users, within 21 days if they average fewer than 10 million U.S. monthly active users, and as promptly as needed if the report involves an imminent safety threat.
• Disclosure. Covered platforms would be required to
provide notice about safeguards and parental tools prior to registration or purchase if the platform knows that a user is a minor and obtain verifiable parental consent if the platform knows the user is a child.
• Transparency. A covered platform with more than 10
million U.S. monthly active users that “predominantly provides a community forum for user-generated content and discussion” would be required to issue a public report at least once a year describing, among other things, “reasonably foreseeable risks of harms to minors and assessing the prevention and mitigation measures taken” based on an independent, third-party audit.
Title I would direct the Federal Trade Commission (FTC) to enter into a contract with the National Academy of Sciences to conduct comprehensive studies on “the risk of harms to minors by use of social media and other online platforms.” It would direct the Secretary of Commerce, in coordination with the FTC and Federal Communications Commission, to conduct a study on potential options to verify a user’s age at the device or operating system level. It would also direct the FTC, in consultation with the
Secretary of Commerce, to issue guidance for covered platforms seeking to conduct market- and product-focused research on minors and guidance related to several provisions in the bill, such as design features that encourage or increase time spent on covered platforms. Additionally, it would direct the Secretary of Commerce to establish and convene the Kids Online Safety Council—consisting of representatives from certain federal agencies, academic experts, and other stakeholders—that would provide advice related to some of the provisions.
Title I also includes a Filter Bubble Transparency subtitle that would create additional requirements for online platforms that use an opaque algorithm, defined as “an algorithmic ranking system that determines the selection, order, relative prioritization, or relative prominence of information,” except when used to provide age-appropriate content. These platforms would be required to notify users that an opaque algorithm is used and when changes are made. Platforms would also be required to enable users to switch between the opaque algorithm and an input-transparent algorithm, defined as an algorithmic ranking system that does not use user-specific data unless the user expressly provides it, such as search terms or saved preferences.
The FTC would enforce Title I. State attorneys general would also enforce the requirements for covered platforms listed above, except for the duty of care provision.
Some differences between S. 2073 and H.R. 7891 include the following:
• The duty of care requirement in S. 2073 would apply to
covered platforms. In H.R. 7891, the requirement would apply only to high impact online companies, which are defined as online platforms or online video game companies that generate $1 billion in annual revenue or have 100 million global monthly active users and are primarily used to access or share user-generated content. Some of the harms listed under the duty of care requirement also differ. For example, H.R. 7891 does not mention “certain mental health disorders” or online bullying, and S. 2073 does not include “promotion of inherently dangerous acts that are likely to cause serious bodily harm, serious emotional disturbance, or death.”
• S. 2073 would define know or knows as “actual
knowledge or knowledge fairly implied on the basis of objective circumstances.” H.R. 7891 would create different knowledge standards based on the size of the platform: “knew or should have known” for a high impact online company; “knew or acted in willful disregard” for a covered platform that does not meet the definition of a high impact online company, has an annual gross revenue of at least $200 million, and collects personal information from at least 200,000 individuals; and “actual knowledge” for other covered platforms. S. 2073 would also direct the FTC to issue guidance on the knowledge standard, including best practices and examples; H.R. 7891 would not.
• S. 2073 would require “an assessment of the reasonably
foreseeable risk of harms to minors posed by the covered platform” in the public reports required from covered platforms, whereas H.R. 7891 would require “an assessment of harms to minors based on aggregate data on the exercise of safeguards and parental tools.”
• H.R. 7891 would prohibit an online platform from
conducting market- and product-focused research without obtaining parental consent if the user is a minor. S. 2073 would instead direct the FTC to issue guidance on such research.
• S. 2073 would require covered platforms to provide
labels and information about advertisements to minors and indicate when content is an advertisement or marketing material, including disclosures of endorsements made by other users of the platform. H.R. 7891 does not include a similar provision.
There are some differences within certain definitions and provisions as well. For example, the definition of personalized recommendation system in S. 2073 lists fewer exceptions than in H.R. 7891.
Some platforms have implemented safeguards for minors that might meet some of the requirements in Title I of S. 2073, potentially in response to congressional concerns. For example, Instagram plans to implement protections, such as making accounts private by default, for users under the age of 16; parents would need to provide consent to change these protections. TikTok offers safeguards for minors, including setting accounts to private by default. Snapchat’s Family Center offers parental controls, such as allowing parents to view their teen’s privacy and safety settings. No federal law requires these safeguards; some platforms might stop offering them, and others might never offer them.
If S. 2073 were enacted, some of the requirements might be subject to legal challenges. Some groups have argued that certain provisions might violate rights protected by the Free Speech Clause of the First Amendment. Recent state laws enacted to protect children online have been similarly subject to First Amendment challenges.
Some operators might use different age verification methods to identify minors on their platforms, while others might implement changes for all users. For example, a platform might provide notice about its safeguards and parental tools to all users, particularly if it cannot determine if an individual is a minor before the individual registers.
Some of the requirements might make it more costly to operate platforms. For example, implementing a reporting mechanism and hiring a third-party auditor to evaluate the platform’s risk prevention and mitigation efforts might be costly, particularly for those with limited resources. The requirements might also encourage companies to create platforms that include or exclude certain features. For example, a platform might limit a user’s ability to communicate with others if it does not want to implement safeguards. If the requirements reduce harms to minors, that benefit might outweigh the potential associated costs.
Clare Y. Cho, Specialist in Industrial Organization and Business Policy
IF12730
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you wish to copy or otherwise use copyrighted material.