Social Media: Misinformation and Content Moderation Issues for Congress

January 27, 2021

Jason A. Gallo, Section Research Manager
Clare Y. Cho, Analyst in Industrial Organization and Business

Social media platforms disseminate information quickly to billions of global users. The Pew Research Center estimates that in 2019, 72% of U.S. adults used at least one social media site and that the majority of users visited the site at least once a week.

Some Members of Congress are concerned about the spread of misinformation (i.e., incorrect or inaccurate information) on social media platforms and are exploring how it can be addressed by companies that operate social media sites. Other Members are concerned that social media operators’ content moderation practices may suppress speech. Both perspectives have focused on Section 230 of the Communications Act of 1934 (47 U.S.C. §230), enacted as part of the Communications Decency Act of 1996, which broadly protects operators of “interactive computer services” from liability for publishing, removing, or restricting access to another’s content.
Social media platforms enable users to create individual profiles, form networks, produce content by posting text, images, or
videos, and interact with content by commenting on and sharing it with others. Social media operators may moderate the
content posted on their sites by allowing certain posts and not others. They prohibit users from posting content that violates
copyright law or solicits illegal activity, and some maintain policies that prohibit objectionable content (e.g., certain sexual or
violent content) or content that does not contribute to the community or service that they wish to provide. As private
companies, social media operators can determine what content is allowed on their sites, and content moderation decisions
could be protected under the First Amendment. However, operators’ content moderation practices have created unease that
these companies play an outsized role in determining what speech is allowed on their sites, with some commentators stating
that operators are infringing on users’ First Amendment rights by censoring speech.
Two features of social media platforms—the user networks and the algorithmic filtering used to manage content—can
contribute to the spread of misinformation. Users can build their own social networks, which affect the content that they see,
including the types of misinformation they may be exposed to. Most social media operators use algorithms to sort and
prioritize the content placed on their sites. These algorithms are generally built to increase user engagement, such as clicking
links or commenting on posts. In particular, social media operators that rely on advertising placed next to user-generated
content as their primary source of revenue have incentives to increase user engagement. These operators may be able to
increase their revenue by serving more ads to users and potentially charging higher fees to advertisers. Thus, algorithms may
amplify certain content, which can include misinformation, if it captures users’ attention.
The Coronavirus Disease 2019 (COVID-19) pandemic illustrates how social media platforms may contribute to the spread of
misinformation. Part of the difficulty addressing COVID-19 misinformation is that the scientific consensus about a novel
virus, its transmission pathways, and effective mitigation measures is constantly evolving as new evidence becomes
available. During the pandemic, the amount and frequency of social media consumption increased. Information about
COVID-19 spread rapidly on social media platforms, including inaccurate and misleading information, potentially
complicating the public health response to the pandemic. Some social media operators implemented content moderation
strategies, such as tagging or removing what they considered to be misinformation, while promoting what they deemed to be
reliable sources of information, including content from recognized health authorities.
Congress has held hearings to examine the role social media platforms play in the dissemination of misinformation. Members
of Congress have introduced legislation, much of it to amend Section 230, which could affect the content moderation
practices of interactive computer services, including social media operators. In 2020, the Department of Justice also sent draft
legislation amending Section 230 to Congress. Some commentators identify potential benefits of amending Section 230,
while others have identified potential adverse consequences.
Congress may wish to consider the roles of the public and private sector in addressing misinformation, including who defines
what constitutes misinformation. If Congress determines that action to address the spread of misinformation through social
media is necessary, its options may be limited by the reality that regulation, policies, or incentives to affect one category of
information may affect others. Congress may consider the First Amendment implications of potential legislative actions. Any
effort to address this issue may have unintended legal, social, and economic consequences that may be difficult to foresee.

Contents
Introduction
Overview of Social Media
    U.S. Social Media Use
    Content Moderation
Social Media Networks and Algorithms
    Network Structure
    Algorithmic Filtering and Prioritization
    Online Advertising
Example of Misinformation and Social Media: COVID-19 Misinformation in 2020
Context for Congressional Consideration
    Federal Proposals to Amend Section 230
    Commentary from Stakeholders on Amending Section 230
Considerations for Congress
    Potential Legislative Actions
    Concluding Thoughts

Figures
Figure 1. Percent of U.S. Adults Who Use at Least One Social Media Site, By Age
Figure 2. Social Media Advertising Revenue

Tables
Table B-1. Selected Legislation on Section 230 Introduced in the 116th Congress
Table B-2. Selected Legislation Addressing COVID-19 Misinformation Introduced in the 116th Congress

Appendixes
Appendix A. Social Media Definitions
Appendix B. Section 230 and COVID-19 Misinformation Legislation

Contacts
Author Information



Introduction
Social media platforms have become major channels for the dissemination, exchange, and
circulation of information to billions of users around the world. For years, Congress has been concerned with the use of the internet to host, distribute, and exchange potentially illegal,
harmful, and objectionable content, including graphic sexual content, extremist content, content
that may incite violence, and foreign propaganda. Attention has often focused on social media
platforms, based on their ability to disseminate information quickly and widely and their use of
algorithms to identify and amplify content that is likely to generate high levels of user
engagement.1
Some Members of Congress are concerned about social media dissemination of misinformation
(i.e., incorrect or inaccurate information, regardless of its origin or the intent of the individual
who disseminates it)2 and are exploring how social media platform operators can stop or slow that
dissemination via content moderation. Other Members’ interest in content moderation relates to
concerns that platform operators are moderating content that should not be restricted. Both
perspectives have focused on Section 230 of the Communications Act of 1934 (47 U.S.C. §230,
hereinafter Section 230), enacted as part of the Communications Decency Act of 1996.3 Section
230 broadly protects interactive computer service providers,4 including social media operators,
and their users from liability for publishing, and in some instances removing or restricting access
to, another user’s content.
An example of the role social media can play in the dissemination of information and
misinformation can be seen with the Coronavirus Disease 2019 (COVID-19) pandemic.5 The
spread of COVID-19 misinformation has complicated the public health response to COVID-19.6

1 Algorithms are computer processes that set rules for the data social media platforms receive. They help operators sort and prioritize content and can be used to tailor what a user sees at a particular time. For more information, see Appendix A.
2 Others sometimes use misinformation to mean incorrect or inaccurate information spread by someone believing it to be true, as distinct from disinformation, a term they reserve for false information deliberately spread to gain some advantage. For additional information on the definitions of misinformation and disinformation, see CRS In Focus IF11552, Considering the Source: Varieties of COVID-19 Information, by Catherine A. Theohary; Caroline Jack, Lexicon of Lies: Terms for Problematic Information, Data & Society Research Institute, August 9, 2017, at https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf; Claire Wardle and Hossein Derakhshan, “Thinking about ‘Information Disorder’: Formats of Misinformation, Disinformation, and Mal-Information,” in Cherilyn Ireton and Julie Posetti, Journalism, Fake News & Disinformation: Handbook for Journalism Education and Training (Paris: UNESCO Publishing, 2018), pp. 43-54, at https://en.unesco.org/sites/default/files/f._jfnd_handbook_module_2.pdf.
3 47 U.S.C. §230. While this provision is often referred to as Section 230 of the Communications Decency Act of 1996 (P.L. 104-104), it was enacted as Section 509 of the Telecommunications Act of 1996, which amended Section 230 of the Communications Act of 1934. See CRS Legal Sidebar LSB10306, Liability for Content Hosts: An Overview of the Communication Decency Act’s Section 230, by Valerie C. Brannon, and CRS Report R45650, Free Speech and the Regulation of Social Media Content, by Valerie C. Brannon; Jeff Kosseff, The Twenty-Six Words That Created the Internet (Ithaca, NY: Cornell University Press, 2019).
4 47 U.S.C. §230(f)(2) defines an interactive computer service as “any information service, system, or access software
provider that provides or enables computer access by multiple users to a computer server, including specifically a
service or system that provides access to the Internet and such systems operated or services offered by libraries or
educational institutions.”
5 For example, the World Health Organization has described the “over-abundance of information—some accurate and
some not”—that has accompanied the COVID-19 pandemic as an “infodemic.” World Health Organization, Novel
Coronavirus (2019-NCoV) Situation Report-13
, February 2, 2020, at https://www.who.int/docs/default-source/
coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf.
6 One proposed definition of health misinformation is information about a health phenomenon that is “contrary to the consensus of the scientific community,” but with the caveat that “what is considered true and false is constantly changing as new evidence comes to light and as techniques and methods are advanced.” Briony Swire-Thompson and David Lazer, “Public Health and Online Misinformation: Challenges and Recommendations,” Annual Review of Public Health 41, no. 1 (2020), pp. 433-451, at https://doi.org/10.1146/annurev-publhealth-040119-094127.


Public health communication plays a critical role in overcoming uncertainty and informing policy
and individual decisions.7 This highlights the challenge of identifying misinformation during a pandemic caused by a novel virus, particularly because the scientific consensus is under constant revision and not always unanimous. It also highlights the challenge of determining the accuracy of information in conditions of uncertainty. In some cases, misinformation may be easily identified by the content moderators employed by social media operators as information that is verifiably false, while in others, what is accurate or inaccurate may be a matter of judgment based on available evidence.
This report explores the role that social media can play in the spread of misinformation—in
addition to beneficial information—using the spread of incorrect or inaccurate COVID-19
information as an example. The report provides an overview of social media and content
moderation. It focuses on three main factors that contribute to the amplification and spread of
potential misinformation on social media—(1) the use of data mining and algorithms to sort,
prioritize, recommend, and disseminate information; (2) the maximization of user engagement,
and online advertising revenue for some social media operators, as the foundation of social media
companies’ business models; and (3) the range of content moderation practices across social
media platforms. It discusses options some Members of Congress have proposed to alter
incentives surrounding social media moderation practices to address potential misinformation and
concerns that other Members have raised about censorship. The report concludes with questions
that Congress might consider as it debates whether or not to take legislative action.
Overview of Social Media
Distinguishing features of social media include the primacy of user-generated content,8 the use of
algorithms by the social media operators to sort and disseminate content, and the ability of users
to interact among themselves by forming social networks (see Appendix A for definitions of
social media sites, users, algorithms, platforms, enabling infrastructure, and operators).9 Social
media users are both the producers and consumers of content. They can post text, images, and
videos and consume others’ content by viewing, sharing, or reacting to it.10 Users access social

7 Nicole M. Krause, Isabelle Freiling, Becca Beets, et al., “Fact-Checking as Risk Communication: The Multi-Layered Risk of Misinformation in Times of COVID-19,” Journal of Risk Research, April 22, 2020, pp. 1-8, at https://doi.org/10.1080/13669877.2020.1756385.
8 Users can be individuals, organizations, government agencies, and private firms, including news media (e.g., Washington Post, Fox News, New York Times).
9 Jonathan Obar and Steve Wildman, “Social Media Definition and the Governance Challenge: An Introduction to the Special Issue,” Telecommunications Policy, vol. 39, no. 9 (2015), pp. 745-750, at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2663153. In this report, when we refer to social media operators, we are focused primarily on the owners of the top nine social media sites, according to a 2019 survey conducted by the Pew Research Center (Pew Research Center, Social Media Fact Sheet, June 12, 2019, at https://www.pewresearch.org/internet/fact-sheet/social-media/).
10 Users can share content on social media sites by posting and reposting content or by sharing the initial post to select
individuals or to their entire network. Users can react to content by commenting on it or by “liking” it, indicating that
the user supports or “likes” the post. Some social media sites allow users to express different reactions as well. For
example, Facebook allows users to select an emoji (an icon expressing the emotion of the user), including a thumbs-up,
smiling face, frowning face, and a heart.

media platforms through internet-based interfaces, that is, websites or mobile applications (apps).
Social media operators host user-generated content on their platforms and “organize it, make it searchable, and [ ... ] algorithmically select some subset of it to deliver as front-page offerings, news feeds, subscribed channels, or personalized recommendations.”11 The technical infrastructure of social media platforms enables connections to other sites, apps, and data, and may allow third-party developers to build applications and services that integrate with platforms, which could provide third parties access to some user data and preferences.
Many social media operators do not charge their users to establish accounts and use at least some
of their services.12 These operators rely on revenue from advertisements they serve to users and
collect users’ data to target certain advertisements to specific users.13 User data includes
information about personal characteristics, preferences, and opinions provided by users when
setting up accounts, as well as information gleaned from posted content and online behaviors. The Interactive Advertising Bureau, an industry trade association, and the research firm eMarketer estimate that U.S. social media advertising revenue was roughly $36 billion in 2019, making up approximately 30% of all digital advertising revenue.14
Social media sites benefit from network effects; that is, an increasing number of users increases
the value of the site to other users.15 For example, an individual wishing to notify multiple
acquaintances about moving to a new city may choose to share the news on a specific social
media site if his or her acquaintances also use the site. Users may have accounts with multiple
social media sites, such that increased usage of one site may reduce the amount of time the user
spends on another. Therefore, social media operators have a strong incentive to capture as much
of their users’ attention as possible. They commonly employ computational techniques to promote
content that generates strong user engagement, which can be measured by the number of clicks on
links or the amount of time spent reading posts. Some social media sites allow users to link to content provided on other sites, permitting users to share content with larger networks and potentially increasing traffic on the sites.
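A rough sense of why network effects matter can be had from simple arithmetic: the number of potential connections among n users is n(n-1)/2, which grows roughly with the square of the user count. The short Python sketch below is purely illustrative and is not drawn from the report or any operator's data.

```python
# Illustrative arithmetic only: the pool of potential pairwise connections
# among n users is n*(n-1)/2, so it grows roughly quadratically with n.

def potential_connections(n_users: int) -> int:
    """Number of distinct user pairs that could connect on the site."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} users -> {potential_connections(n):>12,} potential pairs")
# Doubling the user base roughly quadruples the potential connections,
# one way to picture why each new user makes the site more valuable.
```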
Social media operators may remove, slow the spread of, or offer warnings for content they deem
objectionable. Social media operators are broadly protected from liability for publishing, and in
some instances removing or restricting access to, another user’s content by Section 230.16 The
authors of Section 230, former Representative Chris Cox and former Representative and current

11 Tarleton Gillespie, “Platforms Are Not Intermediaries,” Georgetown Technology Law Review, vol. 2, no. 2 (2018), pp. 198-216, at https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf.
12 Some social media operators, such as LinkedIn and Reddit, offer a premium version of their site with additional services for a monthly fee. Others allow users (which do not include advertisers) to access all of their services without a monthly fee (e.g., Facebook, Twitter). A few operators, such as MeWe, obtain their revenue from subscription fees and from selling custom emojis rather than online advertising.
13 David M. Lazer, Matthew A. Baum, Yochai Benkler, et al., “The science of fake news,” Science, vol. 359, no. 6380 (March 9, 2018), pp. 1094-1096, at https://science.sciencemag.org/content/359/6380/1094; Burt Helm, “How Facebook’s Oracular Algorithm Determines the Fates of Start-Ups,” New York Times, November 2, 2017, at https://www.nytimes.com/2017/11/02/magazine/how-facebooks-oracular-algorithm-determines-the-fates-of-start-ups.html.
14 Interactive Advertising Bureau, Internet Advertising Revenue Report: Full Year 2019 Results & Q1 2020 Revenues, May 2020, prepared by PricewaterhouseCoopers, at https://www.iab.com/wp-content/uploads/2020/05/FY19-IAB-Internet-Ad-Revenue-Report_Final.pdf; Debra Aho Williamson, US Social Trends for 2020: eMarketer’s Predictions for the Year Ahead, eMarketer, January 15, 2020, at https://www.emarketer.com/content/us-social-trends-for-2020.
15 Arun Sundararajan, “Network Effects,” author’s website, New York University Stern School of Business, at http://oz.stern.nyu.edu/io/network.html, viewed December 23, 2020.
16 47 U.S.C. §230.

Senator Ron Wyden, have each stated that their intent was to enable free speech and allow
interactive computer services to moderate content without government intervention.17 Section 230
has two relevant sections regarding content hosting and moderation: Section 230(c)(1), which
states that interactive computer service providers and users may not “be treated as the publisher
or speaker of any information provided by another” person; and Section 230(c)(2), which states
that interactive computer service providers and users may not be “held liable” for any “good
faith” action “to restrict access to or availability of material that the provider or user considers to
be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.”18
U.S. Social Media Use
The Pew Research Center estimated that in 2019, 72% of U.S. adults, or about 184 million U.S. adults,19 used at least one social media site, based on the results of a series of surveys.20 This was up from 5% in 2005. Use varied by age, with the highest percentages using social media being among the 18-29 year old and 30-49 year old cohorts (see Figure 1). Another report estimates that in January 2020, there were roughly 230 million social media users in the United States of all ages (13 is a standard minimum age to register an account on many social media sites), and that
users subscribed to an average of roughly seven social media accounts.21 The majority of U.S.
social media users report visiting the sites weekly and many report visiting the sites daily.22

17 Testimony of Christopher Cox in U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Communications, Technology, Innovation, and the Internet, The PACT Act and Section 230: The Impact of the Law that Helped Create the Internet and an Examination of Proposed Reforms for Today’s Online World, 116th Cong., 2nd sess., July 28, 2020, at https://www.commerce.senate.gov/services/files/BD6A508B-E95C-4659-8E6D-106CDE546D71; Christopher Cox, “Policing the Internet: A Bad Idea in 1996–and Today,” RealClearPolitics, June 25, 2020, at https://www.realclearpolitics.com/articles/2020/06/25/policing_the_internet_a_bad_idea_in_1996_—_and_today.html; Ron Wyden, “I wrote this law to protect free speech. Now Trump wants to revoke it,” CNN Business Perspectives, June 9, 2020, at https://www.cnn.com/2020/06/09/perspectives/ron-wyden-section-230/index.html.
18 CRS Legal Sidebar LSB10484, UPDATE: Section 230 and the Executive Order on Preventing Online Censorship, by Valerie C. Brannon et al.
19 CRS analysts calculated the 184 million U.S. adult figure using U.S. Census Bureau population estimates. The Census Bureau estimates that on July 1, 2019, there were 328,239,523 people in the United States and that 77.7% of these were 18 years or older. Census Bureau, QuickFacts: United States, at https://www.census.gov/quickfacts/fact/table/US/PST045219.
20 Pew Research Center, Social Media Fact Sheet, June 12, 2019, at https://www.pewresearch.org/internet/fact-sheet/social-media/.
21 Simon Kemp, Digital 2020: The United States of America, Datareportal, February 11, 2020, slides 17 and 42, at https://datareportal.com/reports/digital-2020-united-states-of-america.
22 Social Media Fact Sheet. The Pew Research Center survey results indicate that 74% of Facebook, 63% of Instagram, 61% of Snapchat, 51% of YouTube, and 42% of Twitter users report daily use in 2019.

Figure 1. Percent of U.S. Adults Who Use at Least One Social Media Site, By Age

Source: Pew Research Center, Internet and Technology, “Social Media Fact Sheet,” June 12, 2019.
https://www.pewresearch.org/internet/fact-sheet/social-media/. Based on surveys conducted 2005-2019.
Media consumption, including social media use, has increased during the COVID-19 pandemic.
This is likely a result, primarily, of entertainment venue closures and an increased amount of time
spent at home as many employees and students shifted to remote work and school. The Nielsen
Company reported a 215% increase in time spent on mobile devices accessing current news in the
United States in March 2020 compared to the year before.23 Facebook reported an increase of over 50% in total messaging across its offerings globally from February 2020, before most countries in Europe and North America had closed schools, offices, and public venues, to March 2020, when shutdowns became widespread.24 In April 2020, Kantar, a data and market research firm that surveyed over 25,000 individuals in 30 global markets, reported that social media usage had increased globally by 61% over normal usage rates since the start of the COVID-19 pandemic.25
Social media sites also serve as major venues for the circulation of digital content from both
online-only and traditional print and broadcast news outlets.26 Prior to the COVID-19 pandemic, a
2019 Pew Research Center report found that 55% of surveyed U.S. adults reported accessing
news through social media sites, and that 52% of U.S. adults reported using Facebook to access
news.27 The report also states that 88% of U.S. adults were aware that social media operators

23 The Nielsen Company, “COVID-19: Tracking the Impact on Media Consumption,” June 16, 2020, at https://www.nielsen.com/us/en/insights/article/2020/covid-19-tracking-the-impact-on-media-consumption.
24 Alex Schultz and Jay Parikh, “Keeping Our Services Stable and Reliable During the COVID-19 Outbreak,” About
Facebook
, March 24, 2020, at https://about.fb.com/news/2020/03/keeping-our-apps-stable-during-covid-19/.
25 Kantar, “COVID-19 Barometer: Consumer Attitudes, Media Habits and Expectations,” April 3, 2020, at
https://www.kantar.com/inspiration/coronavirus/covid-19-barometer-consumer-attitudes-media-habits-and-
expectations.
26 Philip M. Napoli, Social Media and the Public Interest: Media Regulation in the Disinformation Age (New York:
Columbia University Press, 2019), pp. 1-2.
27 Elisa Shearer and Elizabeth Grieco, Americans Are Wary of the Role Social Media Sites Play in Delivering the News, Pew Research Center, October 2, 2019, at https://www.journalism.org/2019/10/02/americans-are-wary-of-the-role-social-media-sites-play-in-delivering-the-news/.


exert some control over the mix of news that users see on their sites, and that 62% believe that
these operators have too much control over news content.28
Content Moderation
Social media operators maintain policies that prohibit users from posting certain content, such as
content that exhibits graphic violence, child sexual exploitation, and hateful content or speech.29
An operator may temporarily or permanently ban users that violate its policies, depending on the
operator’s perspective on the severity of the users’ violation(s). There is no uniform standard for
content moderation, resulting in practices varying across social media sites.30 Some operators
have chosen to release reports containing information on their content moderation practices, such
as the amount of content removed and the number of appeals,31 but operators are not required to
release this information.
Social media operators rely on several sources to identify content to flag or remove: (1) users, (2)
content moderators, and (3) automated systems, also known as artificial intelligence (AI)
technologies.32 Users can flag or mark inappropriate posts for content moderators to review and
remove when applicable. Automated systems can also flag and remove posts. Content
moderators, primarily contractors, may be able to identify nuanced violations of content policy,
such as taking into account the context of a statement.33 For example, in the first quarter of 2020,
AI technology flagged 99% of violent and graphic content and child nudity on Facebook for
review before any user reported it.34 In contrast, Facebook’s AI technology identified only 16% of
bullying and harassment content, suggesting content moderators are better able to identify this
form of policy violation.
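The division of labor described above, in which automated systems flag clear-cut categories and human moderators judge context-dependent ones such as bullying, can be pictured as a triage step. The Python sketch below is a minimal illustration under assumed thresholds and field names; it is not any operator's disclosed system.

```python
# Minimal triage sketch; all thresholds and fields are assumptions for
# illustration, not any operator's actual rules.
from dataclasses import dataclass

AUTO_REMOVE_THRESHOLD = 0.98  # assumed classifier confidence for auto-removal
REVIEW_THRESHOLD = 0.60       # assumed confidence that triggers human review

@dataclass
class Post:
    post_id: int
    text: str
    user_reports: int = 0
    classifier_score: float = 0.0  # 0..1, modeled likelihood of a violation

def triage(post: Post) -> str:
    """Route a post: remove automatically, queue for human review, or leave up."""
    if post.classifier_score >= AUTO_REMOVE_THRESHOLD:
        return "removed (automated)"
    if post.classifier_score >= REVIEW_THRESHOLD or post.user_reports > 0:
        return "queued for human review"  # moderators can weigh context
    return "left up"

for post in (
    Post(1, "clear-cut graphic violence", classifier_score=0.99),
    Post(2, "possible bullying, context needed", user_reports=3,
         classifier_score=0.16),
    Post(3, "benign post", classifier_score=0.05),
):
    print(post.post_id, "->", triage(post))
```

The low classifier score on the second post mirrors the 16% figure cited above: context-dependent violations tend to reach human reviewers through user reports rather than automated flags.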
Some social media operators may be compelled to rely more heavily on AI technologies to
moderate content. Some commentators have raised concern about whether repeatedly reviewing
graphic, explicit, and violent materials harms content moderators’ mental health.35 For example,
in 2020, Facebook reached a settlement in a class-action lawsuit filed by its content moderators

28 Ibid.
29 For example, Facebook and Twitter provide lists of inappropriate content at https://www.facebook.com/communitystandards/introduction and https://help.twitter.com/en/rules-and-policies/twitter-rules, respectively.
30 Marietje Schaake and Rob Reich, Election 2020: Content Moderation and Accountability, Stanford University Human-Centered Artificial Intelligence, Stanford Freeman Spogli Institute Cyber Policy Center, Issue Brief, October 2020, at https://hai.stanford.edu/sites/default/files/2020-10/HAI_CyberPolicy_IssueBrief_3.pdf.
31 For example, the latest reports released by Facebook and Twitter are available at https://transparency.facebook.com/community-standards-enforcement and https://transparency.twitter.com/en/reports/removal-requests.html, respectively.
32 Tarleton Gillespie, “Content Moderation, AI, and the Question of Scale,” Big Data & Society, vol. 7, no. 2 (2020): pp. 1-5, at https://doi.org/10.1177/2053951720943234. According to the article, only a few operators use machine learning techniques to identify new content that violates the social media sites’ policies. Most operators rely primarily on algorithms that are coded to identify specific phrases and images.
33 For example, according to a class-action lawsuit filed in September 2018 against Facebook and Pro Unlimited, Facebook had content moderators review more than 10 million potentially rule-breaking posts per week and sought to review all user-reported violations within 24 hours (Selena Scola v. Facebook Inc. and Pro Unlimited Inc., 18 CIV 05135 (San Mateo County Superior Court), at https://assets.documentcloud.org/documents/6889335/18-CIV-05135-Complaint.pdf). Social media operators do not publicly disclose the number of content violations that are flagged by users, content moderators, and AI technologies.
34 Paul Barrett, “Who Moderates the Social Media Giants? A Call to End Outsourcing,” NYU Stern Center for Business
and Human Rights, June 4, 2020, at https://bhr.stern.nyu.edu/blogs/2020/6/4/who-moderates-the-social-media-giants.
35 Ibid.

who claimed to have experienced post-traumatic stress disorder from reviewing content on its
sites; Facebook agreed to pay $52 million to its content moderators.36 During the COVID-19
pandemic, some content moderators worked remotely, but privacy and security concerns meant
some of the content moderation was done by automated systems.37 These systems can quickly
review large volumes of content “when scale problems make manual curation or intervention
unfeasible.”38
By relying more heavily on automated systems, social media operators may mistakenly remove or
fail to remove content. Thus, some operators have stated that no account would be permanently
suspended solely by an automated enforcement system during the COVID-19 pandemic.39 For
example, Facebook’s automated systems have reportedly removed ads from small businesses, mistakenly identifying them as content that violates its policies and causing the businesses to lose money during the appeals process.40 A wide range of small businesses have reportedly been affected by these mistakes, including a seed company whose photo of Walla Walla onions was flagged as overtly sexual and a solar roof company whose acronyms resembled cryptocurrency tokens.41 In 2019, Facebook restored 23% of the 76 million appeals it received, and restored an additional 284 million pieces of content without an appeal—about 2% of the
content that it took action on for violating its policies.42 During the COVID-19 pandemic, the
amount of content removed by Facebook and the amount restored without an appeal increased for
some categories—such as hate speech, bullying, and harassment—and decreased for other
categories, such as adult nudity and sexual activity.43
Some social media operators have altered their content moderation practices over time. For
example, in 2019, Twitter and Instagram released new policies to reduce bullying and hate speech
on their sites.44 Some of these changes may have partially been in response to criticism social media operators received for allowing certain content on their sites, such as hate speech against
Rohingya Muslims in Myanmar that spread on Facebook.45 Some operators have reportedly

36 Bobby Allyn, “In Settlement, Facebook to Pay $52 Million to Content Moderators with PTSD,” NPR, May 12, 2020, at https://www.npr.org/2020/05/12/854998616/in-settlement-facebook-to-pay-52-million-to-content-moderators-with-ptsd.
37 Shannon Bond, “Facebook, YouTube Warn of More Mistakes As Machines Replace Moderators,” NPR, March 31, 2020, at https://www.npr.org/2020/03/31/820174744/facebook-youtube-warn-of-more-mistakes-as-machines-replace-moderators; Elizabeth Dwoskin and Nitasha Tiku, “Facebook Sent Home Thousands of Human Moderators Due to Coronavirus. Now the Algorithms Are In Charge,” Washington Post, March 24, 2020, at https://www.washingtonpost.com/technology/2020/03/23/facebook-moderators-coronavirus/.
38 Robert Gorwa, Reuben Binns, and Christian Katzenbach, “Algorithmic Content Moderation: Technical and Political Challenges in the Automation of Platform Governance,” Big Data & Society, vol. 1, no. 15 (January-June 2020), p. 3, at https://journals.sagepub.com/doi/10.1177/2053951719897945.
39 Shannon Bond, “Facebook, YouTube Warn of More Mistakes As Machines Replace Moderators,” NPR, March 31, 2020, at https://www.npr.org/2020/03/31/820174744/facebook-youtube-warn-of-more-mistakes-as-machines-replace-moderators; “An Update On Our Continuity Strategy During COVID-19,” Twitter, updated April 1, 2020, at https://blog.twitter.com/en_us/topics/company/2020/An-update-on-our-continuity-strategy-during-COVID-19.html.
40 Sarah Frier, “Facebook’s AI Mistakenly Bans Ads for Struggling Businesses,” Bloomberg, November 27, 2020, at
https://www.bloomberg.com/news/articles/2020-11-27/facebook-s-ai-mistakenly-bans-ads-for-struggling-businesses.
41 Ibid.
42 “Community Standards Enforcement Report,” Facebook, November 2020, at https://transparency.facebook.com/
community-standards-enforcement.
43 These trends are based on comparing the numbers listed for the first and third quarters of 2020. Ibid.
44 Sara Harrison, “Twitter and Instagram Unveil New Ways to Combat Hate—Again,” Wired, July 11, 2019, at https://www.wired.com/story/twitter-instagram-unveil-new-ways-combat-hate-again/.
45 Tom Miles, “U.N. Investigators Cite Facebook Role in Myanmar Crisis,” Reuters, March 12, 2018, at https://www.reuters.com/article/us-myanmar-rohingya-facebook/u-n-investigators-cite-facebook-role-in-myanmar-crisis-idUSKCN1GO2PN.

reconsidered their approach to trade-offs between free expression and safety, such as taking a
harder line with removing misinformation.46 For example, Facebook partners with third-party
fact-checkers to review and rate the accuracy of articles and posts, placing those identified as
false lower in users’ news feeds.47 In addition, Facebook includes information about the publisher
of articles posted on its site and displays articles from the third-party fact-checkers below posts
on the same topic. Twitter labels content containing misleading information or disputed claims
that it determines to be “moderately harmful,” while removing misleading content that it
determines to be “severely harmful.”48 These actions were taken voluntarily by Facebook and
Twitter. Currently, the decision to moderate, or to not moderate, certain content is at the discretion
of each operator.
Misinformation can spread on social media sites, even with content moderation techniques
implemented by operators. Misinformation can spread before moderators discover, review, and
remove the content. To add further complication, users can share content across social media
platforms, meaning content can spread on another platform even after the original content is
removed. Users who recontextualize the original problematic content, for example, through
reposting content or posting screenshots of it, may complicate an operator’s enforcement of its
policies. In addition, some operators may choose not to remove some content that violates their
policies. For example, Facebook’s CEO Mark Zuckerberg stated on a post, “A handful of times a
year, we leave up content that would otherwise violate our policies if the public interest value
outweighs the risk of harm.”49
Through their content moderation practices, social media operators may remove content that
some users find valuable. Some commentators and legislators have raised concern that these
operators are removing too much content, including content from whistleblowers.50 As social
media sites have grown in popularity, they have created some unease that companies determine
what speech is acceptable.51 However, as private companies, social media operators are generally able to determine what content is allowed on their sites.52
Social Media Networks and Algorithms
Social media platforms are shaped by the structures of their user networks and computational
tools, such as algorithmic filtering, that operators use to manage large volumes of user-generated

46 “Social Media’s Struggle with Self-Censorship,” The Economist, October 22, 2020, at https://www.economist.com/
briefing/2020/10/22/social-medias-struggle-with-self-censorship.
47 Tessa Lyons, “Hard Questions: What’s Facebook’s Strategy for Stopping False News?,” Facebook, May 23, 2018, at https://about.fb.com/news/2018/05/hard-questions-false-news/.
48 Yoel Roth and Nick Pickles, “Updating Our Approach to Misleading Information,” Twitter, May 11, 2020, at
https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html.
49 Mark Zuckerberg, Facebook, June 26, 2020, at https://www.facebook.com/zuck/posts/10112048980882521.
50 “Social Media’s Struggle with Self-Censorship,” The Economist, October 22, 2020, at https://www.economist.com/
briefing/2020/10/22/social-medias-struggle-with-self-censorship.
51 Zeynep Tufekci, “Twitter Has Officially Replaced the Town Square,” Wired, December 27, 2017, at https://www.wired.com/story/twitter-has-officially-replaced-the-town-square/; “Social Media’s Struggle with Self-Censorship,” The Economist, October 22, 2020, at https://www.economist.com/briefing/2020/10/22/social-medias-struggle-with-self-censorship.
52 CRS Report R45650, Free Speech and the Regulation of Social Media Content, by Valerie C. Brannon.

content continually posted on their sites and increase user engagement with content.53 Both network structure and computational tools are intended to increase the number of users and user engagement.54 These components allow operators to increase their revenue, particularly for those
that have online advertisements on their platforms, but may also increase the spread of
misinformation that increases user engagement. Each social media operator balances incentives to
moderate and prioritize content to increase user engagement and its revenue.
Network Structure
Social media users can establish connections to other users of a site, creating social networks or
communities that can be based on common interests, relationships that exist offline, employment,
or other factors. The structure of these networks affects how individuals search for one another and how connections are initiated and established,55 which can also depend on the level of privacy offered by the operator and chosen by each user. For example, some social media sites allow
users to choose whether to make their profiles open to the public or only to those who have
established connections by mutual consent.
On some social media sites, users can limit the content that they see through the networks they
choose to build. Each user can choose to follow or stop following other users, including those who post content that the user disagrees with. Thus, social media sites can facilitate “echo chambers” or “filter bubbles,” where a user’s ideas are reiterated and reinforced by others while other ideas may be excluded.56 Some research has shown that the overlap of networks (i.e., those with common followers) increases the likelihood that two users will share content through the
network, although this effect depends on the novelty of the content.57 Echo chambers can enhance
the spread of information, including but not limited to misinformation, particularly before the
information “goes viral” (i.e., spreads rapidly on the internet).58
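One simple way to picture the network overlap examined in that research is the share of followers two users have in common. The Python sketch below is illustrative only; the accounts are hypothetical, and the Jaccard measure is a convenient stand-in rather than the metric used in the cited study.

```python
# Illustrative only: follower-set overlap as a crude echo-chamber signal.

def follower_overlap(followers_a: set, followers_b: set) -> float:
    """Share of the combined audience that follows both users (0 to 1)."""
    if not followers_a and not followers_b:
        return 0.0
    return len(followers_a & followers_b) / len(followers_a | followers_b)

alice = {"u1", "u2", "u3", "u4"}   # hypothetical follower sets
bob = {"u2", "u3", "u4", "u5"}
print(f"overlap = {follower_overlap(alice, bob):.2f}")  # 0.60
# High overlap means a post recirculates within one community, an echo
# chamber dynamic; low overlap exposes it to new audiences instead.
```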
Social media operators often have economic incentives to encourage users to expand their
networks, as the value of a site to a user increases as more users join or increase their activity on

53 Michael Bosetta, “The Digital Architectures of Social Media: Comparing Political Campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. Election,” Journalism & Mass Communication Quarterly, vol. 95, no. 2 (2018), pp. 471-496. The article includes datafication as a separate category and includes a fourth component—functionality—which includes aspects such as the graphical interface. While these may be important aspects of the social media structure, they are less relevant to the spread of misinformation, and thus are not discussed in this report.
54 Renee DiResta, “Free Speech Is Not the Same As Free Reach,” Wired, August 30, 2018, at https://www.wired.com/
story/free-speech-is-not-the-same-as-free-reach/.
55 Michael Bosetta, “The Digital Architectures of Social Media: Comparing Political Campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. Election,” Journalism & Mass Communication Quarterly, vol. 95, no. 2 (2018), pp. 471-496; Danah Boyd, “Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications,” A Networked Self: Identity, Community, and Culture on Social Network Sites (New York, NY: Routledge, 2011).
56 “Digital Media Literacy—What Is an Echo Chamber?,” Goodwill Community Foundation Inc., at https://edu.gcfglobal.org/en/digital-media-literacy/how-filter-bubbles-isolate-you/1/; Christopher Seneca, “How to Break Out of Your Social Media Echo Chamber,” Wired, September 17, 2020, at https://www.wired.com/story/facebook-twitter-echo-chamber-confirmation-bias/.
57 Jing Peng, Ashish Agarwal, Kartik Hosanagar, et al., “Network Overlap and Content Sharing on Social Media
Platforms,” Journal of Marketing Research, vol. 55 (August 2018), pp. 571-585, at https://journals.sagepub.com/doi/
10.1509/jmr.14.0643.
58 Petter Törnberg, “Echo Chambers and Viral Misinformation: Modeling Fake News as Complex Contagion,” PLOS ONE, vol. 13, no. 9 (2018); Michela Del Vicario, Alessandro Bessi, Fabiana Zollo, et al., “The Spreading of Misinformation Online,” Proceedings of the National Academy of Sciences, vol. 113, no. 3 (January 19, 2016), pp. 554-559, at https://www.pnas.org/content/113/3/554.

the site. Some social media sites recommend connections based on peripheral connections (i.e.,
someone who is a friend of one of the user’s friends) and often allow users to search for others,
using their name, email address, occupation, or other information.59 Expanding the number of
users increases the number of possible connections and recommendations, which can encourage
even more individuals to join, exposing more users to advertisements that generate revenue for
the social media operator.
Algorithmic Filtering and Prioritization
Social media sites contain large amounts of content. Over the last decade, decreased costs of
social media enabling infrastructure have made it possible for operators to increase the amount of
user-generated content that they maintain.60 Operators use algorithms to sort, index, curate, and
prioritize user content, as well as to suppress illegal and other content the operator chooses to
moderate. Social media operators can change or refine their algorithms to meet evolving business
goals in response to internal incentives (e.g., maximizing engagement, increasing advertising
revenue) and external pressures (e.g., user complaints, stakeholders), affecting what users see,
what content is privileged and promoted, and what content rapidly spreads across the platform
(i.e., “goes viral”).61 Specifics about the algorithms that social media operators use are considered
proprietary and are not publicly available, although there is a general understanding of how these
algorithms work.
Each user’s activities are quantified and used to determine the selection, sequence, and visibility
of posts.62 For example, Facebook states that its News Feed prioritizes recent content that is found
to be relevant to the user, based on factors such as previous engagement with the content
provider.63 The algorithms also may prioritize content that is likely to sustain user engagement—
such as sharing, commenting on, or reacting to content—rather than the content’s veracity.64
According to a Wall Street Journal article, slides presented by an internal Facebook team to
company executives in 2018 stated, “Our algorithms exploit the human brain’s attraction to
divisiveness,” and warned that the algorithms would promote “more and more divisive content in

59 Danah Boyd, “Social Network Sites as Networked Publics: Affordances, Dynamics, and Implications,” in A Networked Self: Identity, Community, and Culture on Social Network Sites (New York, NY: Routledge, 2011).
60 A definition of “social media enabling infrastructure” is provided in Appendix A. Jonathan Obar and Steve Wildman, “Social Media Definition and the Governance Challenge: An Introduction to the Special Issue,” Telecommunications Policy, vol. 39, no. 9 (2015), pp. 745-750, at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2663153.
61 Leo Mirani, “The World in 2019: Slow Social,” The Economist, 2019, at https://worldin2019.economist.com/slowingdownsocialmedia.
62 Taina Bucher, “Want to be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook,” New Media & Society, vol. 14, no. 7 (2012): 1164-1180, at https://journals.sagepub.com/doi/abs/10.1177/1461444812440159; Michael Bosetta, “The Digital Architectures of Social Media: Comparing Political Campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. Election,” Journalism & Mass Communication Quarterly, vol. 95, no. 2 (2018), pp. 471-496, at https://journals.sagepub.com/doi/full/10.1177/1077699018763307.
63 “How News Feed Works,” Facebook Help Center, accessed on October 28, 2020, at https://www.facebook.com/help/1155510281178725; Kelley Cotter, Janghee Cho, and Emilee Rader, “Explaining the News Feed Algorithm: An Analysis of the ‘News Feed FYI’ Blog,” in Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA ’17, pp. 1553-1560, Denver, CO: ACM Press, 2017, at https://doi.org/10.1145/3027063.3053114.
64 Michael Bosetta, “The Digital Architectures of Social Media: Comparing Political Campaigning on Facebook, Twitter, Instagram, and Snapchat in the 2016 U.S. Election,” Journalism & Mass Communication Quarterly, vol. 95, no. 2 (2018), pp. 471-496, at https://journals.sagepub.com/doi/pdf/10.1177/1077699018763307.

an effort to gain user attention and increase time on the platform.”65 One study found that users
are more likely to read and share emotional news content and content that provides relevant and
practical information, particularly positive news.66
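A stylized model can make the incentive concrete. In the Python sketch below, whose weights and features are assumptions rather than any operator's disclosed algorithm, a post's rank combines recency, the viewer's prior interactions with the author, and predicted engagement, and no term rewards accuracy.

```python
# Toy ranking model; every weight and feature here is an assumption.
import math

def rank_score(hours_old: float, predicted_engagement: float,
               prior_interactions_with_author: int) -> float:
    recency = math.exp(-hours_old / 24)  # decays over roughly a day
    affinity = math.log1p(prior_interactions_with_author)
    return predicted_engagement * (1 + affinity) * recency

posts = [
    ("measured news report", rank_score(2, 0.10, 5)),
    ("divisive viral claim", rank_score(2, 0.60, 0)),
]
for name, score in sorted(posts, key=lambda p: p[1], reverse=True):
    print(f"{score:.3f}  {name}")
# The divisive post ranks first (0.552 vs. 0.257) purely because more users
# are predicted to click, share, or comment on it; nothing checks veracity.
```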
Some social media operators have made changes to their algorithms. For example, in 2018,
Facebook started prioritizing “meaningful posts,” or those shared by family and friends rather
than news organizations and brands.67 Some social media operators allow users to personalize which content is prioritized. On May 31, 2019, Facebook launched the tool “Why am I seeing this post?” It allows users to see why the content was posted on their news feed—such as whether the post is from someone in their network or if they have previously interacted with similar posts—and allows users to adjust their preferences, such as prioritizing posts from specific people or pages.68 On Twitter, users can prioritize content through their searches or lists they have created,69 and can opt to see content in reverse chronological order only from accounts that a user follows. Users can also choose to follow certain “topics,” which allows users to follow the most popular
conversations about a specific topic.70 Information on how these changes are incorporated into the
algorithms is not publicly available.
Users can try to use the algorithms on social media to make their content go viral. They can
partner with “influencers,” that is, users with a large number of followers,71 and try to have their
content reposted by popular accounts.72 Social media sites also benefit from content going viral,
which could attract more users and encourage users to spend more time on their sites.
Internet bots—software applications that can perform automated tasks such as rapidly posting,
liking, and recirculating content on social media sites using inauthentic accounts—can affect the
prioritization of content on social media sites and may be used to spread misinformation.73 Bots
can post or amplify content by engaging with it. For example, a bot may be programmed to
search for and respond to posts containing specific words or phrases. This can cause algorithms

65 Jeff Horowitz and Deepa Seetharaman, “Facebook Executives Shut Down Efforts to Make the Site Less Divisive,”
Wall Street Journal, May 26, 2020, at https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-
executives-nixed-solutions-11590507499.
66 Ahmed Al-Rawi, “Viral News on Social Media,” Digital Journalism, vol. 7, no. 1 (2019), pp. 63-79, at https://www.tandfonline.com/doi/full/10.1080/21670811.2017.1387062.
67 Hayley Tsukayama, “Facebook’s Changing Its News Feed. How Will It Affect What You See?,” Washington Post, January 12, 2018, at https://www.washingtonpost.com/news/the-switch/wp/2018/01/12/facebooks-changing-its-news-feed-how-will-it-affect-what-you-see/; Jonah Bromwich and Matthew Haag, “Facebook Is Changing. What Does That Mean to Your News Feed?,” New York Times, January 12, 2018, at https://www.nytimes.com/2018/01/12/technology/facebook-news-feed-changes.html.
68 Ramya Sethurman, “Why Am I Seeing This? We Have An Answer For You,” Facebook Newsroom, March 31, 2019, at https://about.fb.com/news/2019/03/why-am-i-seeing-this/.
69 “About Your Twitter Timeline,” Twitter, accessed on September 29, 2020, at https://help.twitter.com/en/using-twitter/twitter-timeline.
70 “Introducing Topics,” Twitter, November 11, 2019, at https://blog.twitter.com/en_us/topics/product/2019/introducing-topics.html.
71 Some influencers may be paid by advertisers or users. Arielle Pardes, “Instagram Will (Finally) Pay Influencers,”
Wired, May 27, 2020, at https://www.wired.com/story/instagram-finally-pay-influencers-badges-igtv-ads/.
72 Steve Olenski, “7 Ways to Up Your Chances of Going Viral on Social Media,” Forbes, February 6, 2018, at
https://www.forbes.com/sites/steveolenski/2018/02/06/7-ways-to-up-your-chances-of-going-viral-on-social-media.
73 Fake, or inauthentic, accounts are profiles impersonating other individuals or organizations. An internet bot is software that runs automated computer programs over the internet, generally capable of performing simple, repetitive tasks faster than an individual can. Some websites use a “Completely Automated Public Turing test to tell Computers and Humans Apart,” or CAPTCHA, to try to identify internet bots. More information on CAPTCHA tests is available at https://www.cloudflare.com/learning/bots/how-captchas-work/.

used by social media operators to inadvertently prioritize misinformation. Users and social media
operators can recognize some internet bots based on various factors, such as the syntax used, the
user’s profile, or abnormal account activity.74 Some users may choose not to engage with this
content by not sharing or reposting it, and some social media operators remove this content.
However, bots are becoming more sophisticated, making it more difficult for users and content
moderators to recognize them, particularly if a post has already gone viral. Users may
inadvertently share or like content created or shared by an internet bot.75 Studies have indicated
that bots can contribute to the long-term spread of misinformation.76
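The recognition cues described above (abnormal posting rates, thin profiles, repetitive phrasing) can be combined into simple scoring heuristics. The Python sketch below is illustrative; every threshold is a hypothetical assumption rather than a known platform rule, and production detectors rely on far richer signals.

```python
# Crude heuristic sketch; thresholds are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Account:
    posts_per_hour: float
    profile_fields_filled: int  # e.g., photo, bio, location (0-3)
    distinct_phrasings: int     # unique wordings across recent posts
    recent_posts: int

def bot_likelihood(acct: Account) -> float:
    """Score 0 to 1 from three of the cues named in the text."""
    score = 0.0
    if acct.posts_per_hour > 20:            # abnormal account activity
        score += 0.4
    if acct.profile_fields_filled == 0:     # thin or empty profile
        score += 0.3
    if acct.recent_posts and acct.distinct_phrasings / acct.recent_posts < 0.2:
        score += 0.3                        # near-identical, templated syntax
    return score

suspect = Account(posts_per_hour=45, profile_fields_filled=0,
                  distinct_phrasings=3, recent_posts=60)
print(f"bot likelihood: {bot_likelihood(suspect):.1f}")  # 1.0
```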
Online Advertising
Social media operators have economic incentives to increase user engagement on their sites,
particularly operators that rely on online advertising revenue. These operators can increase their
revenue by amplifying content that is more likely to be shared and commented on, which could
include misinformation. As a user spends more time scrolling through posts or newsfeeds, social
media operators can expose that user to more advertisements and collect more data about the user.
This increases the likelihood that the user will click on at least one advertisement and allows
operators to build better profiles of the user’s characteristics and revealed preferences. These
advertisements are often displayed as posts, generally distinguishable through labels such as
“sponsored.”
Advertising sales are the primary source of revenue for most social media operators. In 2019,
online advertising globally provided about 98% ($70 billion) of Facebook Inc.’s annual
revenue,77 84% ($135 billion) of Google’s,78 and 87% ($3 billion) of Twitter’s.79 Facebook CEO
Mark Zuckerberg highlighted the importance of advertising in prepared remarks to the House
Committee on the Judiciary, Subcommittee on Antitrust, Commercial, and Administrative Law,
stating, “Facebook supports its mission of connecting people around the world by selling ads.”80
According to an Interactive Advertising Bureau report, revenue from advertising on social
media in the United States increased from about $2.9 billion in 2012 to $35.6 billion in 2019
(Figure 2), and is projected to continue increasing.81 Based on these data, social media made up

74 Will Knight, “How to Tell if You’re Talking to a Bot,” MIT Technology Review, July 18, 2018, at
https://www.technologyreview.com/2018/07/18/141414/how-to-tell-if-youre-talking-to-a-bot/; Ryan Detert, “Bot or
Not: Seven Ways to Detect an Online Bot,” Forbes, August 6, 2018, at https://www.forbes.com/sites/forbesagencycouncil/2018/08/06/bot-or-not-seven-ways-to-detect-an-online-bot/.
75 Kate Starbird, “Disinformation’s Spread: Bots, Trolls and All of Us,” Nature, vol. 571 (July 25, 2019), p. 449, at
https://www.nature.com/articles/d41586-019-02235-x.
76 Marina Azzimonti and Marcos Fernandes, “Social Media Networks, Fake News, and Polarization,” NBER Working
Paper 24462, March 2018, at http://www.nber.org/papers/w24462.
77 Facebook Inc. SEC Form 10-K for the year ending December 31, 2019, p. 56.
78 The percentage is calculated by dividing Google advertising revenue by Google’s revenues, not Alphabet’s
(Google’s parent company) total revenues. Alphabet Inc. SEC Form 10-K for the year ending December 31, 2019, p.
29. YouTube, owned by Google, generated roughly $15 billion in revenue.
79 Twitter Inc. SEC Form 10-K for the year ending December 31, 2019, p. 39.
80 Testimony of Mark Zuckerberg, Chief Executive Officer, Facebook Inc., in U.S. Congress, House Committee on the
Judiciary, Subcommittee on Antitrust, Commercial, and Administrative Law, Online Platforms and Market Power,
Part 6: Examining the Dominance of Amazon, Apple, Facebook, and Google, hearings, 116th Cong., 2nd sess., July 28,
2020.
81 Interactive Advertising Bureau, “Internet Advertising Revenue Report: Full Year 2019 Results & Q1 2020
Revenues,” prepared by PricewaterhouseCoopers, May 2020, p. 19, at https://www.iab.com/wp-content/uploads/2020/05/FY19-IAB-Internet-Ad-Revenue-Report_Final.pdf.
about 29% of total U.S. internet advertising revenue in 2019. eMarketer estimates that video ads
on social media will make up one-third of all U.S. digital ad spending in 2020, and projects that
spending on social media sites will increase 20.4% in 2020.82
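As a rough consistency check, a back-of-the-envelope calculation using only the numbers already cited implies the approximate size of the total 2019 U.S. internet advertising market; the implied total is an inference from this report’s figures, not a number taken directly from the IAB report.

# $35.6 billion in U.S. social media ad revenue was about 29% of
# total U.S. internet advertising revenue in 2019 (figures cited above).
social_media_revenue = 35.6      # $ billions
share_of_total = 0.29
implied_total = social_media_revenue / share_of_total
print(round(implied_total, 1))   # 122.8, i.e., roughly $123 billion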
Figure 2. Social Media Advertising Revenue
(in billions of current dollars)

Source: Interactive Advertising Bureau, “Internet Advertising Revenue Report,” May 2020, prepared by
PricewaterhouseCoopers, at https://www.iab.com/wp-content/uploads/2020/05/FY19-IAB-Internet-Ad-Revenue-
Report_Final.pdf.
Note: Revenue includes social media networking and gaming websites and apps across all devices, including
desktop computers, laptops, and mobile devices.
Collecting user data allows operators to offer different advertisements based on their potential
relevance to different users.83 The data amassed by social media operators enables them to build
complex profiles and sell advertising space targeting specific user categories to companies,
organizations, and political campaigns.84 It also gives established social media operators an
advantage over market entrants, as entrants are likely to have less user data and therefore may be
less able to help advertisers target users with precision.
Social media operators place ad spaces in a marketplace that runs an instantaneous auction with
advertisers that can place automated bids. Some operators run their own advertising marketplaces.
For example, Facebook and LinkedIn provide ad managers for businesses on their respective

82 Debra Aho Williamson, “U.S. Social Trends for 2020,” eMarketer, January 15, 2020, at https://www.emarketer.com/content/us-social-trends-for-2020.
83 Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape
Social Media, New Haven & London, Yale University Press, 2018.
84 Brian O’Connell, “How Does Facebook Make Money? Six Primary Revenue Streams,” The Street, October 23, 2018,
at https://www.thestreet.com/technology/how-does-facebook-make-money-14754098.
social media sites.85 Others, such as Twitter, partner with third-party advertising services such as
Google DoubleClick Bid Manager and The Trade Desk.86
Based on the auction results and user profiles, different users may receive different ads.87
Targeted advertising has made it possible for marketers to customize their messages and reach
potential consumers more easily and quickly, advertising products differently to each individual.88
Advertising rates can be tied to the number of users of a social media site, how much time users
spend engaging with content, and how often advertisements are viewed.89 Thus, social media
operators with large user bases and track records of high engagement may be able to charge
higher fees. According to the Interactive Advertising Bureau, in 2019, the majority of advertisers
on the internet made payments based on a performance pricing model, such as a cost-per-click or
a share of revenue.90 This could mean that social media operators following this pricing model are
unable to obtain revenue from advertisements that users do not click.
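To illustrate how an instantaneous auction and performance (cost-per-click) pricing could interact, the following is a minimal, hypothetical Python sketch. The second-price rule, bid values, and click count are illustrative assumptions; actual ad marketplaces use far more complex auction designs and pricing.

def run_auction(bids):
    """Pick the highest bidder; charge the second-highest bid
    (a common, but not universal, auction design)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

bids = {"advertiser_a": 2.50, "advertiser_b": 1.75, "advertiser_c": 0.90}
winner, cost_per_click = run_auction(bids)

# Under cost-per-click pricing, impressions that are never clicked
# generate no revenue for the operator; only clicks are billed.
clicks = 40  # illustrative
revenue = clicks * cost_per_click
print(winner, cost_per_click, revenue)  # advertiser_a 1.75 70.0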
Some social media sites allow advertisers to pay to promote their posts. For example, Facebook
allows users, including commercial entities, to “boost” a post by turning it into an advertisement
that can be spread to those who do not follow their accounts, increasing the likelihood that the
post is shared, liked, or commented on.91 Some social media sites—including Twitter and
Facebook—allow users to opt out of targeted ads.92 However, while this means that users may not
see targeted ads, it does not change the number of ads the user sees and does not ensure that a
social media operator is no longer collecting the user’s data.
Example of Misinformation and Social Media:
COVID-19 Misinformation in 2020
During 2020, in the absence of a vaccine that could inoculate individuals against the COVID-19
virus, behavioral interventions such as self-quarantining, social distancing, mask wearing, and
hand washing—plus policy interventions like testing, contact tracing, and office closures—were
implemented in efforts to slow the spread of the virus.93 These interventions rely on timely public

85 Details on advertising on Facebook and LinkedIn are available at https://www.facebook.com/business/help/200000840044554 and https://business.linkedin.com/marketing-solutions/ads, respectively.
86 Additional information on turning off targeted ads for Twitter and Facebook is available at
https://business.twitter.com/en/help/troubleshooting/how-twitter-ads-work.html and https://www.facebook.com/help/568137493302217, respectively.
87 For more information on how the digital advertising marketplace operates, see CRS In Focus IF11448, How
Consumer Data Affects Competition Through Digital Advertising, by Clare Y. Cho.
88 Todd Powers, Dorothy Advincula, Manila S. Austin, et al., “Digital and Social Media In the Purchase Decision
Process,” Journal of Advertising Research, vol. 52, no. 4 (December 2012), pp. 479-489, at
http://www.journalofadvertisingresearch.com/content/52/4/479.
89 Hayley Tsukayama, “Facebook’s Changing Its News Feed. How Will It Affect What You See?,” Washington Post,
January 12, 2018, at https://www.washingtonpost.com/news/the-switch/wp/2018/01/12/facebooks-changing-its-news-feed-how-will-it-affect-what-you-see/.
90 Interactive Advertising Bureau, “Internet Advertising Revenue Report: Full Year 2019 Results & Q1 2020
Revenues,” prepared by PricewaterhouseCoopers, May 2020, at https://www.iab.com/wp-content/uploads/2020/05/FY19-IAB-Internet-Ad-Revenue-Report_Final.pdf.
91 More information about “boosted” posts is available at https://www.facebook.com/business/help/240208966080581.
92 Todd Powers, Dorothy Advincula, Manila S. Austin, et al., “Digital and Social Media In the Purchase Decision
Process,” Journal of Advertising Research, vol. 52, no. 4 (December 2012), pp. 479-489, at
http://www.journalofadvertisingresearch.com/content/52/4/479.
93 Johannes Haushofer and C. Jessica E. Metcalf, “Which Interventions Work Best in a Pandemic?,” Science, vol. 368, no. 6495 (June 5, 2020), pp. 1063-1065, at https://doi.org/10.1126/science.abb6144.
health communication. Some collective behaviors, such as preventive measures to slow the
spread of COVID-19, are disseminated and adopted, in part, through reinforcement and
affirmation provided during social contact, including social media.94 For example, individuals
may be more or less likely to adopt mitigation measures if they see others supporting and
engaging in these measures or rejecting them online. The circulation of COVID-19 information
on social media sites that may be incomplete, inaccurate, or misleading could be detrimental to
public health and make efforts to address the pandemic or achieve public acceptance of a
vaccination more challenging.95
Public health crises typically drive people to seek information.96 In the United States, online
searches for information about COVID-19 increased dramatically following the first reported
U.S. cases in late January 2020.97 A June 2020 survey found that 55% of U.S. adults between 18
and 24 years old relied on social media, such as Facebook, YouTube, Instagram, and Twitter, for
COVID-19 information, as did 47% of 25-44 year olds, 31% of 45-64 year olds, and 21% of
individuals over 65.98
In 2020, a range of information about COVID-19, its origin, means of transmission, treatments,
and mitigation measures was disseminated through social media. Some of this information was
accurate based on the state of knowledge at the time of original publication, and some was
incomplete, inaccurate, or misleading.99 Some information that was previously believed to be
accurate was subsequently judged to be inaccurate as scientific consensus about the pandemic
evolved with new evidence.

94 Douglas Guilbeault, Joshua Becker, and Damon Centola, “Complex Contagions: A Decade in Review,” in Complex
Spreading Phenomena in Social Systems: Influence and Contagion in Real-World Social Networks, ed. Sune Lehmann
and Yong-Yeol Ahn (Cham, Switzerland: Springer, 2018), pp. 3-25. The authors note that “the properties of social
networks that have been shown to accelerate the spreading dynamics of disease diffusion—such as small world
topologies, weak ties, and scale-free degree distributions—can also be used to make inferences about the role of
networks in the domains of social and political behavior.”
95 Katherine E. Bliss and J. Stephen Morrison, The Risks of Misinformation and Vaccine Hesitancy within the Covid-19
Crisis, Center for Strategic & International Studies, commentary, September 4, 2020, at https://www.csis.org/analysis/risks-misinformation-and-vaccine-hesitancy-within-covid-19-crisis; World Health Organization, Managing the
COVID-19 Infodemic: Promoting Healthy Behaviours and Mitigating the Harm from Misinformation and
Disinformation, joint statement by WHO, UN, UNICEF, UNDP, UNESCO, UNAIDS, ITU, UN Global Pulse, and
IFRC, September 23, 2020, at https://www.who.int/news-room/detail/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation.
96 Yan Huang and Chun Yang, “A Metacognitive Approach to Reconsidering Risk Perceptions and Uncertainty:
Understand Information Seeking During COVID-19,” Science Communication, vol. I, no. 27 (August 16, 2020), at
https://journals.sagepub.com/doi/pdf/10.1177/1075547020959818.
97 Ana I. Bento, Thuy Nguyen, and Coady Wing, et al., “Evidence from Internet Search Data Shows Information-
Seeking Responses to News of Local COVID-19 Cases,” Proceedings of the National Academy of Sciences, vol. 117,
no. 21 (May 2020), at https://www.pnas.org/content/117/21/11220.
98 Katherine Ognyanova, Roy H. Perlis, and Matthew A. Baum, et al., The State of the Nation: A 50-State COVID-19
Survey, The COVID-19 Consortium for Understanding the Public’s Policy Preferences Across States, June 7, 2020, at
https://shorensteincenter.org/wp-content/uploads/2020/06/COVID19-CONSORTIUM-REPORT-JUNE2020.pdf.
99 For types of COVID-19 misinformation, see J. Scott Brennen, Felix M. Simon, and Philip N. Howard, et al., Types,
Sources, and Claims of COVID-19 Misinformation, Reuters Institute for the Study of Journalism, Factsheet, April
2020, at https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation, and An Nguyen
and Daniel Catalan-Matamoros, “Digital Mis/Disinformation and Public Engagement with Health and Science
Controversies: Fresh Perspectives from Covid-19,” Media and Communication, vol. 8, no. 2 (June 26, 2020), pp. 323-328, at https://www.cogitatiopress.com/mediaandcommunication/article/view/3352.
While personal information-seeking online can contribute to healthy behaviors by informing
decisions,100 a 2020 multinational study found that exposure to incomplete, inaccurate, or
misleading COVID-19 information demotivates individuals from seeking additional potentially
beneficial health information.101 In June 2020, the Harvard Kennedy School’s Shorenstein Center
on Media, Politics and Public Policy found that social media exposure is associated “with
misperceptions regarding basic facts about COVID-19” and “behaviors and attitudes that
potentially magnify the scale and lethality of COVID-19.”102 Exposure to inaccurate or unclear
COVID-19 information may impact the efficacy of public health campaigns. Exposure to
information on social media sites can occur both through active information seeking as well as
through passive acquisition, or incidental exposure, especially to content promoted in a social
media user’s feed.103
Misinformation may spread rapidly on social media platforms. Research using Twitter data from
2006-2017 has indicated that rumors or claims containing inaccurate information “diffuse
significantly farther, faster, deeper, and more broadly” on social media than those containing
accurate information.104 A June 2020 study by the Oxford Internet Institute found that users
shared YouTube videos containing COVID-19 misinformation nearly 20 million times between
October 2019 and June 2020, generating 71 million reactions (e.g., commenting, reposting) on
Facebook, Twitter, and Reddit.105 These figures exceed the 15 million shares and 42 million
reactions and comments generated by all YouTube videos posted during the same period by the
top five English-language news broadcasters (as measured by number of subscribers) combined.
The study examined over 1 million COVID-19 videos on YouTube, identified the videos that
YouTube had removed for containing misinformation, and tracked their dissemination. The study
found that Facebook was the most significant channel for the removed videos’ circulation,
highlighting the importance of cross-platform information dissemination.106
To address perceived COVID-19 misinformation, some social media operators have implemented
content moderation strategies, such as tagging or removing information they deem to be

100 Annette Mills and Nelly Todorova, “An Integrated Perspective on Factors Influencing Online Health-Information
Seeking Behaviours,” Australasian Conference on Information Systems (ACIS) 2016 Proceedings, December 2016, at
https://aisel.aisnet.org/acis2016/83.
101 Hye Kyung Kim, Jisoo Ahn, and Lucy Atkinson, et al., “Effects of COVID-19 Misinformation on Information
Seeking, Avoidance, and Processing: A Multicountry Comparative Study,” Science Communication, vol. I, no. 30
(September 13, 2020), at https://journals.sagepub.com/doi/10.1177/1075547020959670.
102 Aengus Bridgman, Eric Merkley, and Peter John Loewen, et al., “The Causes and Consequences of COVID-19
Misperceptions: Understanding the Role of News and Social Media,” Harvard Kennedy School Misinformation
Review, vol. 1, no. 3 (June 18, 2020), at https://misinforeview.hks.harvard.edu/article/the-causes-and-consequences-of-covid-19-misperceptions-understanding-the-role-of-news-and-social-media/.
103 Anna Sophie Kümpel, “The Issue Takes It All? Incidental News Exposure and News Engagement On Facebook,”
Digital Journalism, vol. 7, no. 2 (2019), pp. 165-186, at https://www.tandfonline.com/doi/full/10.1080/21670811.2018.1465831.
104 Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science, vol. 359, no.
6380 (March 9, 2018), pp. 1146-1151, at https://doi.org/10.1126/science.aap9559.
105 Aleksi Knuutila, Aliaksandr Herasimenka, Hubert Au, et al., COVID-Related Misinformation on YouTube: The
Spread of Misinformation Videos on Social Media and the Effectiveness of Platform Policies, Oxford Internet Institute,
University of Oxford, Computational Propaganda Project, COMPROP Data Memo 2020.6, September 21, 2020, at
https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2020/09/Knuutila-YouTube-misinfo-memo-v1.pdf.
106 Additional information on cross-platform disinformation can be found in Tom Wilson and Kate Starbird, “Cross-
Platform Disinformation Campaigns: Lessons Learned and Next Steps,” The Harvard Kennedy School Misinformation
Review, vol. 1, no. 1 (January 2020), at https://misinforeview.hks.harvard.edu/article/cross-platform-disinformation-campaigns/.
misinformation and promoting information about the pandemic from sources that they consider
reliable.107 Many social media operators updated their public-facing policies and documented the
actions that they were taking to address misinformation. On March 16, 2020, Facebook, Google,
LinkedIn, Microsoft, Reddit, Twitter, and YouTube released a joint statement that they would be
combatting fraud and misinformation about COVID-19.108 Facebook Inc. reported that from April
2020 through June 2020 it took down 7 million posts containing what it identified as
misinformation about COVID-19 from its social media sites Facebook and Instagram, as well as
putting warning notes on 98 million additional posts that were misleading but not deemed
harmful enough to remove.109 Twitter has started adding labels for claims it deems disputed or
misleading, and removing information that its moderators consider likely to lead to severe harm,
based on internal determination in consultation with “trusted partners.”110 The shift to automated
content moderation using machine learning and artificial intelligence tools at Facebook, Google,
and Twitter during the COVID-19 pandemic may have led to some illegal material (e.g., sexually
explicit content, content that violates copyright law) remaining online in certain areas and
unproblematic content being taken down.111
Some social media operators started prioritizing COVID-19 information from recognized health
authorities. On March 18, 2020, Facebook launched a COVID-19 Information Center, which
provides real-time updates from national health authorities, such as the Centers for Disease
Control and Prevention, and global organizations. When the COVID-19 Information Center
launched, Facebook featured it at the top of users’ news feeds.112 YouTube is working to raise the
profile of sources of information it deems authoritative across its site, including on its home page
and in search results.113
These efforts reflect recent attempts by some social media operators to prioritize content about
the COVID-19 pandemic that they deem authoritative to counter perceived misinformation.
Currently, each social media operator develops and institutes content moderation policies tailored
to what it determines to be the needs of its individual services. The development and
application of content moderation policies are strictly the purview of each social media operator,
and the policies therefore differ widely in scope and operation. Members of Congress have expressed a range
of views about the discretionary nature of the development and application of these policies.
Some Members have argued in hearings that they are developed opaquely and applied arbitrarily,

107 Evelyn Douek, “COVID-19 and Social Media Content Moderation,” Lawfare, March 25, 2020, at
https://www.lawfareblog.com/covid-19-and-social-media-content-moderation.
108 Microsoft owns LinkedIn; Google and YouTube have the same parent company, Alphabet. Catherine Shu and
Jonathan Shieber, “Facebook, Reddit, Google, LinkedIn, Microsoft, Twitter, and YouTube Issue Joint Statement on
Misinformation,” TechCrunch, March 16, 2020, at https://techcrunch.com/2020/03/16/facebook-reddit-google-linkedin-microsoft-twitter-and-youtube-issue-joint-statement-on-misinformation/.
109 Rachel Lerman, “Facebook Says It Has Taken Down 7 Million Posts for Spreading Coronavirus Misinformation,”
Washington Post, August 11, 2020, at https://www.washingtonpost.com/technology/2020/08/11/facebook-covid-misinformation-takedowns/.
110 Yoel Roth and Nick Pickles, “Updating Our Approach to Misleading Information,” Twitter, May 11, 2020, at
https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html.
111 Mark Scott and Laura Kayali, “What Happened When Machines Took Over Social Media,” Politico Pro, October
19, 2020, at https://subscriber.politicopro.com/article/2020/10/what-happened-when-machines-took-over-social-media-2010676.
112 Kang-Xing Jin, “Keeping People Safe and Informed About the Coronavirus,” Facebook, August 19, 2020, at
https://about.fb.com/news/2020/08/coronavirus/. Facebook’s COVID-19 Information Center can be accessed at
https://www.facebook.com/coronavirus_info/.
113 YouTube Team, “Expanding Fact Checks on YouTube to the United States,” YouTube, April 28, 2020, at
https://blog.youtube/news-and-events/expanding-fact-checks-on-youtube-to-united-states.
others claim that some social media operators do not act quickly or decisively enough to
moderate potential misinformation, while still others find that they are overly zealous in
moderating certain content and engage in censorship.114
Context for Congressional Consideration
Companies that provide content, applications, and services over the internet, including social
media operators, are generally not regulated by most federal agencies.115 However, there are laws
and regulations that apply to specific internet content, and federal agencies can hold individuals
and companies accountable for violating them.116 Although the Federal Communications
Commission (FCC) currently classifies broadband internet access service as an information
service, subjecting these service providers to a regulatory framework,117 it does not
regulate internet applications or content.118 Efforts by some Members of Congress to address their
concerns about social media operators’ content moderation practices—ranging from operators not
doing enough to mitigate the spread of misinformation to operators censoring speech—have
focused on revising Section 230.
Currently, social media operators will likely fall within the definition of interactive computer
services in Section 230(f)(2), which includes any “information service, system, or access software
provider that provides or enables computer access by multiple users to a computer server.” Thus,
they may be protected from liability for publishing, and in some instances removing or restricting
access to, another person’s content. Additionally, social media operators could be exercising
constitutionally protected rights when they moderate content.119

114 U.S. Congress, House Committee on Energy and Commerce, Subcommittee on Communications and Technology
and Subcommittee on Consumer Protection and Commerce, Online Disinformation, 116th Cong., 2nd sess., June 24,
2020; U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on
Communications, Technology, Innovation, and the Internet, Online Censorship Reform, 116th Cong., 2nd sess., July 28,
2020; U.S. Congress, House Committee on the Judiciary, Subcommittee on Antitrust, Commercial, and Administrative
Law, Online Platforms and Market Power: Amazon, Facebook, Google, and Apple, 116th Cong., 2nd
sess., July 29, 2020; U.S. Congress, House Permanent Select Committee on Intelligence, Misinformation, 116th Cong.,
2nd sess., October 15, 2020; U.S. Congress, Senate Committee on Commerce, Science, and Transportation, Big Tech
Company’s Liability Shield, 116th Cong., 2nd sess., October 28, 2020.
115 CRS Legal Sidebar LSB10309, Regulating Big Tech: Legal Implications, coordinated by Valerie C. Brannon.
116 For example, the Federal Trade Commission (FTC), which is authorized in 15 U.S.C. §45 to protect consumers from
deceptive and unfair acts or practices in or affecting commerce, has conducted investigations and filed charges against
companies for conducting deceptive practices on the internet. The FTC also regulates operators of commercial websites
and online services directed to children under 13, such as ensuring parental consent is obtained, as required by the
Children’s Online Privacy Protection Act (15 U.S.C. §§6501-6506).
117 In its 2015 Open Internet Order, the FCC stated it would not regulate individuals and corporate entities providing
content on the internet. There has been considerable debate as to how extensive the FCC’s regulatory authority over
broadband internet access service should be, specifically whether it should be classified as an information service and
regulated under Title I or a telecommunications service and regulated under Title II of the 1934 Communications Act.
More information is available in CRS Report R40616, The Net Neutrality Debate: Access to Broadband Networks, by
Angele A. Gilroy.
118 Federal Communications Commission, In the Matter of Protecting and Promoting the Open Internet, paragraph
382, GN Docket No. 14-28, released March 12, 2015. Although the FCC opened a rulemaking to clarify the meaning of
Section 230, thus far, the FCC has not taken any measures to suggest that it will start regulating internet content on a
regular basis.
119 CRS Report R45650, Free Speech and the Regulation of Social Media Content, by Valerie C. Brannon.
Federal Proposals to Amend Section 230
On May 28, 2020, President Trump issued an executive order instructing federal agencies to take
certain actions with respect to Section 230, such as clarifying the scope of the immunity provision
for online platforms.120 In accordance with the executive order, the National Telecommunications
and Information Administration (NTIA), acting on behalf of the Secretary of Commerce, filed a
petition with the FCC on July 27, 2020, requesting a rulemaking to clarify provisions of Section 230,
including the circumstances under which an interactive computer service restricting access to
content would not receive immunity.121 The use of the phrase “restricting access” in the executive
order and NTIA petition mirrors the original language used in Section 230 that covers content
moderation. In addition, on September 23, 2020, the Department of Justice sent draft legislation
to Congress to reform Section 230 by narrowing the scope of liability protection.122 On October
15, 2020, FCC Chairman Ajit Pai released a statement that the FCC would move forward
with rulemaking to clarify the meaning of Section 230, after the FCC’s general counsel concluded
that the FCC has the legal authority to interpret the provision.123 In January 2021, however,
Chairman Pai stated that he would not pursue the rulemaking during the remainder of his tenure.124
In the 116th Congress, several bills were introduced to amend Section 230, primarily to clarify the
liability protections interactive computer services receive for hosting or removing specific types
of content (see Table B-1 in Appendix B), in addition to legislation focused, in part, on
addressing COVID-19 misinformation (see Table B-2 in Appendix B). Some proposals to amend
Section 230 would have narrowed the scope of liability protection, such as protecting only the
removal of certain, specified categories of content. Other legislation would have allowed social
media operators to be held liable for not removing objectionable content under certain conditions
or in a timely fashion.
Members of the 117th Congress may reintroduce bills from the 116th Congress or introduce new
bills to amend Section 230. When this report was published, one bill to amend Section 230 had
been introduced in the 117th Congress: H.R. 285.
Commentary from Stakeholders on Amending Section 230
Some stakeholders, including academics and researchers, have provided various
justifications for amending Section 230,125 including censorship concerns due to the market

120 White House, “Executive Order on Preventing Online Censorship,” Executive Order 13925, May 28, 2020, at
https://trumpwhitehouse.archives.gov/presidential-actions/executive-order-preventing-online-censorship/. For more
information on Section 230 and the executive order, see CRS Legal Sidebar LSB10484, UPDATE: Section 230 and the
Executive Order on Preventing Online Censorship, by Valerie C. Brannon et al.
121 National Telecommunications and Information Administration, In the Matter of Section 230 of the Communications
Act of 1934, July 27, 2020, at https://www.ntia.gov/files/ntia/publications/ntia_petition_for_rulemaking_7.27.20.pdf.
122 “Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996,” Department of
Justice Press Release, September 23, 2020, at https://www.justice.gov/opa/pr/justice-department-unveils-proposed-section-230-legislation; Department of Justice, “Department of Justice’s Review of Section 230 of the Communications
Decency Act of 1996,” at https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996.
123 “Statement of Chairman Pai on Section 230,” Federal Communications Commission, October 15, 2020, at
https://docs.fcc.gov/public/attachments/DOC-367567A1.pdf.
124 Emily Birnbaum, “Ajit Pai is Distancing Himself From President Trump,” Protocol, January 7, 2021, at
https://www.protocol.com/ajit-pai-distancing-trump.
125 For a summary of various Section 230 reform proposals, see Paul M. Barrett, Regulating Social Media: The Fight Over Section 230—and Beyond, New York University Stern Center for Business and Human Rights, September 2020, at https://bhr.stern.nyu.edu/s/NYU-Section-230_FINAL-ONLINE-UPDATED_Sept-8.pdf.
dominance of major technology firms and their role as gatekeepers to other media,126 and
concerns that judicial interpretations can leave “victims of online abuse with no leverage against
site operators whose business models facilitate abuse.”127 Others highlight the general lack of
transparency that surrounds social media operators’ content moderation decisions.128 A 2018
Georgetown Law Technology Review article recommends pairing Section 230 liability protections
with new public obligations for social media operators, including transparency and moderation
standards, advisory oversight from regulators, and regular legislative review of Section 230.129
Others have expressed skepticism about legislative changes to Section 230 intended to either
expand or restrict social media operators’ content moderation practices.130 Amending Section 230
to encourage moderation of misinformation and other objectionable content, or to limit the
liability protections afforded interactive computer services for removing content, could affect all
interactive computer services (e.g., search engines, internet service providers, video sharing sites,
website comment sections) and their users, unless new legislative language explicitly specifies a
subset of interactive computer services and users. Therefore, some stakeholders assert that
legislative action in either direction may have unintended consequences. For example, social
media operators may adjust their content moderation practices, ranging from aggressively
screening content to not moderating any legal content, including content that may be considered

126 Rachel Bovard, “The FCC Should Address Distortions of Section 230,” Federalist Society, September 22, 2020, at
https://fedsoc.org/commentary/fedsoc-blog/the-fcc-should-address-distortions-of-section-230; Craig Parshall, “Big
Tech and The Whole First Amendment,” Federalist Society, August 14, 2020, at https://fedsoc.org/commentary/fedsoc-blog/big-tech-and-the-whole-first-amendment; Craig Parshall and Jon Schweppe, Protecting Free Speech and
Defending Kids: A Proposal to Amend Section 230, American Principles Project, June 2020, at
https://americanprinciplesproject.org/wp-content/uploads/2020/07/APP-Sec230-paper.pdf; Jonathan Tepper,
“Facebook and Google Must Be Regulated Now,” The American Conservative, May 13, 2019, at
https://www.theamericanconservative.com/articles/facebook-and-google-must-be-regulated-now/.
127 Danielle Keats Citron and Benjamin Wittes, “The Internet Will Not Break: Denying Bad Samaritans § 230
Immunity,” Fordham Law Review, vol. 86, no. 2 (2017), at https://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=5435&context=flr.
128 Joan Donovan, “Why Social Media Can’t Keep Moderating Content in the Shadows,” MIT Technology Review,
November 6, 2020, at https://www.technologyreview.com/2020/11/06/1011769/social-media-moderation-transparency-censorship/; Kate Klonick, “The New Governors: The People, Rules, and Processes Governing Online Speech,”
Harvard Law Review, vol. 131 (2018), pp. 1598-1670, at https://harvardlawreview.org/wp-content/uploads/2018/04/1598-1670_Online.pdf; Mark MacCarthy, Transparency Requirements for Digital Social Media Platforms:
Recommendations for Policy Makers and Industry, Transatlantic Working Group on Content Moderation Online and
Freedom of Expression, February 12, 2020, at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3615726.
129 Tarleton Gillespie, “Platforms Are Not Intermediaries,” Georgetown Law Technology Review, vol. 2, no. 2 (2018),
pp. 198-216, at https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf.
130 Matthew Feeney, “Leave Section 230 Alone,” commentary, Cato Institute, October 29, 2020, at
https://www.cato.org/publications/commentary/leave-section-230-alone; Elliot Harmon, “Don’t Blame Section 230 for
Big Tech’s Failures. Blame Big Tech.,” Electronic Frontier Foundation, November 16, 2020, at https://www.eff.org/deeplinks/2020/11/dont-blame-section-230-big-techs-failures-blame-big-tech; Jennifer Huddleston, “The FCC Should
Not Engage in Section 230 Rulemaking,” The Federalist Society, October 6, 2020, at https://fedsoc.org/commentary/fedsoc-blog/the-fcc-should-not-engage-in-section-230-rulemaking; Daniel Lyons, “Beyond 230: Reframing the
conservative debate over social media regulation,” AEIdeas, American Enterprise Institute, November 4, 2020, at
https://www.aei.org/technology-and-innovation/beyond-230-reframing-the-conservative-debate-over-social-media-regulation/; Kate Ruane, “Dear Congress: Platform Accountability Should Not Threaten Online Expression,” News &
Commentary, American Civil Liberties Union, October 27, 2020, at https://www.aclu.org/news/free-speech/dear-congress-platform-accountability-should-not-threaten-online-expression/; Adam Thierer and Neil Alan Chilson, FCC’s
O’Rielly on First Amendment & Fairness Doctrine Dangers, The Federalist Society, FEDSOC Blog, at
https://fedsoc.org/commentary/fedsoc-blog/fcc-s-o-rielly-on-first-amendment-fairness-doctrine-dangers.
objectionable or obscene to most users. Increased exposure to liability may also threaten
competition, as start-up firms may not have the resources to address legal challenges.131
Several stakeholders propose the establishment of a new federal agency to provide regulatory
oversight of social media operators, promote competition, and protect consumer data privacy.132
Others have examined the broader regulatory and legal landscape shaping the current social
media platform content moderation debate beyond Section 230. A 2019 essay published by the
Hoover Institution acknowledges that the private companies that operate interactive computer
services currently hold a great deal of control over speech on their platforms, and notes that the
First Amendment may protect their moderation decisions.133 It proposes several potential
solutions, although it notes that many of these are untested and would face legal scrutiny,
depending on how they are designed. The proposed solutions include defining rules for operators
based on size and reach; allowing users to customize algorithmic filtering or curation settings;
and opening the raw, unsorted, and uncurated content feeds of dominant platforms to allow others
to build customizable services that users may choose based on their content preferences.134
Considerations for Congress
Among the overarching questions regarding misinformation and content moderation practices on
social media are the following:
 Should Congress or the Executive Branch take action to address misinformation
or content regulation?
 Is action necessary to reduce the spread of misinformation or to prevent
censorship?
 If action to address the spread of misinformation and prevent censorship is
deemed necessary, which institutions, public and private, should bear
responsibility for it?
 Who defines misinformation, how, for what purpose, and under what authority?
While Congress may choose not to take any action to address social media operators’ content
moderation practices, if it chooses to act, there is a range of potential legislative actions it could
take, from legislation designed to support existing practices to regulation of social media
operators.

131 Jeff Kosseff, “The Gradual Erosion of the Law That Shaped the Internet,” Columbia Science & Technology Law
Review, vol. 18, no. 1 (2016), pp. 1-41, at https://heinonline.org/HOL/Page?handle=hein.journals/cstlr18&collection=journals&id=1&startid=&endid=41.
132 Paul M. Barrett, Regulating Social Media: The Fight Over Section 230—and Beyond, New York University Stern
Center for Business and Human Rights, September 2020, at https://bhr.stern.nyu.edu/s/NYU-Section-230_FINAL-ONLINE-UPDATED_Sept-8.pdf; Tom Wheeler, Phil Verveer, and Gene Kimmelman, New Digital Realities, New
Oversight Solutions in the U.S.: The Case for a Digital Platform Agency and a New Approach to Regulatory Oversight,
Harvard Kennedy School Shorenstein Center on Media, Politics and Public Policy, August 2020, at
https://shorensteincenter.org/wp-content/uploads/2020/08/New-Digital-Realities_August-2020.pdf.
133 Daphne Keller, “Who Do You Sue? State and Platform Hybrid Power Over Online Speech,” Hoover Institution
Aegis Series Paper No. 1902, January 29, 2019, at https://assets.documentcloud.org/documents/5735692/Who-Do-You-Sue-State-and-Platform-Hybrid-Power.pdf.
134 Ibid.
Potential Legislative Actions
Congress may decide, possibly in light of free speech concerns, that no legislative action should
be undertaken either to restrict certain types of content or to require private sector actors to carry
content. Social media operators could adjust their own moderation policies and voluntarily
address the spread of misinformation. Social media operators regularly refine their algorithms to
adjust which content they prioritize and moderate. Absent additional regulation, social media
operators may or may not adjust their operations to curtail the spread of what they deem
misinformation in response to their users, advertisers, government bodies, and other external
stakeholders.
Some social media operators may develop tailored approaches to prevent the spread of
misinformation on their sites. However, each operator’s approach may vary in scope and efficacy,
potentially achieving success in some cases while failing in others. These efforts may also be
unevenly applied, resulting in the circulation of misinformation as content moves across
platforms that employ uncoordinated approaches to dealing with content. Congress could
consider whether it could take complementary actions, such as requiring some or all social media
operators to regularly publish detailed content moderation transparency reports (similar to or
beyond what some operators already do voluntarily). This may encourage a positive balance
between the speech rights of users and social media operators. One action that has been proposed
is to mandate that social media users disclose their identity. If Congress decides to pursue similar
measures, it could weigh prospective benefits against the potential privacy implications and the
possible effects on speech. Such measures may help address inauthentic online behavior and the
spread of perceived misinformation by bots, but may not address its spread by other users.
Congress may consider whether the prevalence of misinformation on social media platforms is
sufficiently detrimental to public well-being to warrant legislative action, given the large role that
platforms play in hosting speech and information exchange among hundreds of millions of
Americans. However, any legislation that attempts to formally define misinformation, as distinct
from other forms of speech, may be contested.
Amending Section 230 to address misinformation on interactive computer services—either to
increase or limit moderation—could affect not only social media platforms but also many other
types of entities, potentially including search engines, internet service providers, video sharing
sites, dating sites, travel sites, and the comment sections of websites.135 If Congress intends any
changes to Section 230 to apply only to social media platforms, it may need to develop a
definition of “social media platforms” that distinguishes these platforms from other interactive
computer services and that seeks to prevent circumvention of the definition through
nominal changes in the way individual firms operate their businesses.
Congress may choose to regulate social media companies’ content moderation practices,
particularly if it believes these companies will not alter their practices in response to pressure
from users and competitors. If there were numerous social media sites that were considered to be
interchangeable, users displeased with the types of content allowed or suppressed on one site
would be able to move to another site. However, large social media operators may be considered
natural monopolies that benefit from incumbency advantages, including network effects and
economies of scale, that make it difficult for new firms to enter and compete in the market,
limiting the number of social media sites users can choose from. Although the initial fixed cost of

135 47 U.S.C. §230(f)(2) defines an interactive computer service as “any information service, system, or access software
provider that provides or enables computer access by multiple users to a computer server, including specifically a
service or system that provides access to the Internet and such systems operated or services offered by libraries or
educational institutions.”
creating a website is low, developing the underlying infrastructure—such as systems to moderate
content and to collect, process, and store user data—and obtaining enough users to benefit from
network effects can be costly and create natural barriers to entry. In addition, users may be
unwilling to join more than a certain number of social media sites, and the amount of time each
user can spend on a site is naturally constrained by other activities. As social media operators
compete for more users and their time by offering new features and other amenities, a few
operators may eventually dominate.
Historically, some natural monopolies have been considered public utilities and regulated as such,
often through the establishment of common carriage rules and of federal and state regulatory
agencies that act on the public’s behalf. Similarly, federal regulatory
oversight of social media operators could be established through the creation of a federal agency,
commission, federal agency program, interagency activity, or program at a current agency. If it
were to pursue this course, Congress would need to specify the entity’s jurisdiction, specific
objectives, and the authorities it would exercise. These could include standards for content
moderation practices and user privacy. The entity may be required to establish an appeals process
and potential remedies for individuals and entities who feel that regulations and laws have been
misapplied and that they have suffered harm as a result. Congress may also consider the effects—
intended or otherwise—that public-sector action may have on the general availability of
information, its quality, public safety, speech rights, competition, and privacy.
There may be concerns about whether any entity tasked with addressing misinformation
adequately represents the diverse population of the United States and its interests, whether it
balances the equities of relevant public and private stakeholders and citizens, and whether it
adequately balances the public interest need to minimize the negative effects of misinformation
with the protection of First Amendment rights and other civil liberties.
Antitrust actions to break up the largest social media operators and promote more venues for
speech might increase the number of social media sites offered to users, which could result in
operators competing on content moderation practices. However, it is unclear whether, absent any other
changes, increasing the number of social media outlets will address the spread of misinformation.
In a market with a larger number of operators, social media platforms may develop content
moderation policies with varying approaches to defining and moderating misinformation to
distinguish themselves from competitors; what may be considered misinformation on one
platform may not be on another. Antitrust actions could be accompanied by legislative actions,
such as requiring certain content to be moderated or not moderated, mandating content moderation
transparency reports, and requiring disclosures of inauthentic behavior. Nevertheless, a limited number of
operators may continue to dominate, particularly if the social media market is susceptible to
natural monopolies.
Congress may choose to direct a federal entity to engage in advisory rather than regulatory
actions. Such activities could include conducting or commissioning formal studies to identify the
scale and scope of misinformation spread through social media, developing interagency plans to
address misinformation, supporting authoritative information sources that social media operators
could voluntarily link to, and engaging with the private sector to establish content moderation
transparency and reporting guidelines.
Concluding Thoughts
If Congress chooses to address the spread of misinformation on social media or content
moderation practices generally, it might consider the intended scope of proposed actions, under
what conditions they would be applied, and the range of potential legal, social, and economic
consequences, both intended and unintended, that may result. It might consider whether any
action that it takes imposes costs, monetary or otherwise, that further entrench the market
power of incumbent operators. It might also consider how U.S. actions, such as regulating social
media companies’ content moderation practices, would fit within an international legal
framework. Major social media operators are multinational corporations, and the internet
provides access to their websites worldwide, unless governments erect firewalls to block access.
Crafting legislation to address the activities of U.S.-based social media sites in other countries
may be difficult, particularly if another country seeks to impose obligations that are in conflict
with U.S. law. Conversely, it may not be possible for U.S. legislation to regulate the internal
activities—such as algorithms or content moderation practices—of foreign-based social media
platforms.
Appendix A. Social Media Definitions
This report considers social media to include online sites that allow users to access interactive
services, create and engage with content, and connect with other users; the networks of social
media users associated with specific sites; the software and hardware infrastructures that enable
the provision and operation of social media sites and their interoperation with external data and
services; and the structures and policies of corporations governing the social media sites and
infrastructures they operate.
Social Media Site
This report defines a social media site as an internet-based interface that allows users to develop
individual and group profiles; make, share, view, and interact with content; connect with other
users; and join affinity groups.136 Users post and access content through a website or application
on a computer or mobile device. Many interactive computer services, such as those designed to
allow users to arrange dates, provide travel information, or offer recommendations about
businesses or professional services, share some but not all of these characteristics, and are not
considered social media sites for purposes of this report.
Social Media User
Social media users are individuals who have registered an account with at least one social media
site. As of January 2020, there were an estimated 3.8 billion social media users globally out of an
estimated 4.5 billion internet users.137 The research firm Datareportal estimates that 70% of the
total U.S. population were active social media users as of the start of 2020, based on the reported
potential advertising reach of social media platforms.138
Social Media Algorithm
Social media operators use algorithms to tailor some of what each user sees at a particular time on their sites. These algorithms predict the relevance of content to specific users based on past user behavior and other factors. Algorithms help social media operators sort the massive amount of content that users post and prioritize content for dissemination based on estimated relevance. Each social media operator determines relevance differently, based on the user and usage data it collects and weighs. The data include contacts and interactions with contacts, specific content read or watched, the amount of time spent reading and watching specific content, specific content liked and shared, and subscriptions to topical or thematic content categories and groups. As data collection grows, social media operators continually refine and adjust their algorithms. Operators are able to sell narrowly targeted advertising based on their ability to reach specific users.
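
As a concrete illustration of how such relevance estimation can work, the Python sketch below ranks candidate posts for a user by combining weighted engagement signals of the kinds listed above. The signal names, weights, and scoring formula are hypothetical assumptions made for illustration only; actual operators' ranking models are proprietary and far more complex.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        post_id: str
        author_id: str
        topic: str

    @dataclass
    class UserProfile:
        # Hypothetical per-user signals of the kinds described above.
        contact_ids: set = field(default_factory=set)      # accounts the user interacts with
        watch_seconds: dict = field(default_factory=dict)  # topic -> time spent viewing
        liked_topics: dict = field(default_factory=dict)   # topic -> count of likes and shares
        subscriptions: set = field(default_factory=set)    # groups the user has joined

    def relevance_score(user: UserProfile, post: Post) -> float:
        """Combine weighted engagement signals into a single relevance estimate.
        The weights are illustrative, not any operator's actual values."""
        score = 0.0
        if post.author_id in user.contact_ids:
            score += 3.0                                       # network ties
        score += 0.01 * user.watch_seconds.get(post.topic, 0) # time spent on topic
        score += 0.5 * user.liked_topics.get(post.topic, 0)   # likes and shares
        if post.topic in user.subscriptions:
            score += 2.0                                       # explicit interest
        return score

    def rank_feed(user: UserProfile, candidates: list) -> list:
        """Order candidate posts so the highest-scoring content appears first."""
        return sorted(candidates, key=lambda p: relevance_score(user, p), reverse=True)

Because signals such as watch time and likes feed back into future scores, content that captures attention, which can include misinformation, tends to be surfaced again in later ranking cycles.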

136 Jonathan Obar and Steve Wildman, “Social Media Definition and the Governance Challenge: An Introduction to the Special Issue,” Telecommunications Policy, vol. 39, no. 9 (2015), pp. 745-750.
137 Simon Kemp, Digital 2020: Global Digital Overview, Datareportal, January 30, 2020, at https://datareportal.com/reports/digital-2020-global-digital-overview.
138 Simon Kemp, Digital 2020: The United States of America, Datareportal, February 11, 2020, at https://datareportal.com/reports/digital-2020-united-states-of-america.
Social Media Platform
The term social media platform refers to the technical infrastructure of social media that, in addition to allowing users to post and interact with content and establish social networks, enables connection to other sites, applications, and data, and allows third-party developers to build applications and services that integrate with the platform.139 Application programming interfaces (APIs) regulate and facilitate data exchange between applications, making “a website programmable by offering structured access to its data and functionality and turn[ing] it into a platform that others can build on.”140
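
To make the idea of structured access concrete, the short Python sketch below queries a hypothetical platform API for a user's public posts. The base URL, endpoint path, parameters, and bearer-token authentication are invented for illustration; each real platform defines its own API surface, authentication scheme, and rate limits.

    import requests

    # Hypothetical endpoint; real platforms each publish their own API surface.
    BASE_URL = "https://api.example-social.invalid/v1"

    def fetch_public_posts(user_id: str, api_token: str, limit: int = 10) -> dict:
        """Request a user's public posts as structured JSON from the platform.
        The path, parameters, and auth header are illustrative assumptions."""
        response = requests.get(
            f"{BASE_URL}/users/{user_id}/posts",
            params={"limit": limit, "visibility": "public"},
            headers={"Authorization": f"Bearer {api_token}"},
            timeout=10,
        )
        response.raise_for_status()  # surface HTTP errors to the caller
        return response.json()       # structured data a third party can build on

It is this kind of programmatic, structured exchange, rather than scraping a website's pages, that lets third-party developers integrate their own applications and services with a platform's data and functionality.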
Social Media Enabling Infrastructure
Social media enabling infrastructure consists of the distributed architecture of hardware and software that enables the provision of social media sites. This infrastructure may be owned by social media operators or by third-party providers; it includes data centers containing the computer systems that serve, store, and process data, and telecommunication systems that aid the flow of information to, from, and within a social media network. This infrastructure enables social media operators to host content; provide content recommendations; deliver content to users with minimal delay; and store, mine, and share user and partner data. Social media operators rely heavily on public and private telecommunication networks to provide content and exchange data with end users.
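
One reason this infrastructure can deliver content with minimal delay is that popular content is cached close to users rather than fetched from a distant data center on every request. The sketch below is a deliberately simplified model, with an invented EdgeCache class and toy capacity, of the least-recently-used caching idea that content delivery layers commonly build on.

    from collections import OrderedDict

    class EdgeCache:
        """Toy LRU cache standing in for an edge node that serves popular
        content without a round trip to the origin data center."""
        def __init__(self, capacity: int = 2):
            self.capacity = capacity
            self.store = OrderedDict()

        def get(self, content_id: str, origin: dict) -> str:
            if content_id in self.store:
                self.store.move_to_end(content_id)   # refresh recency
                return self.store[content_id]        # cache hit: low latency
            blob = origin[content_id]                # cache miss: fetch from origin
            self.store[content_id] = blob
            if len(self.store) > self.capacity:
                self.store.popitem(last=False)       # evict least recently used
            return blob

    # Usage: the second request for "v1" is served from the edge.
    origin = {"v1": "cat video", "v2": "news clip", "v3": "live stream"}
    cache = EdgeCache(capacity=2)
    cache.get("v1", origin)   # miss: pulled from the origin data center
    cache.get("v1", origin)   # hit: served locally with minimal delay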
Social Media Operator
Social media operators are the companies that operate social media sites. For example, the top nine social media sites, as ranked by the percentage of U.S. adults who reported using them in a June 2019 survey conducted by the Pew Research Center,141 were YouTube, Facebook, Instagram, Pinterest, LinkedIn, Snapchat, Twitter, WhatsApp, and Reddit. Each is operated by a corporate entity headquartered in the United States. Alphabet Inc., parent of Google LLC, owns YouTube. Facebook Inc. owns its namesake service, as well as Instagram and WhatsApp. Pinterest Inc., Twitter Inc., and Reddit Inc. operate their respective namesake services. The Microsoft Corporation owns LinkedIn. Snap Inc. operates Snapchat. Each of these companies is publicly traded, with the exception of Reddit Inc., which is privately held.

139 For additional information on the definition of social media platforms, see L. DeNardis and A.M. Hackl, “Internet governance by social media platforms,” Telecommunications Policy, vol. 39, no. 9 (October 2015), pp. 761-770, at https://www.sciencedirect.com/science/article/pii/S0308596115000592; Tarleton Gillespie, “The Politics of ‘Platforms’,” New Media & Society, vol. 12, no. 3 (May 1, 2010), pp. 347-364, at https://doi.org/10.1177/1461444809342738.
140 Anne Helmond, “The Platformization of the Web: Making Web Data Platform Ready,” Social Media + Society, July 2015, at https://journals.sagepub.com/doi/full/10.1177/2056305115603080.
141 Pew Research Center, Social Media Fact Sheet, June 12, 2019, at https://www.pewresearch.org/internet/fact-sheet/social-media/.
Appendix B. Section 230 and COVID-19 Misinformation Legislation

Table B-1. Selected Legislation on Section 230 Introduced in the 116th Congress
H.R. 4027, Stop the Censorship Act: Would have amended Section 230(c) to limit the scope of liability protection for restricting access to only content that is unlawful.

H.R. 4232, Protecting Local Authority and Neighborhoods Act: Would have amended Section 230(c) to state that the bill would not affect enforcement of laws related to leasing and renting property.

H.R. 492, Biased Algorithm Deterrence Act of 2019: Would have amended Section 230(c) to remove liability protection from social media services if the service or its algorithm does any of the following: (1) displays user-generated content in an order that is not chronological; (2) delays the display of such content relative to other content; or (3) hinders the display of such content for reasons other than to carry out the user’s direction or to restrict material that the provider or user considers obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.

H.R. 7808, Stop the Censorship Act of 2020: Would have amended Section 230(c) to limit the scope of liability protection for restricting access to content that is unlawful or promotes violence or terrorism, rather than objectionable content.

H.R. 8454, Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2020: Would have amended Section 230(e) to state that the bill would not affect enforcement of child sexual exploitation laws and would have protected interactive computer service providers from liability for certain encryption technologies.

H.R. 8515, Don’t Push My Buttons Act: Would have amended Section 230(c) to remove liability protection from interactive computer services that collect information about users’ habits, preferences, or beliefs and that use an automated function to deliver content to the user based on the information collected about each user.

H.R. 8517, Protect Speech Act: Would have amended Section 230(c) to provide liability protection for interactive computer services that restrict access to content that (1) is obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, promoting self-harm, or unlawful; or (2) violates the applicable terms of service or use. The liability protections would not have applied to other actions taken by interactive computer services. The bill also specified instances in which a person or entity could be held liable for information provided by another person or entity.

H.R. 8596, Limiting Section 230 Immunity to Good Samaritans Act: Would have amended Section 230(c) to provide liability protection only if interactive computer services adopt and maintain terms of service that describe any policies related to restricting access to material. The provider would also have been required to design and operate the terms of service in “good faith,” or with fair dealing standards without fraudulent intent.
H.R. 8636, Protecting Americans from Dangerous Algorithms Act: Would have amended Section 230(c) to remove liability protection from interactive computer services that use algorithms or other computational processes to rank or alter the delivery or display of information, except for information sorted chronologically, alphabetically, by user rating, or randomly.

H.R. 8719, Curbing Abuse and Saving Expression in Technology (CASE-IT) Act: Would have amended Section 230(c) to remove liability protection from interactive computer services that create, develop, post, or materially contribute to illegal content, or that induce another person to do so. The bill would also have removed liability protection from interactive computer services that knowingly permit or facilitate certain contact between adults and minors and content that is indecent, obscene, or otherwise harmful to minors.

H.R. 8896, Abandoning Online Censorship (AOC) Act: Would have repealed Section 230.

H.R. 8922, Break Up Big Tech Act of 2020: Would have amended Section 230(c) to remove liability protection from interactive computer services that (1) sell advertising based on users’ personal characteristics, (2) place items or facilitate the placement of items into the stream of commerce, (3) collect data for commercial purposes, or (4) use a design that addicts users to the service. The bill would also have removed liability protections from social media services that display user-generated content in an order other than chronological order.

S. 1914, Ending Support for Internet Censorship Act: Would have amended Section 230(c) to provide liability protection only if the interactive computer service receives an immunity certification from the Federal Trade Commission. To receive the certification, the service would have been required to prove that it does not moderate information in a politically biased manner.

S. 3398, Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act of 2020: Would have amended Section 230(e) to remove liability protections of online service providers regarding claims alleging violations of child sexual exploitation laws.

S. 3983, Limiting Section 230 Immunity to Good Samaritans Act: Would have amended Section 230(c) to provide liability protection only if the interactive computer service adopts and maintains terms of service that describe any policies related to restricting access to material. The service would have been required to design and operate the terms of service in “good faith,” or with fair dealing standards without fraudulent intent.

S. 4062, Stopping Big Tech’s Censorship Act: Would have amended Section 230(c) to provide liability protection only for interactive computer services that take reasonable steps to prevent or address the unlawful use or publication of information. The bill would have removed liability protection from interactive computer services that restrict access to content unless the action is taken in a viewpoint-neutral manner, limits only the time, place, or manner in which the material is available, and is supported by a compelling reason for restricting access. The bill would also have required that interactive computer services clearly explain the practices used to restrict access.

S. 4066, Platform Accountability and Consumer Transparency (PACT) Act: Would have amended Section 230(c) to include an intermediary liability standard triggered by notification of illegal content or activity.
S. 4337, Behavioral Advertising Decisions Are Downgrading Services (BAD ADS) Act: Would have amended Section 230(c) to remove liability protection from interactive computer services that serve or deliver advertisements based on users’ personal characteristics.

S. 4534, Online Freedom and Viewpoint Diversity Act: Would have amended Section 230(c) to limit the scope of liability protection to restricting access to content that promotes self-harm or terrorism or that is unlawful, rather than objectionable content.

S. 4632, Online Content Policy Modernization Act: Would have amended Section 230(c) to limit the scope of liability protection to restricting access to content that promotes self-harm or terrorism or that is unlawful, rather than objectionable content.

S. 4756, Don’t Push My Buttons Act: Would have amended Section 230(c) to remove liability protection from interactive computer services that collect information about users’ habits, preferences, or beliefs and that use an automated function to deliver content to the user based on the information collected about each user.

S. 4758, See Something, Say Something Online Act of 2020: Would have amended Section 230(e) to remove liability protection for failure to take reasonable steps to prevent or address suspicious transmission activity.

S. 5012, Holding Sexual Predators and Online Enablers Accountable Act of 2020: Would have amended Section 230(e) to state that the bill would have no effect on laws addressing sexual exploitation and other abuses of children.

S. 5020, N/A: Would have repealed Section 230.

S. 5085, N/A: Would have repealed Section 230.
Source: CRS using Congress.gov.
Notes: This listing includes bills whose major purposes included changing 47 U.S.C. §230. The table does not include bills that made only passing reference to Section 230. N/A indicates that a title was not available when the list was compiled on January 5, 2021.


Table B-2. Selected Legislation Addressing COVID-19 Misinformation Introduced in the 116th Congress

H.R. 133, Consolidated Appropriations Act: Authorized funding for public awareness campaigns to improve information about COVID-19 vaccines, including countering misinformation.

H.R. 6599, COVID Research Act of 2020: Would have provided for the coordination of research and development for pandemic disease prediction, forecasting, computing, and other purposes, including identifying challenges and developing strategies to address misinformation.

H.R. 6800; H.R. 8406; H.R. 925, Health and Economic Recovery Omnibus Emergency Solutions (HEROES) Act: Would have called for a study on the current understanding of the spread of COVID-19-related disinformation on the internet and social media platforms. It would have authorized $1 million for the National Science Foundation to contract with the National Academies of Sciences, Engineering, and Medicine to conduct the study.

H.R. 7484, Preventing China from Exploiting COVID-19 Act: Would have assessed the means and methods used by China to disseminate misinformation on social media platforms and through other English-based media.

H.R. 7546, Minority Community Public Health Emergency Response Act of 2020: Would have authorized appropriations for grants to provide public education related to the COVID-19 pandemic, including responses to misinformation.

H.R. 8061, Community Immunity During COVID-19 Act of 2020: Would have amended Sec. 317 of the Public Health Service Act to authorize grant funding to combat misinformation on the safety of vaccines, including those licensed to prevent, mitigate, or treat COVID-19.

H.R. 8203, COVID-19 Health Disparities Action Act of 2020: Would have called for public awareness campaigns to dispel misinformation about COVID-19 symptoms, testing, or treatment.

H.R. 8395, COVID-19 Disinformation Research and Reporting Act of 2020: Would have authorized $1 million for the National Science Foundation to contract with the National Academies of Sciences, Engineering, and Medicine to study the role of misinformation in the public response to COVID-19 and the role of social media in disseminating misinformation and disinformation.

H.R. 8966, COVID-19 Vaccine Awareness Support Act of 2020: Would have authorized funding for public awareness campaigns to improve information about the availability of COVID-19 vaccines, including countering misinformation and disinformation.

S. 3669, COVID-19 International Response and Recovery Act of 2020: Would have authorized $10 million for the U.S. Agency for Global Media to enhance investigative and specialized reporting on COVID-19, expand efforts to counter COVID-19 disinformation in its media markets, increase staff training, and increase staff and resources to provide appropriate research and support.

S. 4262, COVID-19 Health Disparities Action Act of 2020: Would have called for public awareness campaigns to dispel misinformation about COVID-19 symptoms, testing, or treatment.

S. 4499, COVID-19 Misinformation and Disinformation Task Force Act of 2020: Would have established a federal interagency COVID-19 misinformation and disinformation task force.
S. 4507, GET CARE Act of 2020: Would have amended the Public Health Service Act to add Sec. 230B, which would have authorized grant funding to carry out a national, evidence-based campaign to increase awareness of the importance of seeking preventive care during the COVID-19 pandemic, including combating misinformation.

S. 4732, COVID-19 Disinformation Research and Reporting Act of 2020: Would have authorized $1 million for the National Science Foundation to contract with the National Academies of Sciences, Engineering, and Medicine to study the role of misinformation in the public response to COVID-19 and the role of social media in promoting the spread of false information.

S. 4737, Community Immunity During COVID-19 Act of 2020: Would have amended Sec. 317 of the Public Health Service Act to authorize grant funding to combat misinformation on the safety of vaccines, including those licensed to prevent, mitigate, or treat COVID-19.

S. 4800, Health and Economic Recovery Omnibus Emergency Solutions (HEROES) Act: Would have called for a study on the current understanding of the spread of COVID-19-related disinformation on the internet and social media platforms. It would also have authorized $1 million for the National Science Foundation to contract with the National Academies of Sciences, Engineering, and Medicine to conduct the study.

S. 4958, COVID-19 Vaccine Awareness Support Act of 2020: Would have authorized funding for public awareness campaigns to improve information about the availability of COVID-19 vaccines, including countering misinformation and disinformation.
Source: CRS using Congress.gov.
Notes: The listed bills were introduced after January 1, 2020; the list was compiled on January 5, 2021. Only bills that specify actions to be taken specifically about the spread of COVID-19 misinformation are listed. If a bill had the same title as another bill in the same legislative body and the section on COVID-19 misinformation was the same, the bill numbers were grouped together.

Author Information

Jason A. Gallo
Section Research Manager

Clare Y. Cho
Analyst in Industrial Organization and Business



Acknowledgments
Rita Tehan, Senior Research Librarian, compiled legislation in the 116th Congress addressing Section 230 and COVID-19 misinformation.


Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress. Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the United States Government, are not subject to copyright protection in the United States. Any CRS Report may be reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you wish to copy or otherwise use copyrighted material.

Congressional Research Service
R46662 · VERSION 1 · NEW