
Updated December 19, 2019
Defense Primer: U.S. Policy on Lethal Autonomous
Weapon Systems
Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system. Although these systems generally do not yet exist, it is believed they would enable military operations in communications-degraded or -denied environments in which traditional systems may not be able to operate.

Contrary to a number of news reports, U.S. policy does not prohibit the development or employment of LAWS. Although the United States does not currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS in the future if potential U.S. adversaries choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.

Developments in both autonomous weapons technology and international discussions of LAWS could hold implications for congressional oversight, defense investments, military concepts of operations, treaty-making, and the future of war.

U.S. Policy
Definitions. There is no agreed definition of lethal autonomous weapon systems that is used in international fora. However, Department of Defense Directive (DODD) 3000.09 (the directive), which establishes U.S. policy on autonomy in weapon systems, provides definitions for different categories of autonomous weapon systems for the purposes of the U.S. military. These definitions are principally grounded in the role of the human operator with regard to target selection and engagement decisions, rather than in the technological sophistication of the weapon system.

DODD 3000.09 defines LAWS as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.” This concept of autonomy is also known as “human out of the loop” or “full autonomy.” The directive contrasts LAWS with human-supervised, or “human on the loop,” autonomous weapon systems, in which operators have the ability to monitor and halt a weapon’s target engagement. Another category is semi-autonomous, or “human in the loop,” weapon systems that “only engage individual targets or specific target groups that have been selected by a human operator.” Semi-autonomous weapons include so-called “fire and forget” weapons, such as certain types of guided missiles, that deliver effects to human-identified targets using autonomous functions.

The directive does not cover “autonomous or semi-autonomous cyberspace systems for cyberspace operations; unarmed, unmanned platforms; unguided munitions; munitions manually guided by the operator (e.g., laser- or wire-guided munitions); mines; [and] unexploded explosive ordnance,” nor subject them to its guidelines.
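To make the operator-role distinctions above concrete, the following is a minimal, hypothetical sketch in Python. It is not drawn from DODD 3000.09 or any DoD system; the class names, fields, and classification logic are illustrative assumptions showing how the three categories turn on whether a human selects targets and whether a human can monitor and halt an engagement.

```python
from dataclasses import dataclass
from enum import Enum, auto

class AutonomyCategory(Enum):
    """Categories drawn from DODD 3000.09's operator-role definitions."""
    SEMI_AUTONOMOUS = auto()    # "human in the loop": a human selects the targets
    HUMAN_SUPERVISED = auto()   # "human on the loop": a human can monitor and halt
    LETHAL_AUTONOMOUS = auto()  # "human out of the loop": no further human intervention

@dataclass
class WeaponSystemProfile:
    """Hypothetical descriptor; the fields are illustrative, not from the directive."""
    human_selects_targets: bool       # operator picks individual targets or target groups
    human_can_monitor_and_halt: bool  # operator can stop an engagement in progress

def classify(profile: WeaponSystemProfile) -> AutonomyCategory:
    """Map an operator-role profile onto the directive's three categories."""
    if profile.human_selects_targets:
        return AutonomyCategory.SEMI_AUTONOMOUS
    if profile.human_can_monitor_and_halt:
        return AutonomyCategory.HUMAN_SUPERVISED
    return AutonomyCategory.LETHAL_AUTONOMOUS

# Example: a "fire and forget" guided missile engaging a human-identified target
# is semi-autonomous, however sophisticated its terminal guidance.
fire_and_forget = WeaponSystemProfile(human_selects_targets=True,
                                      human_can_monitor_and_halt=False)
assert classify(fire_and_forget) is AutonomyCategory.SEMI_AUTONOMOUS
```

The ordering of the checks mirrors the directive's emphasis: human target selection, not technological sophistication, is what makes a system semi-autonomous.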
Role of human operator. DODD 3000.09 requires that all systems, including LAWS, be designed to “allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” As noted in an August 2018 U.S. government white paper, “‘appropriate’ is a flexible term that reflects the fact that there is not a fixed, one-size-fits-all level of human judgment that should be applied to every context. What is ‘appropriate’ can differ across weapon systems, domains of warfare, types of warfare, operational contexts, and even across different functions in a weapon system.”

Furthermore, “human judgment over the use of force” does not require manual human “control” of the weapon system, as is often reported, but rather broader human involvement in decisions about how, when, where, and why the weapon will be employed. This includes a human determination that the weapon will be used “with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement.”

To aid this determination, DODD 3000.09 requires that “[a]dequate training, [tactics, techniques, and procedures], and doctrine are available, periodically reviewed, and used by system operators and commanders to understand the functioning, capabilities, and limitations of the system’s autonomy in realistic operational conditions.” The directive also requires that the weapon’s human-machine interface be “readily understandable to trained operators” so they can make informed decisions regarding the weapon’s use.

Weapons review process. DODD 3000.09 requires that the software and hardware of all systems, including lethal autonomous weapons, be tested and evaluated to ensure they function as anticipated in realistic operational environments against adaptive adversaries; complete engagements in a timeframe consistent with commander and operator intentions and, if unable to do so, terminate engagements or seek additional human operator input before continuing the engagement; and are sufficiently robust to minimize failures that could lead to unintended engagements or to loss of control of the system to unauthorized parties.

Any changes to the system’s operating state (for example, due to machine learning) would require the system to go through testing and evaluation again to ensure that it has retained its safety features and ability to operate as intended.
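This re-testing requirement can be pictured as a simple verification gate: if the operating state that was tested changes, the earlier evaluation no longer applies. The sketch below is a hypothetical illustration only; the fingerprinting approach, field names, and trigger logic are assumptions and do not appear in DODD 3000.09.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class OperatingState:
    """Hypothetical snapshot of what test and evaluation certified."""
    software_version: str      # e.g., release tag of the autonomy software
    model_weights_digest: str  # e.g., hash of any machine-learned parameters

    def fingerprint(self) -> str:
        blob = f"{self.software_version}:{self.model_weights_digest}".encode()
        return hashlib.sha256(blob).hexdigest()

def retest_required(certified: OperatingState, current: OperatingState) -> bool:
    """Any change to the certified operating state (for example, updated
    machine-learned weights) invalidates the earlier test and evaluation."""
    return certified.fingerprint() != current.fingerprint()

# Example: a post-fielding model update would send the system back through T&E.
baseline = OperatingState(software_version="1.0.0", model_weights_digest="abc123")
updated = OperatingState(software_version="1.0.0", model_weights_digest="def456")
assert retest_required(baseline, updated)
```

The directive leaves the mechanics of detecting such changes to the testing and evaluation community; the hash comparison here simply stands in for "any change to the certified state."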
Senior-level review. In addition to the standard weapons review process, a secondary senior-level review is required for LAWS and certain types of semi-autonomous and human-supervised autonomous weapons that deliver lethal effects. This review requires the Under Secretary of Defense for Policy, the Chairman of the Joint Chiefs of Staff, and either the Under Secretary of Defense for Acquisition and Sustainment or the Under Secretary of Defense for Research and Engineering to approve the system “before formal development and again before fielding in accordance with the guidelines” listed in Enclosure 3 of the directive. In the event of “urgent military operational need,” this senior-level review may be waived by the Deputy Secretary of Defense “with the exception of the requirement for a legal review.”

The United States is not currently developing LAWS; therefore, no weapon system has gone through the senior-level review process to date.

International Discussions of LAWS
Since 2014, the United States has participated in international discussions of LAWS, sometimes colloquially referred to as “killer robots,” under the auspices of the United Nations Convention on Certain Conventional Weapons (UN CCW). In 2017, these discussions transitioned from an informal “meeting of experts” to a formal “Group of Governmental Experts” (GGE) tasked with examining the technological, military, ethical, and legal dimensions of LAWS. In 2018 and 2019, the GGE considered proposals by states parties to issue political declarations about LAWS, as well as proposals to regulate them.

In addition, approximately 25 countries and 100 nongovernmental organizations have called for a preemptive ban on LAWS due to ethical concerns, including concerns about operational risk, accountability for use, and compliance with the proportionality and distinction requirements of the law of war. The U.S. government does not currently support a ban on LAWS and has addressed ethical concerns about the systems in a March 2018 white paper, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons.” The paper notes that “automated target identification, tracking, selection, and engagement functions can allow weapons to strike military objectives more accurately and with less risk of collateral damage” or civilian casualties.

Although the UN CCW is a consensus-based forum, the outcome of its discussions could hold implications for U.S. policy on lethal autonomous weapons.

Potential Questions for Congress
To what extent are potential U.S. adversaries developing LAWS?
How should the United States balance LAWS research and development with ethical considerations?
What role should the United States play in UN CCW discussions of LAWS? Should the United States support the status quo, propose a political declaration, or advocate regulation of or a ban on LAWS?
If the United States chooses to develop LAWS, are current weapons review processes and legal standards for their employment in conflict sufficient?

CRS Products
CRS Report R45178, Artificial Intelligence and National Security, by Kelley M. Sayler
CRS Report R44466, Lethal Autonomous Weapon Systems: Issues for Congress, by Nathan J. Lucas
CRS In Focus IF11294, International Discussions Concerning Lethal Autonomous Weapon Systems, by Zelin Liu and Michael Moodie
CRS Report R45392, U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress, coordinated by Andrew Feickert

Other Resources
Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” Updated May 8, 2017, https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf.
U.S. Government, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons,” March 28, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/7C177AE5BC10B588C125825F004B06BE/$file/CCW_GGE.1_2018_WP.4.pdf.
U.S. Government, “Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” August 28, 2018, https://www.unog.ch/80256EDD006B8954/(httpAssets)/D1A2BA4B7B71D29FC12582F6004386EF/$file/2018_GGE+LAWS_August_Working+Paper_US.pdf.
United Nations Office at Geneva, “Background on Lethal Autonomous Weapons Systems in the CCW,” https://www.unog.ch/80256EE600585943/(httpPages)/8FA3C2562A60FF81C1257CE600393DF6?OpenDocument.
Defense Innovation Board, “AI Principles: Recommendations on the Ethical Use of Artificial Intelligence by the Department of Defense,” October 2019.

Kelley M. Sayler, Analyst in Advanced Technology and Global Security

IF11150
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to
congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress.
Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has
been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the
United States Government, are not subject to copyright protection in the United States. Any CRS Report may be
reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include
copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you
wish to copy or otherwise use copyrighted material.