
Updated November 14, 2022
Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems
Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system. Although these systems are not yet in widespread development, it is believed they would enable military operations in communications-degraded or -denied environments in which traditional systems may not be able to operate.

Contrary to a number of news reports, U.S. policy does not prohibit the development or employment of LAWS. Although the United States does not currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS in the future if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.

Developments in both autonomous weapons technology and international discussions of LAWS could hold implications for congressional oversight, defense investments, military concepts of operations, treaty-making, and the future of war.

U.S. Policy
Then-Deputy Secretary of Defense Ashton Carter issued DOD’s policy on autonomy in weapon systems, Department of Defense Directive (DODD) 3000.09 (the directive), in November 2012. U.S. defense officials have stated that they plan to release an updated directive by the end of 2022.

Definitions. There is no agreed definition of lethal autonomous weapon systems that is used in international fora. However, DODD 3000.09 provides definitions for different categories of autonomous weapon systems for the purposes of the U.S. military. These definitions are principally grounded in the role of the human operator with regard to target selection and engagement decisions, rather than in the technological sophistication of the weapon system.

DODD 3000.09 defines LAWS as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.” This concept of autonomy is also known as “human out of the loop” or “full autonomy.” The directive contrasts LAWS with human-supervised, or “human on the loop,” autonomous weapon systems, in which operators have the ability to monitor and halt a weapon’s target engagement. Another category is semi-autonomous, or “human in the loop,” weapon systems that “only engage individual targets or specific target groups that have been selected by a human operator.” Semi-autonomous weapons include so-called “fire and forget” weapons, such as certain types of guided missiles, that deliver effects to human-identified targets using autonomous functions.

The directive does not cover “autonomous or semi-autonomous cyberspace systems for cyberspace operations; unarmed, unmanned platforms; unguided munitions; munitions manually guided by the operator (e.g., laser- or wire-guided munitions); mines; [and] unexploded explosive ordnance,” nor subject them to its guidelines.

Role of human operator. DODD 3000.09 requires that all systems, including LAWS, be designed to “allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” As noted in an August 2018 U.S. government white paper, “‘appropriate’ is a flexible term that reflects the fact that there is not a fixed, one-size-fits-all level of human judgment that should be applied to every context. What is ‘appropriate’ can differ across weapon systems, domains of warfare, types of warfare, operational contexts, and even across different functions in a weapon system.”

Furthermore, “human judgment over the use of force” does not require manual human “control” of the weapon system, as is often reported, but rather broader human involvement in decisions about how, when, where, and why the weapon will be employed. This includes a human determination that the weapon will be used “with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement.”

To aid this determination, DODD 3000.09 requires that “[a]dequate training, [tactics, techniques, and procedures], and doctrine are available, periodically reviewed, and used by system operators and commanders to understand the functioning, capabilities, and limitations of the system’s autonomy in realistic operational conditions.” The directive also requires that the weapon’s human-machine interface be “readily understandable to trained operators” so they can make informed decisions regarding the weapon’s use.

Weapons review process. DODD 3000.09 requires that the software and hardware of all systems, including lethal autonomous weapons, be tested and evaluated to ensure they
• function as anticipated in realistic operational environments against adaptive adversaries;
• complete engagements in a timeframe consistent with commander and operator intentions and, if unable to do so, terminate engagements or seek additional human operator input before continuing the engagement; and
• are sufficiently robust to minimize failures that could lead to unintended engagements or to loss of control of the system to unauthorized parties.

Any changes to the system’s operating state—for example, due to machine learning—would require the system to go through testing and evaluation again to ensure that it has retained its safety features and ability to operate as intended.

Senior-level review. In addition to the standard weapons review process, a secondary senior-level review is required for LAWS and certain types of semi-autonomous and human-supervised autonomous weapons that deliver lethal effects. This review requires the Under Secretary of Defense for Policy, the Chairman of the Joint Chiefs of Staff, and either the Under Secretary of Defense for Acquisition and Sustainment or the Under Secretary of Defense for Research and Engineering to approve the system “before formal development and again before fielding in accordance with the guidelines” listed in Enclosure 3 of the directive. In the event of “urgent military operational need,” this senior-level review may be waived by the Deputy Secretary of Defense “with the exception of the requirement for a legal review.” DOD is reportedly in the process of developing a handbook to guide senior leaders through this review process; however, as the United States is not currently known to be developing LAWS, no weapon system is known to have gone through the senior-level review process to date.

International Discussions of LAWS
Since 2014, the United States has participated in international discussions of LAWS, sometimes colloquially referred to as “killer robots,” under the auspices of the United Nations Convention on Certain Conventional Weapons (U.N. CCW). In 2017, these discussions transitioned from an informal “meeting of experts” to a formal “Group of Governmental Experts” (GGE) tasked with examining the technological, military, ethical, and legal dimensions of LAWS. In 2018 and 2019, the GGE considered proposals by states parties to issue political declarations about LAWS, as well as proposals to regulate them.

In addition, approximately 30 countries and 165 nongovernmental organizations have called for a preemptive ban on LAWS due to ethical concerns, including concerns about operational risk, accountability for use, and compliance with the proportionality and distinction requirements of the law of war. The U.S. government does not currently support a ban on LAWS and has addressed ethical concerns about the systems in a March 2018 white paper, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons.” The paper notes that “automated target identification, tracking, selection, and engagement functions can allow weapons to strike military objectives more accurately and with less risk of collateral damage” or civilian casualties.

Although the U.N. CCW is a consensus-based forum, the outcome of its discussions could hold implications for U.S. policy on lethal autonomous weapons.

Potential Questions for Congress
• What is the status of U.S. competitors’ development of LAWS? Is the United States adequately investing in counter-autonomy capabilities?
• To what extent, if at all, should the United States initiate or accelerate its own development of LAWS?
• How should the United States balance LAWS research and development with ethical considerations? What, if any, restrictions should there be on DOD’s development or employment of LAWS?
• If the United States chooses to develop LAWS, are current weapons review processes and legal standards for their employment in conflict sufficient?
• What role should the United States play in U.N. CCW discussions of LAWS? Should the United States support the status quo, propose a political declaration, or advocate regulation of or a ban on LAWS?

CRS Products
CRS In Focus IF11294, International Discussions Concerning Lethal Autonomous Weapon Systems, by Kelley M. Sayler.
CRS Report R45178, Artificial Intelligence and National Security, by Kelley M. Sayler.
CRS Report R45392, U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress, coordinated by Andrew Feickert.

Other Resources
Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” Updated May 8, 2017, https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf.
U.S. Government, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons,” March 28, 2018.
U.S. Government, “Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” August 28, 2018.
United Nations Office at Geneva, “Background on Lethal Autonomous Weapons Systems in the CCW.”

Kelley M. Sayler, Analyst in Advanced Technology and Global Security

IF11150
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to
congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress.
Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has
been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the
United States Government, are not subject to copyright protection in the United States. Any CRS Report may be
reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include
copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you
wish to copy or otherwise use copyrighted material.
https://crsreports.congress.gov | IF11150 · VERSION 9 · UPDATED