
Updated May 15, 2023
Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems

Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system. Although these systems are not yet in widespread development, it is believed they would enable military operations in communications-degraded or -denied environments in which traditional systems may not be able to operate.

Contrary to a number of news reports, U.S. policy does not prohibit the development or employment of LAWS. Although the United States does not currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS in the future if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.

Developments in both autonomous weapons technology and international discussions of LAWS could hold implications for congressional oversight, defense investments, military concepts of operations, treaty-making, and the future of war.

U.S. Policy
Then-Deputy Secretary of Defense Ashton Carter issued DOD’s policy on autonomy in weapon systems, Department of Defense Directive (DODD) 3000.09 (the directive), in November 2012. DOD has since updated the directive—most recently in January 2023.

Definitions. There is no agreed definition of lethal autonomous weapon systems that is used in international fora. However, DODD 3000.09 provides definitions for different categories of autonomous weapon systems for the purposes of the U.S. military. These definitions are principally grounded in the role of the human operator with regard to target selection and engagement decisions, rather than in the technological sophistication of the weapon system.

DODD 3000.09 defines LAWS as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.” This concept of autonomy is also known as “human out of the loop” or “full autonomy.” The directive contrasts LAWS with human-supervised, or “human on the loop,” autonomous weapon systems, in which operators have the ability to monitor and halt a weapon’s target engagement. Another category is semi-autonomous, or “human in the loop,” weapon systems that “only engage individual targets or specific target groups that have been selected by a human operator.” Semi-autonomous weapons include so-called “fire and forget” weapons, such as certain types of guided missiles, that deliver effects to human-identified targets using autonomous functions.

The directive does not apply to autonomous or semi-autonomous cyberspace capabilities; unarmed platforms; unguided munitions; munitions manually guided by the operator (e.g., laser- or wire-guided munitions); mines; unexploded explosive ordnance; or autonomous or semi-autonomous systems that are not weapon systems, nor does it subject them to its guidelines.

Role of human operator. DODD 3000.09 requires that all systems, including LAWS, be designed to “allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” As noted in an August 2018 U.S. government white paper, “‘appropriate’ is a flexible term that reflects the fact that there is not a fixed, one-size-fits-all level of human judgment that should be applied to every context. What is ‘appropriate’ can differ across weapon systems, domains of warfare, types of warfare, operational contexts, and even across different functions in a weapon system.”

Furthermore, “human judgment over the use of force” does not require manual human “control” of the weapon system, as is often reported, but rather broader human involvement in decisions about how, when, where, and why the weapon will be employed. This includes a human determination that the weapon will be used “with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement.”

To aid this determination, DODD 3000.09 requires that “[a]dequate training, [tactics, techniques, and procedures], and doctrine are available, periodically reviewed, and used by system operators and commanders to understand the functioning, capabilities, and limitations of the system’s autonomy in realistic operational conditions.” The directive also requires that the weapon’s human-machine interface be “readily understandable to trained operators” so they can make informed decisions regarding the weapon’s use.

Weapons review process. DODD 3000.09 requires that the software and hardware of covered semi-autonomous and autonomous weapon systems be tested and evaluated to ensure they
    Function as anticipated in realistic operational environments against adaptive adversaries taking realistic and practicable countermeasures, [and] complete engagements within a timeframe and geographic area, as well as other relevant environmental and operational constraints, consistent with commander and operator intentions. If unable to do so, the systems will terminate the engagement or obtain additional operator input before continuing the engagement.

Systems must also be “sufficiently robust to minimize the probability and consequences of failures.” Any changes to the system’s operating state—for example, due to machine learning—would require the system to go through testing and evaluation again to ensure that it has retained its safety features and ability to operate as intended. The directive also notes that “the use of AI capabilities in autonomous or semi-autonomous systems will be consistent with the DOD AI Ethical Principles.”

Senior-level review. In addition to the standard weapons review process, a secondary senior-level review is required for covered autonomous and semi-autonomous systems. This review requires the Under Secretary of Defense for Policy (USD[P]), the Vice Chairman of the Joint Chiefs of Staff (VCJCS), and the Under Secretary of Defense for Research and Engineering (USD[R&E]) to approve the system before formal development. USD(P), VCJCS, and the Under Secretary of Defense for Acquisition and Sustainment (USD[A&S]) must then approve the system before fielding. In the event of “urgent military need,” this senior-level review may be waived by the Deputy Secretary of Defense. DODD 3000.09 additionally establishes the Autonomous Weapon System Working Group—composed of representatives of USD(P); USD(R&E); USD(A&S); DOD General Counsel; the Chief Digital and AI Officer; the Director, Operational Test and Evaluation; and the Chairman of the Joint Chiefs of Staff—to support and advise the senior-level review process.

International Discussions of LAWS
Since 2014, the United States has participated in international discussions of LAWS, sometimes colloquially referred to as “killer robots,” under the auspices of the United Nations Convention on Certain Conventional Weapons (U.N. CCW). In 2017, these discussions transitioned from an informal “meeting of experts” to a formal “Group of Governmental Experts” (GGE) tasked with examining the technological, military, ethical, and legal dimensions of LAWS. In 2018 and 2019, the GGE considered proposals by states parties to issue political declarations about LAWS, as well as proposals to regulate them.

In addition, approximately 30 countries and 165 nongovernmental organizations have called for a preemptive ban on LAWS due to ethical concerns, including concerns about operational risk, accountability for use, and compliance with the proportionality and distinction requirements of the law of war. The U.S. government does not currently support a ban on LAWS and has addressed ethical concerns about the systems in a March 2018 white paper, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons.” The paper notes that “automated target identification, tracking, selection, and engagement functions can allow weapons to strike military objectives more accurately and with less risk of collateral damage” or civilian casualties.

Although the U.N. CCW is a consensus-based forum, the outcome of its discussions could hold implications for U.S. policy on lethal autonomous weapons.

Potential Questions for Congress
• What is the status of U.S. competitors’ development of LAWS? Is the United States adequately investing in counter-autonomy capabilities?
• To what extent, if at all, should the United States initiate or accelerate its own development of LAWS?
• How should the United States balance LAWS research and development with ethical considerations? What, if any, restrictions should there be on DOD’s development or employment of LAWS?
• If the United States chooses to develop LAWS, are current weapons review processes and legal standards for their employment in conflict sufficient?
• What role should the United States play in U.N. CCW discussions of LAWS? Should the United States support the status quo, propose a political declaration, or advocate regulation of or a ban on LAWS?

CRS Products
CRS In Focus IF11294, International Discussions Concerning Lethal Autonomous Weapon Systems, by Kelley M. Sayler.
CRS Report R45178, Artificial Intelligence and National Security, by Kelley M. Sayler.
CRS Report R45392, U.S. Ground Forces Robotics and Autonomous Systems (RAS) and Artificial Intelligence (AI): Considerations for Congress, coordinated by Andrew Feickert.

Other Resources
Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” Updated January 25, 2023, https://www.esd.whs.mil/portals/54/documents/dd/issuances/dodd/300009p.pdf.
U.S. Government, “Humanitarian Benefits of Emerging Technologies in the Area of Lethal Autonomous Weapons,” March 28, 2018.
U.S. Government, “Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems,” August 28, 2018.
United Nations Office at Geneva, “Background on Lethal Autonomous Weapons Systems in the CCW.”

Kelley M. Sayler, Analyst in Advanced Technology and Global Security

IF11150

Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to
congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress.
Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has
been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the
United States Government, are not subject to copyright protection in the United States. Any CRS Report may be
reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include
copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you
wish to copy or otherwise use copyrighted material.