Order Code RL33371
K-12 Education: Implementation Status
of the No Child Left Behind Act of 2001
(P.L. 107-110)
Updated March 26, 2008
Gail McCallion, Coordinator,
Richard N. Apling, Jeffrey J. Kuenzi,
Wayne C. Riddle, Rebecca R. Skinner,
and David P. Smole
Specialists in Education Policy
Domestic Social Policy Division

K-12 Education: Implementation Status
of the No Child Left Behind Act of 2001 (P.L. 107-110)
Summary
The No Child Left Behind Act of 2001 (NCLBA) amended and extended the
Elementary and Secondary Education Act of 1965 (ESEA). ESEA programs are
authorized through FY2008, and the 110th Congress is considering whether to amend
and extend the ESEA.
The NCLBA, signed into law on January 8, 2002, expanded requirements for
the use of standards and assessments to measure student academic achievement, and
it strengthened state, local educational agency (LEA), and school accountability
provisions related to student achievement and other outcomes. This report
summarizes the provisions and the implementation status to date of several major
NCLBA requirements as they relate to specific NCLBA programs, and it examines
some of the implementation issues that have arisen as a consequence of these
requirements. This report is divided by topic into eleven sections, following the
sequential order (to the extent feasible) of NCLBA provisions. Although the report
may be read in its entirety, each section is written to stand alone to assist readers who
may elect to read only about topics of particular interest.
Section 1 of the report examines new standard and assessment requirements
contained in Title I-A of the ESEA as amended by the NCLBA, as well as how these
requirements build upon preexisting requirements, and the timeline for their
implementation. Section 2 focuses on new requirements regarding the National
Assessment of Educational Progress (NAEP) adopted by the NCLBA. Section 3
addresses the implementation of adequate yearly progress (AYP) requirements
adopted in the NCLBA. Section 4 looks at the outcome accountability requirements
included in the NCLBA. Section 5 examines NCLBA changes to ESEA provisions
regarding the education of limited English proficient (LEP) students. Section 6
discusses NCLBA changes regarding teacher quality issues. Section 7 focuses on the
Reading First program, newly authorized by the NCLBA. Section 8 discusses
NCLBA changes that strengthen parental involvement requirements. Section 9
addresses NCLBA provisions requiring LEAs receiving funding under the ESEA to
provide military recruiters with the same access to secondary school students that
they provide to institutions of higher education or prospective employers. Section
10 addresses NCLBA changes to ESEA requirements applicable to the participation
of children enrolled in private schools. Section 11 discusses the unsafe school choice
option established by the NCLBA.
This report will be updated periodically.

Contents
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Section 1. Standards and Assessments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Bush Administration Reauthorization Proposals . . . . . . . . . . . . . . . . . . 9
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Section 2. National Assessment of Educational Progress . . . . . . . . . . . . . . . . . . 10
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Bush Administration Reauthorization Proposals . . . . . . . . . . . . . . . . . 12
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Section 3. Adequate Yearly Progress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Section 4. Outcome Accountability Under ESEA Title I-A . . . . . . . . . . . . . . . . 22
Rewards, Support, and Recognition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
School and LEA Improvement, Corrective Action, and Restructuring . . . . 23
Schools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
LEAs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Assistance for Local School Improvement . . . . . . . . . . . . . . . . . . . . . 26
Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Data on Schools and LEAs Failing to Meet AYP and
Identified for Improvement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
School Choice and Supplemental Educational Services . . . . . . . . . . . 31
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Identification of Schools and LEAs for Improvement . . . . . . . . . . . . . 32
Public School Choice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Supplemental Educational Services . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Section 5. Education of Limited English Proficient Pupils . . . . . . . . . . . . . . . . 35
Language Acquisition State Grants . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
LEP Assessments and Accountability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Section 6. Teacher Quality . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Requirement That All Teachers Be Highly Qualified . . . . . . . . . . . . . . . . . 43
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Teacher and Principal Training and Recruiting Fund . . . . . . . . . . . . . . . . . 46
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47

Section 7. Reading Skills Improvement Grants . . . . . . . . . . . . . . . . . . . . . . . . . 48
Reading First . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Scientifically Based Research Requirements in the No Child
Left Behind Act . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Limitations of Existing Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Identifying Relevant Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
Section 8. Parental Involvement Requirements . . . . . . . . . . . . . . . . . . . . . . . . . 55
ESEA Title I, Part A Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Section 1118, “Parental Involvement” . . . . . . . . . . . . . . . . . . . . . . . . . 56
Significant Title I-A Parental Involvement Requirements
Outside Section 1118 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Examples of Other ESEA Parental Involvement Requirements . . . . . . . . . 60
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
Section 9. Military Recruitment at Secondary Schools . . . . . . . . . . . . . . . . . . . 63
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Section 10. Participation of Children Enrolled in Private Schools . . . . . . . . . . . 67
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Section 11. Unsafe School Choice Option . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
Implementation Status . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Implementation Issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
List of Tables
Table 1. ESEA Title I-A Accountability Stages for Schools and LEAs . . . . . . . 24
Table 2. Reported Percentage of Public Schools and Local Educational
Agencies Failing to Make Adequate Yearly Progress on the
Basis of Spring 2006 Assessment Results . . . . . . . . . . . . . . . . . . . . . . . . . . 29
Table 3. Number of Schools and LEAs Identified for Improvement . . . . . . . . . 31

K-12 Education: Implementation Status
of the No Child Left Behind Act of 2001
(P.L. 107-110)
Introduction
The No Child Left Behind Act of 2001 (NCLBA) amended and extended the
Elementary and Secondary Education Act of 1965 (ESEA). ESEA programs are
authorized through FY2008, and the 110th Congress is considering whether to amend
and extend the ESEA.
The NCLB, signed into law on January 8, 2002, expanded requirements for the
use of standards and assessments to measure student academic achievement; and it
strengthened state, local educational agency (LEA), and school accountability
provisions related to student achievement and other outcomes. This report
summarizes the provisions and implementation status to date of several major NCLB
requirements as they relate to specific NCLB programs, and it examines some of the
implementation issues that have arisen as a consequence of these requirements. This
report is divided by topic into eleven sections, following the sequential order (to the
extent feasible) of NCLB provisions. Although the report may be read in its entirety,
each section is written to stand alone to assist readers who may elect to read only
about topics of particular interest. Selected issues raised in each section are briefly
highlighted in the following synopsis.
Section 1 of the report examines new standard and assessment requirements
contained in Title I-A of the ESEA, as amended by the NCLB, as well as how these
requirements build upon preexisting requirements, and the timeline for their
adoption. Section 1 also addresses two major questions regarding the
implementation of standard and assessment requirements: what is the additional
financial cost borne by states as they implement the newly required annual
assessments in additional grades and incorporate new standards and assessments at
three grade levels in science; and what are the potential educational costs and benefits
of these changes?
Section 2 of the report focuses on the National Assessment of Educational
Progress (NAEP), a federally funded series of assessments of the academic
performance of elementary and secondary students in the United States. The NCLB
includes a new requirement that states wishing to remain eligible for grants under
ESEA Title I-A participate in biennial state NAEP tests in 4th and 8th grade. Section
2 addresses three implementation issues: will these new requirements increase the
influence of NAEP on state standards and assessments; will problems arise as a result
of the new mandatory requirement for participation of states, but continuing

voluntary participation of pupils; and can NAEP results be used to “confirm” state
test score trends?
Section 3 of the report addresses the implementation of adequate yearly progress
(AYP) requirements that were made more challenging under the NCLB. The primary
purpose of AYP requirements is to serve as the basis for identifying schools and
LEAs where performance is inadequate, so that these inadequacies may be addressed,
first through provision of increased support and, ultimately, through a variety of
“corrective actions.” The NCLB requires states to make concrete progress toward
meeting an ultimate goal of all pupils reaching a proficient or advanced level of
achievement within 12 years. Some of the implementation issues discussed in
Section 3 include whether the Department of Education’s (ED) reviews of state AYP
policies are rigorous, transparent, appropriate, and consistent; whether the goal of all
students reaching a proficient or higher level of achievement within 12 years will
lead to states weakening pupil achievement standards; whether “too many” schools
and LEAs are failing to meet AYP goals; whether the wide variations in state
standards for pupil achievement are undermining AYP provisions; whether some
states are effectively excluding some disadvantaged pupil groups from being
considered in school-level AYP determinations by setting minimum group sizes for
these pupil groups too high; and whether the NCLB requirement for disaggregation
of pupil groups in AYP determinations makes it too difficult for schools or LEAs
with diverse populations to meet AYP standards.
Section 4 of the report looks at new outcome accountability requirements
established in the NCLB. LEAs and schools are being held to higher accountability
standards under the NCLB. This section discusses in detail the system of rewards
and sanctions that has been established to hold Title I-A schools and LEAs
accountable for their performance. Some of the implementation issues discussed in
Section 4 include the impact of ED’s approval of changes in state accountability
plans; difficulties that have arisen in implementing the requirement that students
attending schools identified for school improvement be provided public school
choice; and difficulties that have arisen in the implementation of the requirement that
students attending schools identified for a second year of school improvement,
corrective action, or restructuring be offered supplemental educational services.
Section 5 of the report examines NCLB changes to ESEA provisions regarding
the education of limited English proficient (LEP) students. Among other things, the
NCLB changed two grant programs for LEP students from competitive grants to
formula grants, and it added new assessment and accountability provisions, including
requirements regarding English language assessments. Implementation issues that
have arisen include the effect on states of year-to-year fluctuations in funding under
the newly adopted formula grant programs, the shortage of qualified bilingual
teachers, and difficulties in meeting the new requirements for English language
assessments.
Section 6 of the report discusses NCLB changes regarding teacher quality
issues. Major changes include a requirement that all teachers be highly qualified by
the end of the 2005-2006 school year, and the replacement of the ESEA Eisenhower
Professional Development and Class Size Reduction programs with a new Teacher
and Principal Training and Recruiting Fund. One implementation issue concerns the

scope and application of the highly qualified teacher requirements (HQT), the
meaning of some of the HQT requirements, and the ability of different kinds of
districts to meet them. A second implementation issue concerns the impact of
enhanced flexibility in the new Teacher and Principal Training and Recruiting Fund.
Concerns have been raised that this may result in a shift away from the emphasis on
math and science professional development in the Eisenhower program.
Section 7 of the report focuses on the Reading First (RF) program. The RF
program was newly authorized by the NCLB. The program is intended to incorporate
the latest scientific understanding of what works in teaching reading to improve and
expand K-3 reading programs to address concerns about student reading achievement
and to reach children at younger ages. Implementation issues that have arisen include
criticisms by some of the perceived “overprescriptiveness” of the RF program as it
has been administered, perceptions of insufficient transparency regarding ED’s
requirements of states, and allegations of conflicts of interest between consultants to
the program and commercial reading and assessment companies. ED’s Office of
Inspector General (OIG) has issued several critical audit reports on Reading First.
In addition, the House Committee on Education and Labor has held oversight hearings
on Reading First, and the Senate Committee on Health, Education, Labor, and Pensions
issued a report on Reading First Technical Assistance directors with financial ties
to publishers.
Section 8 of the report discusses NCLB changes that strengthen parental
involvement requirements. Examples of these changes include new requirements for
school-parent compacts, a requirement that 1% of LEA Title I-A grants be set aside
for parental involvement activities, and a requirement that states and LEAs
participating in Title I-A provide aggregate assessment results and certain other data
to parents and the public through report cards. National studies on implementation
issues of current parental involvement provisions are not yet available. However,
studies of previous parental involvement requirements found that 25% of Title I-A
schools had not implemented school-parent compacts, that parents remained less
involved with their children’s schools than desirable, and that parents were not
receiving the desired level or types of information from school report cards.
Section 9 of the report addresses NCLB provisions requiring LEAs receiving
funding under the ESEA to provide military recruiters with the same access to
secondary school students that they provide to postsecondary institutions or
prospective employers. Implementation issues that have arisen concern some
confusion and controversy over the implementation of the requirements, in part due
to provisions permitting secondary school students or their parents to choose to notify
the LEA that they are opting out of the disclosure of this information. Among other
concerns, ED has stated that some LEAs have misapplied the parental “opt out”
requirements by requiring written parental consent before providing information to
military recruiters, thereby creating an “opt in” rather than an “opt out” policy. On
the other hand, some parent groups have criticized schools for failing to make the
“opt out” option clearer to parents.
Section 10 of the report addresses NCLB changes to ESEA requirements
applicable to the participation of children enrolled in private schools. The most
significant changes address how services to eligible children must be arranged
between LEAs and the private schools in which eligible children are enrolled; the

specific programs under which services must be provided; and how the effectiveness
of these services must be assessed. Implementation issues that have arisen include
concerns regarding the timeliness of LEA consultations with private school officials,
and concerns regarding the availability of funding to serve eligible private school
students.
Section 11 of the report discusses the unsafe school choice option established
by the NCLB. This new provision requires states to establish statewide policies that
provide students attending persistently dangerous public schools, as well as students
who are victims of a violent crime on school grounds, an opportunity to transfer to
another school within the same LEA. In implementing this
provision, concerns have been raised because — although most states have
established criteria for identifying unsafe schools and have established student
transfer policies — few schools have actually been identified as unsafe.
Section 1. Standards and Assessments1
The provisions of the Elementary and Secondary Education Act (ESEA) Title
I-A, as amended by the No Child Left Behind Act (NCLB), regarding standards and
assessments reinforced and expanded upon provisions initially adopted in the
Improving America’s Schools Act of 1994 (IASA). These standard and assessment
provisions are linked to the receipt of financial assistance under ESEA Title I-A; that
is, they apply only to states wishing to maintain eligibility for Title I-A grants.2
Requirements Initially Adopted Before the NCLB. The IASA of 1994
required states to adopt standards and assessments in the subjects of reading/language
arts and mathematics at three grade levels — at least once in each of the grade ranges
of 3-5, 6-9, and 10-12. States wishing to remain eligible for Title I-A grants are
required to develop or adopt curriculum content standards, as well as academic
achievement standards and assessments tied to the standards. States were given
several years to meet the IASA requirements; the full system of standards and
assessments was not required to be in place until the 2000-2001 school year. These
requirements continue under the NCLB.
Standard and Assessment Requirements Newly Adopted Under the
NCLB. In addition to the IASA’s requirement for states to implement standards and
assessments in reading/language arts and mathematics at three grade levels, the
NCLB required states participating in ESEA Title I-A to
1 This section of the report was written by Wayne C. Riddle. For additional information on
this topic, see CRS Report RL31407, Educational Testing: Implementation of ESEA Title
I-A Requirements Under the No Child Left Behind Act, by Wayne C. Riddle.
2 This currently includes all states (including the District of Columbia and Puerto Rico,
which are generally treated as “states” under ESEA programs).

! develop and adopt standards and assessments in the subjects of
mathematics and reading/language arts in each of grades 3-8 by the
end of the 2005-2006 school year, assuming certain minimum levels
of annual federal funding are provided for state assessment grants;

! adopt standards in science (at three grade levels) by the end of the
2005-2006 school year; and
! adopt assessments in science (at three grade levels) by the end of the
2007-2008 school year.
To the extent practicable, limited English proficient (LEP) pupils are to be
assessed in the language and form most likely to yield accurate and reliable
information on what they know and can do in academic content areas (in subjects
other than English itself). However, pupils who have attended schools in the United
States (excluding Puerto Rico) for three or more consecutive school years are to be
assessed in English.3 In addition, “reasonable” adaptations and accommodations are
to be provided for students with disabilities, consistent with the provisions of the
Individuals with Disabilities Education Act (IDEA).4
Achievement standards must establish at least three performance levels for all
pupils — advanced, proficient, and partially proficient (or basic). If no agency or
entity in a state has authority to establish statewide standards or assessments (as is
generally assumed to be the case for Iowa and Nebraska), then the state may adopt
either (a) statewide standards and assessments applicable only to Title I-A pupils and
programs, or (b) a policy providing that each LEA receiving Title I-A grants will
adopt standards and assessments that meet the requirements of Title I-A and are
applicable to all pupils served by each such LEA.
State educational agencies (SEAs) must provide evidence from a test publisher
or other relevant source that their assessments are of adequate technical quality for
the purposes required under Title I-A. Several statutory constraints have been placed
on the authority of the Secretary of Education to enforce these standard and
assessment requirements. First, the ESEA states that nothing in Title I shall be
construed to authorize any federal official or agency to “mandate, direct, or control
a State, local educational agency, or school’s specific instructional content, academic
achievement standards and assessments, curriculum, or program of instruction”
(Sections 1905, 9526, and 9527). Second, states may not be required to submit their
standards to the U.S. Secretary of Education (Section 1111(b)(1)(A)) or to have their
content or achievement standards approved or certified by the federal government
(Section 9527(c)) in order to receive funds under the ESEA, other than the (limited)
review necessary in order to determine whether the state meets the Title I-A technical
3 LEAs may continue to administer assessments to pupils in non-English languages for up
to a total of five years if, on a case-by-case basis, they determine that this would likely yield
more accurate information on what the students know and can do.
4 For further information on this and related topics, see CRS Report RL32913, The
Individuals with Disabilities Education Act (IDEA): Interactions with Selected Provisions
of the No Child Left Behind Act (NCLBA), by Richard N. Apling and Nancy Lee Jones.

requirements. Finally, no state plan may be disapproved by ED on the basis of
specific content or achievement standards, or assessment items or instruments
(Section 1111(e)(1)(F)). Assessment results must be provided to LEAs, schools, and
teachers before the beginning of the subsequent school year so that they might be
available in a timely manner to make adequate yearly progress determinations for
schools and LEAs (discussed later in this report).
In addition, as discussed later in this report, states are to provide that their LEAs
will annually assess the English language proficiency of their LEP pupils, including
pupils’ oral, reading, and writing skills. Finally, as is also discussed later in this
report, the NCLB requires states receiving grants under ESEA Title I-A to participate
in biennial state-level administrations of the National Assessment of Educational
Progress in 4th and 8th grade reading and mathematics. The timing of several of the
key requirements listed above is summarized in the following box.
Schedule for Implementation
of NCLB Standard and Assessment Requirements
School Year 2000-2001
! States were to have adopted content and performance standards, plus
assessments linked to these, at three grade levels in mathematics and
reading. These requirements were included in the 1994 reauthorization
of the ESEA.
School Year 2002-2003
! States were required to begin to annually assess the English language
proficiency of LEP pupils.
! States were first required to participate in biennial administration of the
NAEP.
School Year 2005-2006
! Standards-based assessments in reading and mathematics were to be
administered to pupils in each of grades 3-8 by the end of this year.
! States were required to adopt content and achievement standards at
three grade levels in science by the end of this year.
School Year 2007-2008
! States must begin to administer assessments at three grade levels in
science by the end of this year.
The ESEA authorizes (in Title VI-A-1) annual grants to the states to help pay
the costs of meeting the Title I-A standard and assessment requirements added by the
NCLB (i.e., assessments in science at three grade levels and at grades 3-8 in
mathematics and reading). These grants may be used by states for development of
standards and assessments or, if those have been developed, for assessment
administration and such related activities as developing or improving assessments of

the English language proficiency of LEP pupils. Enforcement of the state assessment
requirements that were newly adopted under the NCLB has been contingent upon the
appropriation of minimum annual amounts for these state assessment grants; their
implementation may be delayed by one year for each year that the following
minimum amounts are not appropriated: FY2002 — $370 million; FY2003 — $380
million; FY2004 — $390 million; and each of FY2005-FY2008 — $400 million.
For each of FY2002-FY2008, at least the minimum amounts have been appropriated
for these grants.5
Implementation Status. Although the focus of this report is on the
implementation of major new requirements of the NCLB, much of the
implementation activity regarding standards and assessments during the initial years
following enactment of the NCLB was focused on the extended process of
implementing the requirements adopted under the previous version of the ESEA, the
IASA. These requirements for standards and assessments in reading/language arts
and mathematics at three grade levels were supposed to have been met by the end of
the 2000-2001 school year, but few states met that initial deadline.6
In their reviews of state systems of standards and assessments, peer reviewers
and ED staff have been considering only various forms of “evidence” submitted by
the states that are intended to document that state standards and assessments meet the
specific Title I-A requirements; that is, they are not reviewing the standards and
assessments themselves. The peer reviews identified a number of common problem
areas, including the following: (a) a lack of adequate accommodation or
incorporation of alternate assessments for LEP and disabled pupils, (b) insufficient
documentation of the technical quality of assessments, and (c) inadequate timelines
for implementation of the assessments.
The Department’s final regulations on the NCLB standard and assessment
requirements were published in the Federal Register on July 5, 2002.7 ED published
supplementary “non-regulatory draft guidance” on all of the standard and assessment
requirements, as well as those related to NAEP participation, on March 10, 2003.8
This document was intended to provide more detailed guidance consistent with the
regulations discussed above. More recently, ED officials have published regulations
and other policy guidance on participation rates, plus the treatment of LEP pupils and
certain pupils with disabilities in assessments.9
5 The ESEA also authorizes competitive grants to states for the development of enhanced
assessment instruments. Funds appropriated each year for state assessment grants that are
in excess of the “trigger” amounts for assessment development grants listed above are to be
used for enhanced assessment grants.
6 See [http://www.ed.gov/admins/lead/account/finalassess/index.html] for details regarding
state compliance with the IASA assessment requirements.
7 Department of Education, “Elementary and Secondary Education: Disadvantaged children;
academic achievement improvement,” 67 Federal Register 45038-45047, July 5, 2002.
8 Available at [http://www.ed.gov/policy/elsec/guid/saaguidance03.doc].
9 For details, see CRS Report RL31407, Educational Testing: Implementation of ESEA Title
I-A Requirements Under the No Child Left Behind Act, by Wayne C. Riddle.

Beginning in the spring of 2006, ED has been conducting peer reviews of each
state’s assessment program to determine whether it meets the NCLB requirements to test
pupils in each of grades 3-8 in reading and mathematics, and to adopt content and
achievement standards in science. A letter sent to chief state school officers in April
2006 by the Assistant Secretary for Elementary and Secondary Education10 describes
the current categories of results from the state reviews. These categories, and the
number of states in each category as of January 3, 2008, include the following:
! Full Approval. Meets all statutory and regulatory requirements (24
states: Alabama, Alaska, Arkansas, Arizona, Delaware, Florida,
Georgia, Idaho, Iowa, Kansas, Kentucky, Maryland, Massachusetts,
Michigan, Missouri, Montana, North Dakota, Ohio, Oklahoma,
Pennsylvania, South Carolina, Tennessee, Virginia, and West
Virginia).
! Full Approval with Recommendations. Meets all statutory and
regulatory requirements, but ED makes selected recommendations
for improvement (three states: Indiana, North Carolina, and Utah).
! Approval Expected. “Evidence to date” suggests that the state’s
assessment system is fully compliant with the statutory and
regulatory requirements, but some elements of the system were not
complete as of July 1, 2006. The state must provide evidence of
compliance with remaining requirements before administering its
assessments for the 2006-2007 school year (five states:
Connecticut, Maine, New Mexico, New York, and Rhode Island,
plus the District of Columbia).
! Approval Pending. A limited number (generally one to three) of
fundamental components of the state assessment system fail to meet
the statutory or regulatory requirements (18 states: all of those not
listed in another category, plus Puerto Rico).
Finally, one state — Mississippi — has been granted a one-year waiver to meet the
assessment requirements, in recognition of delays arising from the 2005 Gulf Coast
hurricanes.
States in the last two categories above (Approval Pending and Not Approved)
face the possibility of loss of Title I-A administrative funds (25% in the case of the
two “not approved” states, 10% or 15% in the case of “approval pending” states),
plus the additional sanctions of limitations on approval of flexibility requests, and
heightened oversight by ED. According to ED, withheld funds (from the SEA)
would be distributed to LEAs in the state. In addition, states that persistently and
thoroughly fail to meet the standard and assessment requirements over an extended
period of time may be subject to elimination of their Title I-A grants
10 See [http://www.ed.gov/admins/lead/account/saapr3.pdf].

altogether, since they would be out of compliance with a basic program
requirement.11
Bush Administration Reauthorization Proposals. The Bush
Administration’s Reauthorization Blueprint,12 released in January 2007, contains a
proposal regarding the ESEA Title I-A state assessment provisions. Participating
states would be required to develop content and performance standards in English
and math covering 2 additional years of high school by 2010-2011 and assessments
linked to these standards by 2012-2013. The assessments would include a pair of
11th grade assessments of college readiness in reading and math. However, states
would be required only to report the results of these assessments, not to use them for
adequate yearly progress determinations.
Implementation Issues. Issues raised in the implementation of the ESEA
Title I-A standards and assessment requirements thus far include the following:
! What is the financial cost of developing and implementing the
required assessments, and to what extent have federal grants been
sufficient to pay for them?
The addition of requirements to conduct
annual assessments in at least four more grades than required
previously, and to include standards and assessments at three grade
levels in science, has required most states to significantly increase
their expenditures for standard and test development and
administration. It is very difficult, if not impossible, to specify all
of these potential costs with precision. The NCLB conference report
directs the Government Accountability Office (GAO) to conduct a
study of the costs to each state of developing and administering the
assessments required under Title I-A; however, no information is yet
available from the study. Studies by private organizations of the
costs of meeting the NCLB assessment requirements, and of whether
those costs exceed the aggregate level of assessment development
funds provided under the NCLB, have reached contradictory results.
! What are the likely educational benefits and costs of the expanded
Title I-A assessment requirements? The primary benefit from annual
administration of a consistent series of standards-based tests would
be the provision of timely information on the performance of pupils,
schools, and LEAs throughout most of the elementary and middle
school grades. The availability of such consistent annual assessment
results would be of value for both diagnostic and accountability
purposes. Arguably, additional assessment information will improve
the quality of the adequate yearly progress (AYP) determinations
11 Thus far, the sanction of withholding 25% of state administration funds for failure to meet
the 1994 assessment requirements has been applied at least twice, to Georgia in 2003 and
the District of Columbia in 2005, for failure to administer assessments linked to state
content standards.
12 The document is available from the Department of Education at [http://www.ed.gov/
policy/elsec/leg/nclb/buildingonresults.pdf].

that are based primarily on the assessments, and help determine
whether Title I-A is meeting its primary goals, such as reducing
achievement gaps between disadvantaged and other pupils. At the
same time, the expanded Title I-A assessment requirements might
lead to a variety of educational “costs,” or unintended consequences.
One such cost would be expanded federal influence on state and
local education policies; for example, assessment requirements
attached to an aid program focused on disadvantaged pupils may
broadly influence policies regarding standards, assessments, and
accountability affecting all pupils in participating states. In the
majority of states that did not previously administer standards-based
assessments in each of grades 3-8, that policy may have resulted
primarily from cost or time constraints, or the states may have
determined that annual testing of this sort is not educationally
appropriate, or at least that its benefits are not equal to the relevant
costs. These costs may include an increased risk of “overemphasis”
on preparation for the tests, especially if the tests do not adequately
assess the full range of knowledge and skills that schools are
expected to impart.
Section 2. National Assessment
of Educational Progress13
The National Assessment of Educational Progress (NAEP) is a federally funded
series of assessments of the academic performance of elementary and secondary
students in the United States. NAEP tests generally are administered to a sample of
public and private school pupils in grades 4, 8, and 12 in a variety of subjects,
including reading, mathematics, science, writing, and less frequently, geography,
history, civics, social studies, and the arts. NAEP assessments have been conducted
since 1969.
NAEP is administered by the National Center for Education Statistics (NCES),
with oversight and several aspects of policy established by the National Assessment
Governing Board (NAGB). Both entities are part of the U.S. Department of
Education. Since 1983, the assessment has been developed primarily under a
cooperative agreement with the Educational Testing Service (ETS), a private, non-
profit organization that also develops and administers such assessments as the
Scholastic Assessment Test (SAT).
NAEP consists of two separate groups of tests. One is the main assessment, in
which test items (questions) are revised over time in both content and structure to
reflect more current instructional standards and practices. The main assessment also
reports pupil scores in relation to performance levels, which are standards for pupil
achievement that are based on score thresholds set by NAGB. The performance
13 This section was written by Wayne C. Riddle. For additional information on this topic,
see CRS Report RL31407, Educational Testing: Implementation of ESEA Title I-A
Requirements Under the No Child Left Behind Act, by Wayne C. Riddle.

levels are considered to be “developmental,” and are intended to place NAEP scores
into context. They are based on determinations by NAGB of what pupils should
know and be able to do at basic (“partial mastery”), proficient (“solid academic
performance”), and advanced (“superior performance”) levels with respect to
challenging subject matter. The second group of NAEP tests forms the long-term
trend assessment, which monitors trends over time in math and reading
achievement.14
All NAEP tests are administered to only a representative sample of pupils
enrolled in public and private K-12 schools, and the tests are designed so that no
pupil takes an entire NAEP test. The use of sampling is intended to minimize both
the costs of NAEP and test burdens on pupils. It also makes it possible to include a
broad range of items in each test. Since no individual pupil takes an entire NAEP
test, it is impossible for NAEP to report individual pupil scores.
The frameworks for NAEP tests provide a broad outline of the content on which
pupils are to be tested. Frameworks are developed by NAGB through a national
consensus approach involving teachers, curriculum specialists, policymakers,
business representatives, and the general public. In developing the test frameworks,
various state and national standards are taken into consideration, but the frameworks
are not intended to specifically reflect any particular set of standards. In addition,
pupils and school staff fill out background questionnaires.
Although NAEP, as currently structured, cannot provide assessment results for
individual pupils, the levels at which scores could be provided — the nation overall,
states, LEAs, or schools — depend on the size and specificity of the sample group
of pupils tested. NAEP has always provided scores for the nation as a whole and four
multistate regions. Beginning in 1990, NAEP has conducted a limited number of
state-level assessments in 4th and 8th grade mathematics, reading and, beginning in
1996, science. Under state NAEP, the sample of pupils tested in a state is increased
in order to provide reliable estimates of achievement scores for pupils in each
participating state. Beginning in 2002, NAEP mathematics and reading scores have
also been compiled for a small number of large, central city LEAs, as part of a Trial
Urban District Assessment program.
Until enactment of the NCLB, participation in NAEP was voluntary for states;
the additional cost associated with state NAEP administration was borne by the
states; and, after participating in any state NAEP test, states could separately decide
whether to allow the release of NAEP results for their state. As with other main
NAEP tests, state NAEP scores are reported with respect to performance levels —
basic, proficient, and advanced — developed by NAGB. In general, approximately
40 states participated in state-level NAEP assessments in reading and mathematics
between 1990 and 2000, and all “states” except two (South Dakota and Puerto Rico)
participated in state NAEP at least once during this period.
14 Since the long-term trend assessment is not involved with the ESEA Title I-A assessment
requirements, it will not be discussed further.

The NCLB provides that all states wishing to remain eligible for grants under
ESEA Title I-A are required to participate in state NAEP tests in 4th and 8th grade
reading and mathematics, which are administered every two years. The costs of
testing expanded pupil samples in the states are paid by the federal government. An
unstated but implicit purpose of this requirement is to “confirm” trends in pupil
achievement, as measured by state-selected assessments.15 In addition, agents of the
federal government are prohibited from using NAEP assessments to influence state
or LEA instructional programs or assessments.
Consistent with these NCLB provisions, appropriations for NAEP and NAGB
activities increased substantially from $40 million for FY2001 to $111.6 million for
FY2002. Funding for subsequent years has varied within a narrower range:
$94.8 million for each of FY2003 and FY2004, $94.1 million for FY2005, $93.1
million for each of FY2006 and FY2007, and $104.1 million for FY2008.
Implementation Status. In the period since enactment of the NCLB, a
number of steps have been taken toward implementation of the new requirements for
state participation in NAEP. First, the schedule for test administration has been
revised to provide for administration of state NAEP tests in 4th and 8th grade reading
and mathematics every two years, beginning with the 2002-2003 school year (spring
2003). Initial NAEP 4th and 8th grade reading and mathematics results for all states
(but not Puerto Rico; see below) were released in November 2003. Subsequent
rounds of NAEP tests were administered in all states in 2005 and 2007.
In addition, several changes to NAEP policies and practices have been
implemented to complement the expanded role of NAEP under the NCLB.16 For
example, in recognition of the increased emphasis on measurement of performance
gaps among different demographic groups of pupils in the NCLB, more questions are
being added at the upper and lower ends of the difficulty range, so that achievement
gaps among pupil groups can be more reliably measured.
Special issues arise with respect to Puerto Rico, which is treated as a state under
ESEA Title I-A but did not participate in state NAEP tests prior to the enactment of
the NCLB. Questions have been raised about the comparability of tests administered
in different languages, especially in reading. NAEP tests in mathematics were
administered to 4th and 8th grade pupils in Puerto Rico in 2003 and 2005; results from
both test administrations were released in 2007.17
Bush Administration Reauthorization Proposals. The Bush
Administration’s Reauthorization Blueprint proposes that states receiving
Title I-A grants would be required to include NAEP results, along with results on
15 See the National Assessment Governing Board, Using the National Assessment of
Educational Progress to Confirm State Test Results, report prepared by the Ad Hoc
Committee on Confirming Test Results, March 1, 2002, available at [http://www.nagb.org/
pubs/color_document.pdf].
16 See NAGB Adopts Policies to Implement the No Child Left Behind Act of 2001 at
[http://www.nagb.org/], plus [http://nces.ed.gov/nationsreportcard/about/current.asp].
17 See [http://nces.ed.gov/nationsreportcard/puertorico/].

state assessments, on state report cards, to facilitate cross-state comparisons of
achievement levels. Finally, the Administration has requested an increased FY2008
appropriation of $116.6 million for NAEP, in order to support expansion of biennial
state-level NAEP assessments in reading and math to the 12th grade in 2009.
Implementation Issues. Although the NAEP participation requirements of
the NCLB are being implemented fully and on schedule, a number of issues have
been raised regarding these requirements:
! Might the influence of NAEP on state standards and assessments be
increased as a result of the increased attention to, and required
participation in, NAEP tests?
State involvement with NAEP has
changed significantly under the NCLB, although the stakes for states
remain relatively low. State results are being published as an
implicit “confirmation” of test score trends on state assessments, but
these NAEP scores still have no direct impact on state eligibility for
federal assistance. Nevertheless, even a small increase in the stakes
associated with state performance on NAEP tests attracts attention
to the possibility that NAEP frameworks and test items might
influence state standards and assessments. To the extent that the
required participation in NAEP increases attention to state
performance on these tests, there might be a basis for concern that
states would have an incentive to modify their curriculum content
standards to more closely resemble the NAEP test frameworks. To
counteract this potential issue, the NCLB prohibits the use of NAEP
assessments by agents of the federal government to influence state
or LEA instructional programs or assessments. Nevertheless, subtle,
indirect, and/or unintended forms of influence may be virtually
impossible to detect or prohibit.
! Might a conflict arise between the requirement for NAEP
participation by states participating in ESEA Title I-A and the
provision that participation in NAEP tests is voluntary for pupils?
Some have expressed concern that, in a time of likely increased
assessment activity for pupils nationwide, resistance to participation
in NAEP might grow to an extent that it threatens the quality of the
national and state samples of tested pupils, leaving states stuck
between a requirement to participate in NAEP and an inability to
recruit a sufficiently large sample of pupils to participate in order to
produce valid and reliable assessment results. The primary counter
to this concern is that the policies regarding voluntary participation
in NAEP have, in practice, changed only modestly. While states or
LEAs previously could have mandated participation by pupils, they apparently
generally attempted to avoid doing so.
! Can NAEP results be used to “confirm” state test score trends?
And, for the participating LEAs, can results under the Trial Urban
Assessment be used to “confirm” score trends on state tests?
An
unstated, but clearly implicit, purpose of the state NAEP
participation requirement is to “confirm” trends in pupil

achievement, as measured by state-selected assessments by
comparing them with trends in NAEP results. While still
“developmental,” NAEP performance standards are implicitly a
form of “nationally consistent” standards, in contrast to widely
varying state standards. Some have questioned whether it is possible
or appropriate to use results on one assessment to “confirm” results
on another assessment that may have been developed very
differently. State assessments vary widely in terms of several
important characteristics, such as the content and skills that they are
designed to assess, their format, the time of year that tests are
administered, the stakes associated with test performance, and
modes of response. State assessments are explicitly linked to state
content and performance standards, which is not the case with
NAEP. At the same time, there is substantial evidence that NAEP’s
pupil performance standards are significantly more challenging than
those of most states.18 As a result, some state assessments will be
much more similar to NAEP in these important respects than others,
and there will be consequent variation in the significance of
similarities or differences when comparing NAEP score trends with state
assessment score trends for pupils.
Section 3. Adequate Yearly Progress19
Since the 1988 reauthorization of the ESEA (P.L. 100-297), the accountability
provisions of Title I-A have been increasingly focused on achievement and other
outcomes for participating pupils and schools. Since the subsequent ESEA
reauthorization in 1994 (the Improving America’s Schools Act of 1994, P.L. 103-
382), and particularly under the No Child Left Behind Act of 2001 (NCLB, P.L. 107-
110), a key concept embodied in these outcome accountability requirements is that
of “adequate yearly progress (AYP)” for schools, LEAs, and (with much less
emphasis) states overall. The primary purpose of AYP requirements is to serve as
the basis for identifying schools and LEAs where performance is inadequate, so that
these inadequacies may be addressed, first through provision of increased support
and, ultimately, through a variety of “corrective actions.” These actions are to be
taken with respect to schools or LEAs that fail to meet AYP for two consecutive
years or more; no action need be taken with respect to a school or LEA failing to meet AYP
meet AYP standards for only one year at a time. (See discussion below on “Outcome
Accountability under ESEA Title I-A.”)
Through the NCLB, the Title I-A requirements for state-developed standards of
AYP were substantially expanded in scope and specificity. The NCLB provisions
18 See, for example, “Keeping An Eye on State Standards, A Race to the Bottom?,” by Paul
E. Peterson and Frederick M. Hess, Education Next, Summer 2006, p. 28.
19 This section was written by Wayne C. Riddle. For additional information on this topic,
see CRS Report RL32495, Adequate Yearly Progress (AYP): Implementation of the No
Child Left Behind Act; and CRS Report RL33032, Adequate Yearly Progress (AYP): Growth
Models Under the No Child Left Behind Act, both by Wayne C. Riddle.

regarding AYP may be seen as an evolution of, and as a reaction to perceived
weaknesses in, the AYP requirements of the 1994 IASA. The latter were frequently
criticized as being insufficiently specific, detailed, or challenging, especially in their
failure to focus on specific disadvantaged pupil groups or to require continuous
improvement toward any ultimate goal.
Under the NCLB, AYP is defined primarily on the basis of multiple
aggregations of pupil scores on state assessments of academic achievement. State
AYP standards must also include at least one additional academic indicator. In the
case of high schools, this additional indicator must be the graduation rate; for
elementary and middle schools, the attendance rate is often used as the additional
indicator. The additional indicators may not be employed in a way that would reduce
the number of schools or LEAs identified as failing to meet AYP standards. In
addition, AYP calculations must be disaggregated; that is, they must be determined
separately and specifically for not only all pupils but also for several demographic
groups of pupils within each school, LEA, and state. The specified demographic
groups are
! economically disadvantaged pupils,
! LEP pupils,
! pupils with disabilities, and
! pupils in major racial and ethnic groups,
as well as all pupils.
However, there are three major constraints on the consideration of these pupil
groups in AYP calculations. First, pupil groups need not be considered in cases
where their number is so relatively small that achievement results would not be
statistically significant or the identity of individual pupils might be divulged. The
selection of the minimum number (n) of pupils in a group to be considered in AYP
determinations has been left largely to state discretion, and state policies regarding
“n” have varied widely, from as few as 5 pupils to as many as 200 in some cases,
with consequent wide variation in the extent to which pupils in the groups listed
above are actually taken into specific consideration in AYP determinations for
schools and LEAs. Second, it has been left to the states to define the “major racial
and ethnic groups” on the basis of which AYP must be calculated; some states have
identified substantially more such groups than have other states. And third, pupils
who have not attended the same school for a full year need not be considered in
determining AYP at the school level, although they are still to be included in LEA
and state AYP determinations (if they attended schools in the same LEA or state for
the full academic year).
AYP standards under the NCLB must be applied to all public schools, LEAs,
and to states overall, if a state chooses to receive Title I-A grants. However,
corrective actions for failing to meet AYP standards need only be applied to schools
and LEAs participating in Title I-A, and there are no consequences for states failing
to meet AYP standards beyond the provision of technical assistance.
AYP standards developed by the states must incorporate concrete movement
toward meeting an ultimate goal of all pupils reaching a proficient or advanced level
of achievement by the end of the 2013-2014 school year. The steps — that is,

required levels of achievement — toward meeting this goal must increase in “equal
increments” over time. The first increase in the thresholds had to occur after no more
than two years, and remaining increases must occur at least once every three years. Several
states have accommodated these requirements in ways that assume much more rapid
progress in the later years of the period leading up to 2013-2014 than in the earlier
period.
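
To illustrate the “equal increments” requirement and the latitude states have had in
scheduling the required increases, the following short Python sketch computes two
hypothetical trajectories of annual proficiency thresholds. It is illustrative only: the
40% baseline and the specific step years are assumptions, not figures drawn from any
state’s approved accountability plan.

# Hypothetical AYP trajectories from a 40% proficiency baseline in 2001-2002
# to the 100% goal by the end of 2013-2014; baseline and step years are assumed.
BASELINE, GOAL = 40.0, 100.0
YEARS = list(range(2002, 2015))   # spring of 2001-02 through spring of 2013-14

# (a) Equal annual increments over the full period.
step = (GOAL - BASELINE) / (len(YEARS) - 1)
linear = {y: round(BASELINE + step * i, 1) for i, y in enumerate(YEARS)}

# (b) A "stair-step" path meeting the statute's minimum pace (first increase
# within two years, later increases at least every three years) that leaves
# most of the required growth for the final years.
raises = {2004: 50.0, 2007: 60.0, 2010: 70.0, 2012: 80.0, 2013: 90.0, 2014: 100.0}
stairstep, level = {}, BASELINE
for y in YEARS:
    level = raises.get(y, level)
    stairstep[y] = level

for y in YEARS:
    print(y, linear[y], stairstep[y])

Both paths reach 100% by 2013-2014, but the stair-step path defers half of the required
60-point gain to its last three increases (2012 through 2014), the back-loaded pattern
that several states have adopted.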
The primary basic structure for AYP under the NCLB is now specified in the
authorizing statute as a “group status model.”20 A “uniform bar” approach is
employed: states are to set a threshold percentage of pupils at proficient or advanced
levels of performance each year that is applicable to all pupil subgroups of sufficient
size to be considered in AYP determinations. The threshold levels of achievement
are to be set separately for reading and math, and may be set separately for each level
of K-12 education (elementary, middle, and high schools).
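The interaction of the uniform bar, the minimum group size (“n”), and the 95% assessment participation requirement can be illustrated with a simplified calculation. The Python sketch below is purely illustrative; the subgroup names, the minimum group size of 30, and the 75% annual threshold are hypothetical values chosen for the example, not figures drawn from any state’s approved accountability plan.

    # Illustrative sketch of a "uniform bar" (group status model) AYP check.
    # All thresholds and group sizes below are hypothetical examples.
    MIN_GROUP_SIZE = 30          # hypothetical state-selected "n"
    ANNUAL_BAR = 0.75            # hypothetical statewide proficiency threshold
    PARTICIPATION_FLOOR = 0.95   # statutory 95% assessment participation requirement

    def group_meets_ayp(enrolled, tested, proficient):
        """Return True if a single pupil group clears both the participation
        floor and the uniform proficiency bar."""
        if enrolled < MIN_GROUP_SIZE:
            return True  # group too small to be considered separately
        if tested / enrolled < PARTICIPATION_FLOOR:
            return False
        return proficient / tested >= ANNUAL_BAR

    def school_meets_ayp(groups):
        """A school meets AYP only if every reportable group, including the
        'all pupils' group, clears the bar."""
        return all(group_meets_ayp(g["enrolled"], g["tested"], g["proficient"])
                   for g in groups)

    # Hypothetical school data for one subject and three reportable groups.
    example_groups = [
        {"name": "all pupils", "enrolled": 400, "tested": 392, "proficient": 310},
        {"name": "economically disadvantaged", "enrolled": 120, "tested": 114, "proficient": 82},
        {"name": "LEP", "enrolled": 25, "tested": 20, "proficient": 9},  # below "n"; excluded
    ]
    print(school_meets_ayp(example_groups))  # False: one reportable group misses the bar

In practice, such a check would be repeated separately for reading and math and, where a state so chooses, with different bars for elementary, middle, and high schools.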
In determining whether scores for a group of pupils are at the required level, the
averaging of scores over two to three years is allowed. In addition, the NCLB statute
includes an alternative safe harbor provision, under which a school that does not
meet the standard AYP requirements may still be deemed to meet AYP if there is a
10% reduction, compared to the previous year, in the percentage of pupils in each of
the pupil groups failing to reach the proficient level, and those groups also make
progress on at least one other academic indicator included in the state’s AYP
standards. This alternative provision adds “successive group improvement” as a
secondary type of AYP model under the NCLB.
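A rough numerical sketch of the safe harbor test follows. It assumes, as described above, that the comparison is made on the percentage of pupils in a group who have not reached the proficient level; the figures are invented for illustration only.

    # Illustrative safe harbor check for a single pupil group that missed the
    # uniform bar. Figures are hypothetical.
    def meets_safe_harbor(pct_below_prior, pct_below_current, other_indicator_improved):
        """True if the share of pupils below proficient fell by at least 10%
        of the prior year's share and the group also made progress on at
        least one other academic indicator in the state's AYP standards."""
        required_drop = 0.10 * pct_below_prior
        return (pct_below_prior - pct_below_current) >= required_drop and other_indicator_improved

    # Example: 60% below proficient last year, 53% this year, attendance improved.
    print(meets_safe_harbor(0.60, 0.53, True))  # True: a 7-point drop exceeds the required 6 points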
A third basic type of AYP model, not explicitly mentioned in the NCLB statute,
is the individual/cohort growth model. The key characteristic of this model is a focus
on the rate of change over time in the level of achievement among cohorts of the
same pupils. Growth models are longitudinal, based upon the tracking of the same
pupils as they progress through their K-12 education careers. Although the progress
of pupils is tracked individually, results are typically aggregated when used for
accountability purposes. In general, growth models would give credit for meeting
steps along the way to proficiency in ways that a status model typically does not.
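The statute does not prescribe how a growth model must operate, and the approved pilot models differ from one another. Purely as a hypothetical sketch, a simple “on track to proficiency” projection for an individual pupil might resemble the following; the scale scores, cut score, and target grade are invented.

    # Hypothetical individual growth-model projection (not a method prescribed
    # by the NCLB or by ED): a pupil is credited if a straight-line projection
    # of recent scale-score growth reaches the proficient cut score by a
    # target grade.
    def on_track_to_proficiency(scores_by_grade, proficient_cut, target_grade):
        """scores_by_grade maps grade level to scale score for one pupil."""
        grades = sorted(scores_by_grade)
        if len(grades) < 2:
            return scores_by_grade[grades[-1]] >= proficient_cut
        first, last = grades[0], grades[-1]
        annual_growth = (scores_by_grade[last] - scores_by_grade[first]) / (last - first)
        projected = scores_by_grade[last] + annual_growth * (target_grade - last)
        return projected >= proficient_cut

    # Example: grade 3-5 scores, with a proficient cut of 650 to be reached by grade 8.
    print(on_track_to_proficiency({3: 520, 4: 555, 5: 590}, 650, 8))  # True

Results from such pupil-level projections would typically be aggregated to the school or LEA level before being used for accountability purposes.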
In November 2005, the Secretary of Education announced a growth model pilot
program under which up to 10 states would be allowed to use growth models to make
AYP determinations.21 Thus far, nine states (Alaska, Arizona, Arkansas, Delaware,
Florida, Iowa, North Carolina, Ohio, and Tennessee) have had their proposals
approved and participate in this pilot program.22 In December 2007, the cap on the
20 For a discussion of the models of AYP, see CRS Report RL33032, Adequate Yearly
Progress (AYP): Growth Models Under the No Child Left Behind Act, by Wayne C. Riddle.
21 U.S. Department of Education, “Secretary Spellings Announces Growth Model Pilot,
Addresses Chief State School Officers’ Annual Policy Forum in Richmond,” press release,
November 18, 2005, at [http://www.ed.gov/news/pressreleases/2005/11/11182005.html].
22 For details on the growth model pilot, see CRS Report RL33032, Adequate Yearly
Progress (AYP): Growth Models Under the No Child Left Behind Act, by Wayne C. Riddle.

number of states that could participate in the growth model pilot was lifted by the
Secretary of Education.23
Finally, the NCLB AYP provisions include an assessment participation rate
requirement. In order for a school to meet AYP standards, at least 95% of all pupils,
as well as at least 95% of each of the demographic groups of pupils considered for
AYP determinations for the school or LEA, must participate in the assessments that
serve as the primary basis for AYP determinations.24
Implementation Status. States began determining AYP for schools, LEAs,
and the states overall based on the NCLB provisions beginning with the 2002-2003
school year. The deadline for states to submit to ED their AYP standards based on
the NCLB provisions was January 31, 2003, and all states met this deadline. On June
10, 2003, ED announced that accountability plans had been approved for all states.
However, many of the approved plans required states to take additional actions
following submission of their plan.25
In the period preceding ED’s review of state accountability plans under the
NCLB, the Department published regulations in the Federal Register on December
2, 2002, that essentially mirrored the relevant provisions in the authorizing statute.
Aspects of state AYP plans that apparently received special attention in ED’s reviews
included (1) the pace at which proficiency levels are expected to improve;
(2) whether schools or LEAs must fail to meet AYP with respect to the same pupil
group(s), grade level(s) and/or subject areas to be identified as needing improvement,
or whether two consecutive years of failure to meet AYP with respect to any of these
categories should lead to identification; (3) the length of time over which pupils
should be identified as being LEP; (4) the minimum size of pupil groups in order to
be considered in AYP determinations; (5) whether to allow schools credit for raising
pupil scores from below basic to basic in making AYP determinations; and (6)
whether to allow use of statistical techniques such as “confidence intervals” (i.e.,
whether scores are below the required level to a statistically significant extent) in
AYP determinations.
On several occasions, beginning in late 2003, ED officials have published
additional regulations and other policy guidance on selected aspects of AYP
determination and related assessment issues, in an effort to provide additional
clarification and, in many cases, increased flexibility. This guidance has addressed
several aspects of AYP implementation that have created particular difficulties for
many schools and LEAs: assessment participation rates, calculation of AYP with
respect to LEP pupils and pupils with disabilities, and options for determining AYP
in targeted assistance Title I-A programs.26
23 See [http://www.ed.gov/policy/elsec/guid/secletter/071207.html].
24 These participation rates may be averaged over a two- or three-year period.
25 The plans have been posted by ED at [http://www.ed.gov/admins/lead/account/stateplans03/index.html].
26 For details on these policy changes, see CRS Report RL32495, Adequate Yearly
Progress (AYP): Implementation of the No Child Left Behind Act, by Wayne C. Riddle.

Since the initial submission and approval of state accountability plans for AYP and
related policies in 2003, many states have proposed revisions to their plans.27 The
major aspects of
state accountability plans for which changes have been proposed and approved
include (a) changes to take advantage of revised federal regulations and policy
guidance regarding assessment of pupils with the most significant cognitive
disabilities, LEP pupils, and test participation rates; (b) limiting identification for
improvement to schools that fail to meet AYP in the same subject area for two or
more consecutive years, and limiting identification of LEAs for improvement to
those that failed to meet AYP in the same subject area and across all three grade
spans for two or more consecutive years; (c) using alternative methods to determine
AYP for schools with very low enrollment; (d) initiating or expanding use of
confidence intervals in AYP determinations; (e) changing (usually increasing)
minimum group size; and (f) changing graduation rate targets for high schools.
Accountability plan changes that have frequently been requested but not approved by
ED include (a) identification of schools for improvement only if they failed to meet
AYP with respect to the same pupil group and subject area for two or more
consecutive years, and (b) retroactive application of new forms of flexibility to
previous years.28
Bush Administration Reauthorization Proposals. The Bush
Administration’s Reauthorization Blueprint,29 released in January 2007, contains
three proposals regarding the ESEA Title I-A AYP provisions. First, all participating
states would be allowed to use growth models to make AYP determinations, subject
to conditions comparable to those applicable to the current pilot program. In
addition, by the end of the 2011-2012 school year, graduation rates used as the
additional academic indicator in AYP determinations for high schools would have
to be disaggregated according to the same demographic groups as achievement
levels. Further, states would be required to use a standard measure in calculating
graduation rates, known as the averaged freshman graduation rate (AFGR). Finally,
the Administration proposes that science test results be included in AYP
determinations beginning in 2008-2009, although with a delayed goal for proficiency
(2019-2020), in contrast to the 2013-2014 goal for reading and math.
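The averaged freshman graduation rate is commonly described as the number of regular diplomas awarded in a given year divided by an estimate of the entering freshman class four years earlier, taken as the average of the grade 8, grade 9, and grade 10 enrollments in the relevant prior years. The sketch below uses invented enrollment and diploma counts to illustrate that arithmetic; it is not drawn from ED data.

    # Illustrative averaged freshman graduation rate (AFGR) calculation.
    # All counts below are invented for the example.
    def afgr(diplomas, grade8_4yrs_prior, grade9_3yrs_prior, grade10_2yrs_prior):
        """Diplomas awarded divided by the estimated size of the entering
        freshman class (average of the three prior-year enrollments)."""
        estimated_freshmen = (grade8_4yrs_prior + grade9_3yrs_prior + grade10_2yrs_prior) / 3
        return diplomas / estimated_freshmen

    print(round(afgr(diplomas=850, grade8_4yrs_prior=1_000,
                     grade9_3yrs_prior=1_060, grade10_2yrs_prior=940), 2))  # 0.85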
27 For information on accountability plan revisions proposed by each state, see
[http://www.ed.gov/admins/lead/account/letters/index.html].
28 See Center on Education Policy, “Rule Changes Could Help More Schools Meet Test
Score Targets for the No Child Left Behind Act,” October 22, 2004, available at
[http://www.cep-dc.org/]; “Changes in Accountability Plans Dilute Standards, Critics Say,”
Title I Monitor, November 2004; Council of Chief State School Officers, “Revisiting
Statewide Educational Accountability Under NCLB,” September 2004, available at
[http://www.ccsso.org]; and “Requests Win More Leeway Under NCLB,” Education Week,
July 13, 2005, p. 1.
29 The document is available from the Department of Education at
[http://www.ed.gov/policy/elsec/leg/nclb/buildingonresults.pdf].

Implementation Issues. A number of issues have arisen during the
implementation of the NCLB provisions regarding AYP. They include the following.
! Have ED’s reviews of state AYP policies been appropriately
rigorous, transparent, flexible and consistent? As ED staff and
designated peer reviewers have examined initial and revised state
AYP policies, several observers have expressed concerns about: a
lack of transparency in the review procedures and criteria;
inconsistencies (especially over time) in the types of changes that
ED officials have approved (for example, approving a number of
requests to increase minimum group sizes during some periods of
time, but approving few or no such changes during other periods);
whether the net effect of the changes is to make the accountability
requirements more reasonable or to undesirably weaken them;
whether the changes may make an already complicated
accountability system even more complex; and timing — whether
decisions on proposed changes are being made in a timely manner
by ED.
! Is the ultimate goal embodied in the NCLB’s AYP provisions — all
pupils at a proficient or higher level of achievement within 12 years
of enactment — both desirable and achievable without a substantial
weakening by states of pupil achievement standards?
The required
incorporation of this ultimate goal is one of the most significant
differences between the AYP provisions of the NCLB and those
under the previous IASA. Without an ultimate goal of having all
pupils reach the proficient level of achievement by a specific date,
states might simply establish relative goals that provide no real
movement toward, or incentives for, significant improvement,
especially among disadvantaged pupil groups. Proponents of such
a demanding ultimate goal argue that schools and LEAs frequently
meet the goals established for them, even rather challenging goals,
if the goals are very clearly identified, defined, and established, and
are attainable. A demanding goal might maximize efforts toward
improvement by state public school systems, even if the goal is not
met. Nevertheless, a goal of having all pupils at a proficient or
higher level of achievement, within any specified period of time,
may be criticized as being “unrealistic,” if one assumes that
“proficiency” has been established at a challenging level. It is likely
that many states, schools and LEAs will not meet the NCLB’s
ultimate AYP goal, unless state standards of proficient performance
are significantly lowered and/or states aggressively pursue the use of
such statistical techniques as setting high minimum group sizes and
confidence intervals to substantially reduce the range of pupil groups
actually considered in AYP determinations and effectively lower
required achievement level thresholds.
! Are such statistical techniques as confidence intervals and data-
averaging being appropriately applied in state AYP policies? Many
states have used one or both of these statistical techniques to attempt

to improve the validity and reliability of AYP determinations, with
an effect in most cases of reducing the number of schools or LEAs
identified as failing to meet AYP standards. The averaging of test
score results for various pupil groups over two- or three-year periods
is explicitly authorized under the NCLB; the use of confidence
intervals was not explicitly envisioned in the drafting of the NCLB’s
AYP provisions, but has been approved by ED and widely adopted
by states. The use of confidence intervals to determine whether
group test scores fall below required thresholds to a statistically
significant degree
addresses the fact that test scores for any group of
pupils will vary from one test administration to another, and these
variations may be especially large for a relatively small group of
pupils. At the same time, the use of confidence intervals reduces the
likelihood that schools or LEAs will be identified as failing to make
AYP. For small pupil groups and high levels of desired accuracy,
the size of confidence intervals may be rather large. Ultimately, the
use of this technique may mean that the average achievement levels
of pupil groups in many schools will be below 100% proficiency by
2013-2014, yet the schools would still meet AYP standards because
the groups’ scores are within relevant confidence intervals.
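As a hypothetical illustration of how such a technique might work, the sketch below treats a group’s proficiency rate as a sample proportion and asks whether the required threshold lies above the upper bound of a normal-approximation interval around the observed rate. Actual methods vary and are defined in each state’s approved accountability plan; the group sizes and thresholds below are invented.

    # Illustrative confidence-interval test for one pupil group (normal
    # approximation to a proportion); a generic statistical sketch, not any
    # particular state's approved method.
    import math

    def fails_with_confidence(n_tested, n_proficient, required_rate, z=1.96):
        """Return True only if the group's proficiency rate falls below the
        required threshold by a statistically significant margin (the
        threshold lies above the interval's upper bound)."""
        p = n_proficient / n_tested
        half_width = z * math.sqrt(p * (1 - p) / n_tested)
        return (p + half_width) < required_rate

    # Small group: 30 tested, 18 proficient (60%), required rate 70%.
    print(fails_with_confidence(30, 18, 0.70))    # False: the interval reaches past 70%
    # Larger group: 300 tested, 180 proficient (60%), same required rate.
    print(fails_with_confidence(300, 180, 0.70))  # True: the shortfall is significant

As the example suggests, the same observed proficiency rate may count as a failure for a large group but not for a small one, which is why small schools and small subgroups benefit most from this technique.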
! Are some states setting minimum group size levels so high that a
large proportion of some disadvantaged pupil groups is not being
considered in school-level AYP determinations?
Another important
technical factor in state AYP standards is the establishment of the
minimum size (n) for pupil groups to be considered in AYP
calculations. The NCLB recognizes that in the disaggregation of
pupil data for schools and LEAs, there might be pupil groups that are
so small that average test scores would not be statistically reliable,
or the dissemination of average scores for the group might risk
violation of pupils’ privacy rights. The selection of this minimum
number has been left to state discretion, and the range of selected
values for “n” is rather large. The higher the minimum group size,
the less likely that many pupil groups will be separately considered
in AYP determinations. This gives schools and LEAs fewer
thresholds to meet, and reduces the likelihood that they will be found
to have failed to meet AYP standards. At the same time, relatively
high levels for “n” weaken the NCLB’s specific focus on a variety
of pupil groups, many of them disadvantaged.
! Does the requirement for disaggregation of pupil groups in AYP
determinations make it too difficult for schools or LEAs with diverse
pupil populations to meet AYP standards?
All other relevant factors
(especially minimum group size) being equal, the more diverse its
pupil population, the more thresholds a school or LEA must meet in
order to make AYP. While this was an intended result of legislation
designed to focus on specific disadvantaged pupil groups, the impact
of making it more difficult for schools and LEAs serving diverse
populations to meet AYP standards may also be seen as an
unintended consequence of the NCLB. A number of studies have

concluded that, when comparing public schools with comparable
aggregate pupil achievement levels or aggregate percentages of
pupils from low-income families, schools with larger numbers of
different NCLB-relevant demographic groups are substantially less
likely to meet AYP standards. However, without specific
requirements for achievement gains by each of the major pupil
groups, it is possible that insufficient attention would be paid to the
performance of the disadvantaged pupil groups among whom
improvements are most needed, and for whose benefit the Title I-A
program was established, since it is possible for many schools and
LEAs to demonstrate improvements in achievement by their pupils
overall while the achievement of their disadvantaged pupils does not
improve significantly.
! Are “too many” schools and LEAs failing to meet AYP standards?
As is discussed in the following section of this report, relatively
large percentages of public schools and LEAs overall have failed to
meet state AYP standards. Future increases in performance
thresholds, as the ultimate goal of all pupils at the proficient or
higher level of achievement is approached, may result in higher
percentages of schools failing to make AYP. ED officials have
emphasized the importance of taking action to identify and improve
underperforming schools, no matter how numerous. They have also
emphasized the possibilities for flexibility in taking corrective
actions with respect to schools that fail to meet AYP, depending on
the extent to which they fail to meet those standards. Further, some
analysts argue that a set of AYP standards that a relatively high
percentage of public schools fails to meet may accurately reflect
pervasive weaknesses in public school systems, especially with
respect to disadvantaged pupil groups. Others have consistently
expressed concern about the accuracy, efficacy, and complexity of
an accountability system under which such a relatively high
percentage of schools is identified as failing to make adequate
progress, with consequent strain on financial and other resources
necessary to provide technical assistance, public school choice and
supplemental services options, as well as other corrective actions.
! Are the NCLB’s AYP provisions being undermined by wide
variations in state standards for pupil achievement? The percentage
of public schools and LEAs failing to meet AYP standards is not
only relatively large in the aggregate, but varies widely among the
states. It is likely that state variations in the percentage of schools
failing to meet AYP standards are based not only on underlying
differences in achievement levels, as well as a variety of technical
factors in state AYP provisions, but also on differences in the degree
of rigor or challenge in state pupil performance standards and
assessments. While the basic structure of AYP definitions is now
substantially more consistent across states than before enactment of
the NCLB, significant variations remain with respect to technical
factors such as minimum group size and confidence intervals, and

there appear to be substantial differences in the degree of challenge
embodied in state standards and assessments. Such variation
reflects, and may be the inevitable result of, federalism in education
policy-making. Nevertheless, as the NCLB is considered for
reauthorization by the 110th Congress, there may be interest in
attempting to make pupil performance expectations more consistent
across the nation.
! Is the 95% assessment participation requirement too high? In
several cases, schools or LEAs fail to meet AYP solely because
participation rates in assessments fall marginally below the required
level of 95% of all pupils, as well as 95% of pupils in each of the
relevant demographic groups meeting the minimum size threshold.
While few argue against having any participation rate requirement,
it may be questioned whether it needs to be as high as 95%. The
average percentage of enrolled pupils in attendance at public K-12
schools in recent years (93.5%) is below this level, and such
attendance rates are generally assumed to be substantially lower than
this national average in schools with high proportions of
disadvantaged pupils. Even though schools are explicitly allowed
to administer assessments on make-up days following the primary
date of test administration, and it is probable that more schools and
LEAs will meet this requirement as they become more fully aware
of its significance, it is likely to continue to be very difficult for
some schools and LEAs to meet a 95% test participation
requirement. According to the recent ED report, “National
Assessment of Title I: Interim Report,” 6% of the schools that failed
to meet AYP requirements for the 2003-2004 school year did so on
the basis of participation rates in addition to other factors.
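Because participation rates may be averaged over a two- or three-year period (see footnote 24), a school slightly under 95% in a single year might still meet the requirement on an averaged basis. The Python sketch below is a simplified illustration with invented counts; it pools the tested and enrolled counts across years rather than modeling any particular state’s averaging rule.

    # Illustrative 95% participation check with optional multi-year averaging.
    PARTICIPATION_FLOOR = 0.95

    def meets_participation(yearly_counts):
        """yearly_counts: list of (tested, enrolled) tuples for up to three years."""
        tested = sum(t for t, _ in yearly_counts)
        enrolled = sum(e for _, e in yearly_counts)
        return tested / enrolled >= PARTICIPATION_FLOOR

    print(meets_participation([(188, 200)]))              # False: 94.0% in the current year
    print(meets_participation([(188, 200), (196, 200)]))  # True: 96.0% over two years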
Section 4. Outcome Accountability
Under ESEA Title I-A30
The No Child Left Behind Act (NCLB) strengthened the accountability
provisions of ESEA Title I-A over what was required under the Improving America’s
Schools Act (IASA) by requiring states to demonstrate in their state plans that they
have a single, statewide accountability system, applicable to all elementary and
secondary schools and LEAs in the state. Each state’s accountability system must be
based on the academic assessments and other academic indicators it uses to measure
academic progress. LEAs are required to annually review the status of each public
school in making adequate yearly progress (AYP) toward state standards of academic
achievement; and SEAs are required to annually review the status of each LEA in
making AYP. (Accountability provisions for charter schools must be implemented
to be consistent with state charter school laws.) The ESEA establishes a system of
rewards and sanctions designed to hold Title I-A schools and LEAs accountable for
30 This section was written by David P. Smole with contributions from Wayne C. Riddle.

their performance. Each year, states and LEAs are required to prepare and
disseminate report cards containing academic achievement and other data. States are
also required to prepare annual reports for submission to the Secretary. The
Secretary, in turn, is required to compile national and state-level data for presentation
in annual reports to Congress. While AYP determinations must be made with respect
to every public school and LEA in a state that receives Title I-A funds, states vary in
the extent to which they apply the sanctions to non-Title I-A schools or LEAs.
Rewards, Support, and Recognition
Each state participating in ESEA Title I-A is required to establish an Academic
Achievement Awards Program for purposes of making academic achievement awards
to schools that have either significantly closed academic achievement gaps between
student subgroups or exceeded their AYP requirements for two or more consecutive
years. States may also give awards to LEAs that have exceeded their AYP
requirements for two or more consecutive years. Under Academic Achievement
Award Programs, states may recognize and provide financial awards to teachers or
principals in schools that have significantly closed the academic achievement gap or
that have made AYP for two consecutive years. States may fund Academic
Achievement Awards for schools and LEAs by reserving up to 5% of any Title I-A
funding that is in excess of the state’s previous year’s allocation.31 States may fund
teacher and principal awards by reserving such sums as necessary from the amount
received under ESEA Title II-A-1 — Teacher and Principal Training and Recruiting
Fund, Grants to States.
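A simple numerical illustration of the award funding rule may be helpful: the reservation is capped at 5% of the year-over-year increase in the state’s Title I-A allocation, not 5% of the full allocation. The dollar amounts below are hypothetical.

    # Illustrative cap on funds a state may reserve for Academic Achievement
    # Awards: up to 5% of the increase over the prior year's Title I-A allocation.
    def max_award_reservation(current_allocation, prior_allocation, rate=0.05):
        increase = max(0, current_allocation - prior_allocation)
        return rate * increase

    # Hypothetical state whose allocation rises from $200 million to $210 million.
    print(max_award_reservation(210_000_000, 200_000_000))  # 500000.0, i.e., up to $500,000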
As part of its role in overseeing the implementation of the ESEA, ED’s Office
of Elementary and Secondary Education, School Achievement and School
Accountability programs office (OESE/SASA) monitors various aspects of the
implementation of ESEA achievement and accountability requirements, including
procedures for reserving funds for school improvement and, if applicable, the State
Academic Achievement Awards program.32 California is an example of a state that
has implemented an extensive Academic Achievement Awards program.33
School and LEA Improvement,
Corrective Action, and Restructuring

When Title I-A schools do not make AYP for two or more consecutive years,
they become subject to a range of increasingly severe sanctions, which are coupled
31 Guidance on procedures for reserving funds for State Academic Achievement Awards
Programs is available in U.S. Department of Education, Office of Elementary and Secondary
Education, Guidance: State Educational Agency Procedures for Adjusting Basic,
Concentration, Targeted, and Education Finance Incentive Grant Allocations Determined
by the U.S. Department of Education, May 23, 2003, pp. 32-34.
32 U.S. Department of Education, Office of Elementary and Secondary Education, “Title I
Program Monitoring,” at [http://www.ed.gov/admins/lead/account/monitoring/index.html].
33 California Department of Education, “Academic Achievement Awards,” at
[http://www.cde.ca.gov/ta/sr/aa/index.asp].

with technical assistance provided by the LEA. LEAs become subject to sanctions
— overseen by the SEA — in instances where they do not make AYP for two or
more consecutive years. Table 1 depicts the stages in which sanctions are applied to
schools and LEAs under ESEA Title I-A. Requirements for schools and LEAs are
described below for each accountability stage.
Table 1. ESEA Title I-A Accountability Stages
for Schools and LEAs
Cumulative Years           Accountability Stage
Not Making AYP             School                                  LEA
1                          N/A                                     N/A
2                          School improvement                      LEA improvement (a)
3                          2nd year of school improvement (b)      LEA improvement (a)
4                          Corrective action (b)                   Corrective action (b)
5                          Plan for restructuring (b)              Corrective action (b)
6                          Implement restructuring (b)             Corrective action (b)
Source: ESEA, § 1116.
N/A — Not Applicable.
a. SEAs may implement corrective action for an LEA identified for LEA improvement.
b. Accountability requirements associated with the 2nd year of school improvement, corrective action,
and restructuring may be delayed for up to one year for a school or LEA if it makes AYP for one
year, or if its failure to make AYP is due to a natural disaster or a significant decline in financial
resources.
Schools. After not making AYP for two consecutive years, a Title I-A school
is identified for school improvement. Being designated for school improvement
carries with it the requirement to develop or revise a school plan designed to result
in the improvement of the school. LEAs are required to provide schools within their
jurisdictions with technical assistance in the design and implementation of school
improvement plans. Schools identified for improvement must use at least 10% of
their Title I-A funding for professional development. All students attending Title I-A
schools identified for school improvement also must be offered public school choice
— the opportunity to transfer to another public school within the same LEA.34
Under public school choice, students must be afforded the opportunity to choose
from among two or more schools, located within the same LEA, that have not been
identified for school improvement, corrective action, or restructuring, and that also
have not been identified as persistently dangerous schools (described in Section 11).
LEAs are required to provide students who transfer to different schools with
transportation and must give priority in choosing schools to the lowest-achieving
children from low-income families. LEAs may not use lack of capacity as a reason
for denying students the opportunity to transfer to a school of choice.35 In instances
where there are no eligible schools in the student’s LEA, LEAs are encouraged to
34 For further information on public school choice, see CRS Report RL33506, School Choice
Under the ESEA: Programs and Requirements, by David P. Smole.
35 34 CFR 200.44(d).

enter into cooperative agreements with surrounding LEAs to enable students to
transfer to an eligible public school.
If, after being identified for school improvement, a school does not make AYP
for another year, it must be identified for a second year of school improvement by the
end of that school year. All students attending a school identified for a second year
of school improvement must continue to be offered the option of attending another
eligible public school within the same LEA. In addition, students from low-income
families who continue to attend the school must be offered the opportunity to receive
supplemental educational services (SES).36 Supplemental educational services are
educational activities, such as tutoring, that are provided outside of normal school
hours and which are designed to augment or enhance the educational services
provided during regular periods of instruction. Supplemental educational services
may be provided by a non-profit entity, a for-profit entity, or the LEA, unless such
services are determined by the state education agency (SEA) to be unavailable in the
local area.37 The SEA is required to maintain a list of approved SES providers
(including those offering services through distance learning) from which parents can
select. LEAs may be required to expend up to an amount equal to 20% of their Title
I-A grants on transportation for public school choice and supplemental educational
services combined.
If a school fails to make AYP for a total of two years after being identified for
school improvement, it must be identified for corrective action by the end of the
school year. For schools identified for corrective action, LEAs must continue to
provide technical assistance, offer public school choice and supplemental educational
services, and must implement one of the following corrective actions: replacing
school staff relevant to the school not making AYP; implementing a new curriculum;
limiting management authority at the school level; appointing an expert advisor to
assist in implementing the school improvement plan; extending the school year or the
school day; or restructuring the school’s internal organization. If a school does not
make AYP for a third year after being identified for school improvement, by the end
of the school year the LEA must begin to plan for restructuring, while continuing to
implement the requirements of corrective action. Restructuring of the school must
involve implementation of some form of alternative governance structure, such as
reopening the school as a charter school, replacing all or most of the school staff,
contracting with an education management organization to operate the school, or
turning the school over to the SEA. If an additional year passes without the school
making AYP, the LEA must implement restructuring of the school.
Any of the sanctions described above may be delayed for up to one year if the
school makes AYP for a single year, or if the school’s failure to make AYP is due to
unforeseen circumstances, such as a natural disaster or a significant decline in
36 For further information on supplemental educational services, see CRS Report RL31329,
Supplemental Educational Services for Children from Low-Income Families Under ESEA
Title I-A, by David P. Smole.
37 Schools identified for improvement, corrective action, or restructuring, and LEAs
identified for improvement or corrective action, lose their eligibility to serve as
supplemental educational services providers.

financial resources of the LEA or school. Schools that make AYP for two
consecutive years may no longer be identified for school improvement, nor subject
to the sanctions associated with school improvement, corrective action, or
restructuring.
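The escalating school-level stages described above, and summarized in Table 1, can be restated as a simple lookup. The sketch below is only a schematic restatement of the statutory sequence; it does not model the optional one-year delay or the two-year exit rule just described.

    # Schematic restatement of the Title I-A school accountability ladder (see
    # Table 1), mapping cumulative years of not making AYP to the applicable stage.
    SCHOOL_STAGES = {
        1: "no sanction",
        2: "school improvement (public school choice offered)",
        3: "2nd year of school improvement (choice plus supplemental services)",
        4: "corrective action",
        5: "plan for restructuring",
        6: "implement restructuring",
    }

    def school_stage(years_not_making_ayp):
        if years_not_making_ayp <= 0:
            return "no sanction"
        return SCHOOL_STAGES.get(years_not_making_ayp, "implement restructuring")

    print(school_stage(4))  # corrective action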
LEAs. In instances where a Title I-A LEA fails to make AYP for two
consecutive years, the SEA must identify it for LEA improvement and require the
LEA to develop and implement a new or revised LEA education plan, with technical
assistance provided by the state. If two more years pass without the LEA making
AYP, the SEA must identify it for corrective action by the end of the school year.
Corrective action must consist of at least one of the following activities: deferring
programmatic funds or reducing administrative funds; implementing a new
curriculum; replacing staff relevant to the LEA not making AYP; removing schools
from the jurisdiction of the LEA; placing the LEA under receivership or trusteeship;
abolishing or restructuring the LEA; or (in conjunction with one of the
aforementioned activities), authorizing students attending a school in that LEA to
transfer to an eligible public school in another LEA, with transportation costs
provided by the sending LEA. SEAs also may implement the requirements of
corrective action for an LEA that has been identified for improvement.
Sanctions for LEAs may be delayed for up to one year if the LEA makes AYP
for a single year, or if failure to make AYP is due to unforeseen circumstances, such
as a natural disaster or a significant decline in financial resources of the LEA. Once
an LEA makes AYP for two consecutive years, it is no longer identified for
improvement nor subject to corrective action.
Assistance for Local School Improvement. Currently, under ESEA §
1003(a), states are required to reserve 4% of their total Title I-A allocations for
school improvement grants; however, grants to individual LEAs are not supposed to
be reduced compared to the previous year as a result of reserving these funds in what
amounts to a “hold harmless” provision. In addition, ESEA § 1002(i) authorizes the
appropriation of such sums as may be necessary for grants to states under § 1003(g)
for LEA school improvement assistance subgrants. States are eligible to apply for
grants, which are allocated in proportion to each state’s share of funds provided
under ESEA Title I, Parts A, C, and D. Subgrants to LEAs must be between $50,000
and $500,000 for each school, and must be renewable for up to two additional years
if schools meet the goals of their school improvement plans. Subgrants must be used
by LEAs to support school improvement and recognition as required under ESEA §§
1116 and 1117. LEAs with the lowest-achieving schools must be given priority in
the awarding of subgrants.
In general, SEAs are required to allocate 95% of any funds reserved under §
1003(a) and received under § 1003(g) directly to LEAs for schools identified for
improvement, corrective action, or restructuring; however, with LEA approval, SEAs
may directly provide or arrange for the provision of school improvement activities
through other entities. The remaining 5% of funds may be used for state-level school
improvement activities and administration. For FY2007, $125 million was
appropriated for school improvement activities authorized under § 1003(g); for
FY2008, $491.3 million has been appropriated for school improvement grants.
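The arithmetic behind these reservations and subgrants can be sketched as follows; the state allocation figure is invented, and the sketch does not attempt to model the LEA-level hold harmless calculation.

    # Illustrative arithmetic for ESEA Section 1003 school improvement funds.
    # The state Title I-A allocation below is a hypothetical figure.
    STATE_TITLE_IA_ALLOCATION = 400_000_000

    reserved_for_improvement = 0.04 * STATE_TITLE_IA_ALLOCATION  # 4% state reservation
    to_leas = 0.95 * reserved_for_improvement                    # at least 95% passed to LEAs
    state_level_activities = reserved_for_improvement - to_leas  # up to 5% retained by the SEA

    def subgrant_is_allowable(amount_per_school):
        """Section 1003(g) subgrants must fall between $50,000 and $500,000 per school."""
        return 50_000 <= amount_per_school <= 500_000

    print(reserved_for_improvement, to_leas, state_level_activities)  # 16000000.0 15200000.0 800000.0
    print(subgrant_is_allowable(75_000))   # True
    print(subgrant_is_allowable(600_000))  # False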

Reports
States and LEAs are required to prepare annual report cards containing
academic achievement information for the state, LEAs, and schools, and must make
them publicly available. LEAs are also required to provide parents of students
attending Title I-A schools with information on the professional qualifications of the
student’s teachers. Annual report cards must, at a minimum, contain the following
information:
! information on student achievement at each proficiency level on
state academic assessments, in the aggregate and disaggregated
according to each student subgroup;
! a comparison between actual student achievement levels and the
state’s AYP goal, for each student subgroup;
! the percentage of students not tested, in the aggregate and
disaggregated by student subgroup;
! trends in student achievement in each subject area for each grade
level assessed, for the most recent two-year period;
! aggregate information on any other indicators used in determining
AYP;
! secondary school graduation rates;
! AYP data for LEAs, including the number and names of schools
identified for school improvement; and
! information on the professional qualifications of teachers.
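For readers considering the data systems needed to produce these documents, the required elements listed above might be organized along the following lines. The field names are illustrative only and are not taken from any ED data specification.

    # Illustrative grouping of the minimum annual report card elements listed
    # above; field names are hypothetical, not an ED specification.
    from dataclasses import dataclass, field

    @dataclass
    class ReportCard:
        achievement_by_level_and_subgroup: dict = field(default_factory=dict)
        comparison_to_ayp_goals: dict = field(default_factory=dict)
        percent_not_tested_by_subgroup: dict = field(default_factory=dict)
        two_year_achievement_trends: dict = field(default_factory=dict)
        other_ayp_indicators: dict = field(default_factory=dict)
        graduation_rates: dict = field(default_factory=dict)
        schools_identified_for_improvement: list = field(default_factory=list)
        teacher_qualifications: dict = field(default_factory=dict)

    card = ReportCard(graduation_rates={"all pupils": 0.85})
    print(card.graduation_rates)  # {'all pupils': 0.85}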
Each year, states are required to prepare reports for the Secretary containing
information on the implementation of academic assessments; student academic
achievement, in the aggregate, and disaggregated by student subgroup; information
on the acquisition of English proficiency by students with limited English
proficiency; information on each school identified for school improvement; the
number of students and schools participating in public school choice and
supplemental educational services; and information on teacher quality, including the
percentage of classes being taught by highly qualified teachers at the state, LEA, and
school levels. The Secretary is required to compile the data reported by states into
reports to be submitted annually to the House Committee on Education and the
Workforce and the Senate Committee on Health, Education, Labor, and Pensions.
Implementation Status
Data on Schools and LEAs Failing to Meet AYP and Identified for
Improvement. A substantial amount of data has become available on the number
of schools and LEAs that have failed to meet the AYP standards of the NCLB based
on assessments administered during the 2002-2003 through 2006-2007 school years.
A basic problem with these data, however, is that they frequently have been
incomplete and subject to change. Currently available compilations of state AYP
data are discussed below in two categories: reports focusing on the number and
percentage of schools failing to meet AYP standards for one or more years versus
reports on the number of public schools and LEAs identified for improvement — that
is, they had failed to meet AYP standards for at least two consecutive years.

Schools Failing to Meet AYP Standards for One Year. Beginning with
the 2002-2003 school year, data on the number of schools in each state that made or
did not make AYP have been reported by the states to ED, in a series of Consolidated
State Performance Reports. Until recently, these Reports were not disseminated by
ED; however, the Consolidated State Performance Reports for the 2005-2006 school
year have been made available by ED on its website.38
According to these Consolidated State Performance Reports,39 for the nation
overall, 28% of all public schools failed to make adequate yearly progress based on
assessment scores for the 2005-2006 school year. The percentage of public schools
failing to make adequate yearly progress for 2005-2006 varied widely among the
states, from 4% for Wisconsin and 9% for Louisiana to 86% for the District of
Columbia and 71% for Florida. Table 2 provides the percentage of schools failing
to make adequate yearly progress, based on 2005-2006 assessment results, for each
state.
LEAs Failing to Meet AYP Standards. Although most attention, in both
the statute and implementation activities, thus far has been focused on application of
the AYP concept to schools, a limited amount of information is becoming available
about LEAs that fail to meet AYP requirements, and the consequences for them.
According to the Consolidated State Performance Reports referred to above,
approximately 30% of all LEAs failed to meet AYP standards on the basis of
assessment results for the 2005-2006 school year.40 Among the states, there was even
greater variation for LEAs than for schools. Two states — Alabama and Michigan41
— reported that none of their LEAs failed to make adequate yearly progress, and
Wisconsin reported that only one of the state’s 426 LEAs failed to make AYP, while
100% of the LEAs in Florida and South Carolina, plus the single, statewide LEA in
Hawaii, failed to meet AYP standards.
38 See [http://www.ed.gov/admins/lead/account/consolidated/sy05-06/index.html].
39 For one state, Maine, these data were not available in the Consolidated State Performance
Report and were obtained directly from the state educational agency.
40 This calculation was based on data for all states except Maine.
41 See endnotes a and d in Table 2 regarding the LEA data for Alabama and Michigan.

Table 2. Reported Percentage of Public Schools and Local
Educational Agencies Failing to Make Adequate Yearly Progress
on the Basis of Spring 2006 Assessment Results
State                       Reported Percentage of      Reported Percentage of
                            Rated Schools Not           LEAs Not Making
                            Making AYP, 2006            AYP, 2006
Alabama                     11                          0 (a)
Alaska                      38                          54
Arizona                     33                          39
Arkansas                    39                          2
California                  34                          37
Colorado                    25                          40
Connecticut                 34                          19
Delaware                    19                          11
District of Columbia        86                          94
Florida                     71                          100
Georgia                     21                          65
Hawaii                      65                          100
Idaho                       27                          48
Illinois                    21                          23
Indiana                     51                          27
Iowa                        17                          4
Kansas                      14                          12
Kentucky                    34                          56
Louisiana                   9                           39
Maine (b)                   19                          NA (c)
Maryland                    23                          13
Massachusetts               41                          64
Michigan                    14                          0 (d)
Minnesota                   31                          46
Mississippi                 16                          48
Missouri                    29                          39
Montana                     10                          16
Nebraska                    18                          29
Nevada                      47                          12
New Hampshire               39                          32
New Jersey                  29                          13
New Mexico                  54                          76
New York                    29                          45
North Carolina              56                          97
North Dakota                10                          11
Ohio                        39                          68
Oklahoma                    11                          19
Oregon                      32                          63
Pennsylvania                18                          5
Rhode Island                32                          39
South Carolina              62                          100
South Dakota                20                          4
Tennessee                   17                          7
Texas                       19                          13
Utah                        12                          15
Vermont                     25                          26
Virginia                    23                          37
Washington                  16                          25
West Virginia               14                          91
Wisconsin                   4                           0 (e)
Wyoming                     15                          10
National Average            28                          30
Source: State Consolidated Performance Reports [http://www.ed.gov/admins/lead/account/consolidated/sy05-06/index.html] for all states except Maine.
a. While Alabama reports that all of its LEAs made AYP based on assessment scores for the 2005-
2006 school year, it also lists 43 LEAs, or 33% of the state total number of LEAs, as being in
improvement status for the 2006-2007 school year.
b. For Maine, the data for schools were acquired from the state educational agency
[http://www.maine.gov/education/pressreleases/ayp/ayplistmenu.htm].
c. NA — Not available. Thus, the national total for LEAs excludes Maine.
d. While Michigan reports that all of its LEAs made AYP based on assessment scores for the 2005-
2006 school year, it also lists 11 LEAs, or 2% of the state total number of LEAs, as being in
improvement status for the 2006-2007 school year.
e. Wisconsin reports one LEA as failing to make AYP out of a total of 426 LEAs.
Schools and LEAs Identified for Improvement, Corrective Action,
and Restructuring. Schools and LEAs are identified for improvement, corrective
action, or restructuring according to the cumulative number of years of not making
AYP. States are required to report the improvement status of schools and LEAs to
the Department of Education as part of their Consolidated State Performance
Reports. According to ED, 10,676 schools were identified for improvement for the
2006-2007 school year, with 2,302 of these identified for restructuring.42 ED has
released more detailed data on the number of all schools identified for improvement
(any stage) and Title I schools identified for improvement, corrective action, or
restructuring for school years 2003-2004 through 2005-2006. These data are
presented in Table 3.
42 U.S. Department of Education, Mapping America’s Educational Progress, 2008, at
[http://www.ed.gov/nclb/accountability/results/progress/nation.html].

Table 3. Number of Schools and LEAs
Identified for Improvement
                All Schools,    Title I Schools                                                    Title I LEAs,
Year            Any Stage       Improvement    Corrective Action    Restructuring    Total        Total
2003-2004       11,531          4,199 (a)      926 (a)              838 (a)          6,219        N/A
2004-2005       11,617          6,559 (a)      977 (a)              1,199 (a)        9,333        1,511
2005-2006       11,648          6,068 (a)      1,223 (a)            1,683 (a)        9,808        1,578
Sources: U.S. Department of Education, Institute of Education Sciences, National Center for
Education Evaluation and Regional Assistance, National Assessment of Title I, Final Report: Volume
I: Implementation, 2007; and U.S. Department of Education, Office of Planning, Evaluation and
Policy Development, Policy and Program Studies Service, State and Local Implementation of the No
Child Left Behind Act, Volume III — Accountability Under NCLB: Interim Report, 2007.
N/A = Not available.
a. Data for Puerto Rico are not provided by accountability stage but are included in the total.
Available data show that increasing numbers of schools have been identified for
improvement each year since the enactment of NCLB. For 2005-2006, of the 6,068
schools identified for improvement, 3,167 were identified for the first year of school
improvement. As more schools have been identified, increasing proportions are
becoming identified for the latter accountability stages of corrective action and
restructuring. However, ED reports that 23% of the Title I schools that had been
identified for improvement, corrective action, or restructuring for the 2003-2004
school year were no longer identified for the 2004-2005 school year.43 This indicates
that a substantial share of identified schools has demonstrated sufficient improvement
to exit NCLB sanctions.
School Choice and Supplemental Educational Services. Data from
Consolidated State Performance Reports show that for 2006-2007, 5.4 million
students were eligible for public school choice and that 119,988 (2.2%) participated.
Data also show that more than 3.6 million students were eligible for supplemental
educational services and that 529,627 (14.5%) participated.44 Overall, since the
enactment of NCLB, relatively few students have taken advantage of the opportunity
to transfer to different schools under the public school choice option, with only
48,000 transferring during 2004-2005 and fewer in earlier years.45 Department of
43 U.S. Department of Education, Office of Planning, Evaluation and Policy Development,
Policy and Program Studies Service, State and Local Implementation of the No Child Left
Behind Act, Volume III — Accountability Under NCLB: Interim Report, 2007, p. 60.
44 U.S. Department of Education, Mapping America’s Educational Progress 2008, at
[http://www.ed.gov/nclb/accountability/results/progress/nation.html].
45 U.S. Department of Education, Office of Planning, Evaluation and Policy Development,
Policy and Program Studies Service, State and Local Implementation of the No Child Left
Behind Act, Volume III — Accountability Under NCLB: Interim Report, 2007, p. 88.

Education data, drawn from multiple sources, indicate that during the first years of
implementation, supplemental educational services have been provided to increasing
numbers of students each year, rising from 42,000 in 2002-2003 to 446,000 in 2004-
2005 and to over half a million at present. ED estimates that for 2003-2004, LEAs
expended $24 million to support transportation for public school choice and $192
million for supplemental educational services.46
Implementation Issues
A number of issues have arisen as states and LEAs have proceeded to
implement the ESEA Title I-A accountability provisions. Some of the most notable
issues are discussed below.
Identification of Schools and LEAs for Improvement. Over the past
several years, states have sought, and in many cases have had approved by ED,
changes to their state accountability plans. In most instances, these changes have
facilitated a relaxation of accountability requirements. A result has been that some
schools and LEAs that would have been identified for improvement under state plans
as initially approved might not be so identified under amended versions of state
plans. Examples of approved changes include the use of confidence intervals,
increasing the minimum subgroup sizes used to calculate AYP, and specifying that
LEAs need to fail to make AYP for two consecutive years in the same subject area
and across each of the elementary, middle, and high school grade spans in order to
be identified for improvement or that schools should be identified for improvement
only if they fail to make AYP in the same subject area for two consecutive years.47
The expansion of the minimum subgroup sizes to large numbers, such as 40
students or 10% of school enrollment, may have the effect of excluding students from
subgroups with small populations from consideration in the determination of AYP
at the school level. Because states are now required, beginning with the 2005-2006
school year, to administer assessments in reading/language arts and mathematics to all
students in each of grades 3-8, it is expected that fewer student subgroups will be
excluded from AYP determinations on the basis of minimum subgroup size. Also,
at the LEA level, it is more likely that there will be sufficient students to include
more subgroups in the determination of AYP. Still, LEAs that have no schools
identified for improvement may, nonetheless, be identified for improvement, if low-
achieving students in subgroups with small populations are dispersed across several
schools. These LEAs incur no obligation to provide public school choice or
46 U.S. Department of Education, Institute of Education Sciences, National Center for
Education Evaluation and Regional Assistance, National Assessment of Title I, Final
Report: Volume I: Implementation, 2007, p. 91.
47 Naomi Chudowsky and Victor Chudowsky, States Test Limits of Federal AYP Flexibility,
Center on Education Policy, November 16, 2005, pp. 13-15.

supplemental educational services to their students because these sanctions are only
applicable at the school level.
Public School Choice. LEAs were first required to provide public school
choice to students attending schools identified for school improvement beginning
with the 2002-2003 school year. In instances where public school choice is required
to be offered to students at a particular school, it must be made available to all
students, regardless of their family income or academic achievement level. However,
LEAs are required to give priority to the lowest-achieving students from low-income
families. LEAs also may not deny students the opportunity to transfer to another
school on the basis of a lack of capacity. In instances where there are no eligible
schools to which a student could transfer (e.g., all schools at the applicable grade
level have been identified for school improvement, corrective action, or
restructuring), then the LEA must, to the extent practicable, establish a cooperative
agreement with one or more LEAs in the area (including public charter schools) to
provide school choice transfer options.
Concerns about the implementation of public school choice include that
information identifying schools for improvement has in many instances been released
during the summer only a short time before the new school year was about to begin,
and as a consequence parents have had limited time to select new schools for their
children to attend. Also, as more schools have been identified for school
improvement, corrective action, or restructuring, some LEAs have experienced
difficulty in making space available in schools that have not been identified for
improvement — many of which were already overcrowded. Meanwhile, LEAs with
few or no schools identified for improvement may decline to accept transfers from
neighboring districts. In very rural areas, public school choice can be difficult to
implement because of the great distances between schools. Finally, concerns have
been raised that despite provisions designed to give priority for public school choice
to low-achieving and low-income students, higher-achieving students and those from
moderate to upper-income families may be most alert to, or more willing to take
advantage of, school transfer options, thus leaving lower achieving students behind
in under-performing schools.48
Supplemental Educational Services. Beginning with the 2002-2003
school year, students attending schools identified for a second year of school
improvement, corrective action, or restructuring have been required to be offered the
opportunity to receive supplemental educational services. Issues regarding the
implementation of SES include the process through which parents are notified of the
availability of supplemental educational services; the availability of services to
students with special needs or with limited-English proficiency; approval of SES
providers and the negotiation of contracts with LEAs; student usage of supplemental
educational services; the sequencing of school choice and SES as sanctions; and how
providers are held accountable for performance.
48 Maria Glod, “High Achievers Leaving Schools Behind; Transfers in Fairfax and
Elsewhere Were Meant for Struggling Students,” The Washington Post, November 10,
2004, p. A01.

During the first few years of NCLB implementation, information on schools
being required to offer supplemental educational services has often been made
available late in the summer or even after the start of the school year.49 Also, notice
of the opportunity to receive these services has in many instances not been provided
to parents in a clear and concise manner. It appears that in some LEAs, this has
resulted in fewer students receiving supplemental educational services than might be
entitled to receive them.50 The availability of providers has also been a concern in
some LEAs, particularly in rural areas. Additionally, some providers may not be
willing or able to offer services in remote areas or in schools in which their services
may be selected by only one or two students. Some providers may not be able to
serve students with special needs or limited-English proficiency. However, in
instances where no providers are available to serve students with special needs or
limited-English proficiency, the LEA must arrange for these students to be served,
either by providing services directly or through a contract — even if the LEA is not
otherwise approved as an SES provider.
Responsibility for approving entities as eligible SES providers resides with
SEAs, although contracts for the provision of services must be negotiated between
each provider and each LEA in which it will offer services. Regulations promulgated
by ED prohibit schools and LEAs that have been identified for improvement from
being approved as SES providers. SES providers may not be required to hire only
staff who meet the highly qualified teacher and paraprofessional requirements of
ESEA § 1119. The process of approving and negotiating contracts for providers has
been challenging for many SEAs and LEAs, as each LEA has unique needs and
requirements, and the amount of funding available per student differs from one LEA
to another. The question of what conditions SEAs and LEAs may impose on SES providers is
often raised as an implementation issue. During the past year, ED has
issued a number of policy letters clarifying the types of conditions that SEAs and
LEAs may impose. While SEAs have overall responsibility for overseeing SES
providers, LEAs may impose certain conditions on providers, such as requiring
background checks on personnel, requiring liability insurance, and charging for the
use of school facilities.51
The sequencing of public school choice and supplemental educational services
as sanctions has been raised as a policy issue. Intuitively, some believe that it may
make sense to offer supplemental educational services before public school choice is
required to be offered. In 2005, 2006, and 2007, the Secretary of Education
announced that flexibility in the provision of supplemental educational services
49 Michael Casserly, No Child Left Behind: A Status Report on Choice and Supplemental
Services in America’s Great City Schools, Council of Great City Schools, January 2004.
50 Gail L. Sunderman and Jimmy Kim, Increasing Bureaucracy or Increasing
Opportunities? School District Experience with Supplemental Educational Services,
(Cambridge, MA: The Civil Rights Project, Harvard University, February 2004), pp. 19-21,
at [http://www.civilrightsproject.harvard.edu/research/esea/increasing_bureaucracy.pdf].
51 U.S. Department of Education, Office of Elementary and Secondary Education and Office
of Innovation and Improvement, “Supplemental Educational Services (SES) Policy,” at
[http://www.ed.gov/policy/elsec/guid/stateletters/index.html].

would be provided in limited circumstances through SES Pilot Programs.52 Under
the SES Pilot Programs, ED has approved a reversal in the sequencing of SES and
school choice as sanctions in LEAs in Alaska, Delaware, Indiana, North Carolina,
and Virginia. Also as part of the SES Pilot Programs, ED has granted several LEAs
the flexibility to remain as SES providers even though they have been identified for
improvement.53 In addition, ED has clarified that certain entities loosely affiliated
with LEAs may be providers of supplemental educational services even if the LEA
has been identified for improvement. Examples of such entities include 21st Century
Community Learning Centers, community education programs, and parent
information and resource centers.54 Finally, while SEAs are required to withdraw
approval from providers that fail for two consecutive years to increase student
academic proficiency, little is known about the effectiveness of particular SES
providers. It is expected that better data on the effectiveness of supplemental
educational services will become available as LEAs participating in SES Pilot
Programs are required to provide ED with achievement data for students receiving
supplemental educational services.
Section 5. Education of
Limited English Proficient Pupils55
The No Child Left Behind Act of 2001 (NCLB) made several changes to ESEA
provisions regarding the education of limited English proficient (LEP) students. One
major change concerns the distribution and use of funds for LEP student instruction.
Namely, the NCLB converted the competitive grant programs for this purpose into
a single formula grant program based on enrollment of LEP and immigrant students.
A second set of changes enacted by the NCLB falls under the category of assessment
and accountability. As mentioned earlier in this report, these changes include
requirements for annual assessments of LEP students’ English language proficiency,
language accommodations for LEP students’ academic assessments in subjects other
than English, and separate adequate yearly progress (AYP) calculations for LEP
students as a subgroup.
52 U.S. Department of Education, “New Options For Families: Supplemental Educational
Services Pilot Programs,” at [http://www.ed.gov/nclb/choice/help/sespilot-2006.html].
53 U.S. Department of Education, “SES Flexibility Agreements 2007-2008,” at
[http://www.ed.gov/nclb/choice/help/ses/07agreements.html].
54 U.S. Department of Education, Office of Elementary and Secondary Education and Office
of Innovation and Improvement, “District-affiliated entities becoming SES providers,” May
10, 2006, at [http://www.ed.gov/policy/elsec/guid/stateletters/choice/ses051006.html].
55 This section was written by Rebecca R. Skinner, with previous contributions made by
Jeffrey J. Kuenzi. For more information on this issue, see CRS Report RL31315, Education
of Limited English Proficient and Recent Immigrant Students: Provisions in the No Child
Left Behind Act of 2001, by Jeffrey J. Kuenzi.

CRS-36
Language Acquisition State Grants
Prior to the NCLB, Title VII of the ESEA supported two major competitive
grant programs that awarded funds to LEAs specifically for the education of LEP and
immigrant students and a third competitive grant program to institutions of higher
education (IHEs) for teacher professional development in this area.56 Under the
NCLB, the first two of these programs were replaced by a single formula grant
program, provided that the total appropriation for this purpose exceeds $650 million
(which has been the case since the first grants were awarded in FY2002).
This new formula grant program, the English Language Acquisition State Grant
program (ESEA, Title III), distributes grants to states according to their share of the
LEP and recent immigrant populations. The formula allocates 80% of the program’s
funds according to the population of LEP students and 20% according to the
population of recently arriving immigrants. The Secretary is given the authority to
determine the most accurate data for these allocations from either state-reported
enrollment counts or data collected by the U.S. Census Bureau.
Prior to determining state grant allocations, statutory language provides for
several reservations of funds. These include reservations for national activities, for
schools serving Native American and Alaska Native students, and for the outlying
areas. A reservation is also specified for LEAs and other eligible recipients of pre-
NCLB Title VII multi-year grants to continue to receive funds until their grants
expire. Set-asides for these continuation grants were no longer needed after FY2004,
as all continuation grants had expired.
After reserving funds for these purposes, ED makes grants to states based on the
aforementioned population factors. The minimum state grant amount is $500,000.
States subsequently make subgrants to LEAs according to each LEA’s share of the
state’s LEP student enrollment.57 The minimum grant amount for an LEA is
$10,000.58
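The allocation rules just described can be illustrated with a simplified sketch. The
Python fragment below is only an approximation of the statutory formula: it ignores the
reservations of funds discussed above, applies the $500,000 state minimum without
rebalancing other states’ awards, and uses hypothetical function and variable names
rather than ED’s actual allocation procedures.

```python
def title_iii_state_grants(lep_counts, immigrant_counts, appropriation,
                           state_minimum=500_000):
    """Approximate ESEA Title III state formula: 80% of funds follow each
    state's share of LEP students and 20% follow its share of recent
    immigrant students, subject to a $500,000 minimum grant."""
    total_lep = sum(lep_counts.values())
    total_imm = sum(immigrant_counts.values())
    grants = {}
    for state in lep_counts:
        share = (0.80 * lep_counts[state] / total_lep
                 + 0.20 * immigrant_counts[state] / total_imm)
        # Minimum applied directly here; the adjustment of other states'
        # awards that the statute would require is omitted in this sketch.
        grants[state] = max(share * appropriation, state_minimum)
    return grants


def title_iii_lea_subgrants(state_grant, lea_lep_counts, lea_minimum=10_000):
    """Subgrants follow each LEA's share of the state's LEP enrollment;
    an award that would fall below $10,000 is not made."""
    total_lep = sum(lea_lep_counts.values())
    subgrants = {}
    for lea, count in lea_lep_counts.items():
        amount = state_grant * count / total_lep
        if amount >= lea_minimum:
            subgrants[lea] = amount
    return subgrants
```

In practice, the national reservations and the state set-aside of up to 15% for
immigrant-impact grants (see footnote 57) would be applied before these calculations, so
actual awards differ from this sketch.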
The NCLB also made changes to the activities supported by these grants. SEAs
and LEAs were given greater flexibility in the design and administration of language
instructional programs. The law also removed or weakened provisions that
encouraged bilingual instruction methods (i.e., curricula that develop proficiency in
English as well as students’ native language). Instead, the law placed an emphasis
on annual measurable increases in English language proficiency.
56 Prior to the enactment of the NCLB, the Bilingual Education Act (ESEA Title VII)
supported grants for instructional services, support services, and professional development.
57 Prior to making these subgrants, states are required to reserve up to 15% of their state
allocation to provide grants to LEAs that have experienced a significant increase in the
percentage or number of immigrant students enrolled.
58 Based on the subgrant calculations, the SEA may not make an award to an LEA if the
amount of the award would be less than $10,000.

CRS-37
Implementation Status. All states, as well as Puerto Rico and the District
of Columbia, submitted plans for implementing Title III programs beginning in
2002.59 According to the Secretary’s report on the implementation of Title III, the
new formula grant awarding process appears to have been implemented without
major problems.60 However, after the issuance of the Secretary’s report, a problem
arose in the allocation of funds in the 2005 fiscal year because some states
experienced sizable changes in their award amount from the previous year.
Grants were distributed to states in fiscal years 2002, 2003, and 2004 based on
the Census 2000 LEP and immigrant estimates. Starting in 2005, these state grants
were allocated using LEP and immigrant estimates from the American Community
Survey (ACS), also administered by the Census Bureau. The distribution of FY2005
grants showed large changes in the amount awarded to states compared to their
FY2004 grants. The fluctuation between FY2004 and FY2005 is in contrast to the
relative stability of grant awards from FY2002 to FY2003 and from FY2003 to
FY2004.61 With the change to reliance on ACS data to determine FY2005 grants, 20
states experienced increases in grant awards of 20% or higher; 13 of these increases
were 30% or higher, and 1 increase exceeded 98%. Four states lost more than 10%
of their funds, and one lost more than 20% of its funds. These declines occurred
despite an 8% increase in the funds allocated by formula, made possible by a decrease
in the amount of funding needed to support continuation grant awards.62 From FY2005 to
FY2006, nine states lost 20% or more of their funding, while 15 states received
increases of 20% or more. Again, these declines occurred despite an almost 7%
increase in the funds for state grants generated by a continued decrease in the amount
of funding needed to support continuation grants. Funding available to support state
grants did not change from FY2006 to FY2007, but eight states received increases
of 20% or more in their funding, while four states experienced decreases of 20% or
more. These types of fluctuations have continued through the FY2008 award year.63
For several states, increases in one year have been accompanied by decreases
either the preceding or succeeding year. In some cases, these changes have been
fairly substantial from year to year. For example, Nevada received a 26.3% increase
59 U.S. Department of Education, Biennial Evaluation Report to Congress on the
Implementation of the State Formula Grant Program
(Washington, DC: 2005).
60 U.S. Department of Education, Biennial Evaluation Report to Congress on the
Implementation of Title III, Part A of ESEA
(Washington, DC: 2005).
61 From FY2002 to FY2003, 11 states had increases in funding of 20% or more; 2 states had
funding increases in excess of 55%. From FY2003 to FY2004, six states experienced
increases of 20% or more; no increases exceeded 35%. Two states lost funding during this
time period — one lost less than 1% of its funding, the other lost just over 10% of its
funding.
62 The decrease in continuation award obligations helped to offset a 1% decrease in overall
appropriations that affected Title III and other education programs. FY2005 was the last
year in which continuation grant awards were made.
63 For more information about Title III state grants and data options, see CRS Report
RL34066, English Language Acquisition Grants Under the No Child Left Behind Act:
Analysis of State Grant Formula and Data Options, by Rebecca R. Skinner.

CRS-38
from FY2005 to FY2006, a 30.7% decrease from FY2006 to FY2007, and a 21.1%
increase from FY2007 to FY2008.
Implementation Issues. Large year-to-year fluctuations in Title III awards
could pose problems for program implementation. Such shifts reflect instability in
the underlying population estimates on which the formula is based. Instability in the
current LEP data provided by the ACS derives primarily from the small and
concentrated nature of LEP populations. It had been anticipated that, once the survey
was fully implemented, the LEP population estimates would become more stable
from year to year.64 The ACS was fully implemented in 2005 and used to make
FY2007 awards. While the number of states experiencing increases or decreases in
state grant amounts of 20% or more declined in FY2007 and FY2008, 12 states in
FY2007 and 13 states in FY2008 continued to see fluctuations of 20% or higher.
Changes in funding of the magnitude just discussed might make it difficult for
schools to provide service continuity to LEP students and may be particularly
challenging in states experiencing large increases in funding one year and large
decreases the next. Schools with a small number of LEP students and a single
teacher qualified to teach them might not be able to retain that teacher in consecutive
years under such budget conditions. An LEA that in a given year had an LEP
population large enough to exceed the minimum grant requirement may in the next
year fall below that requirement and be required to enter into a consortium of LEAs
that share service provision. Fluctuation in the amount of SEA administrative set-
asides may also disrupt professional development, program evaluation, and technical
assistance.
Another issue concerns the availability of qualified bilingual teachers. Many
LEAs report being unable to fill teaching positions in LEP classrooms. One study
found the LEP teacher shortage to be the number one complaint from LEAs in regard
to implementing Title III.65
LEP Assessments and Accountability
In addition to the general Title I assessment and accountability provisions, there
are specific provisions that apply to LEP students. As briefly mentioned earlier, LEP
students must be annually assessed for English language proficiency; may be given
accommodations for academic assessments in subjects other than reading/language
arts; and must be treated as a separate subgroup for state, LEA, and school AYP
calculations.
64 The survey employs a county-based sampling strategy and will not include all counties
until it is fully implemented. The Census Bureau’s plan for full implementation to occur for
the 2005 survey year would make LEP data available in the fall of 2006. For more
information on ACS sampling issues, see [http://www.census.gov/acs/www/Downloads/
OpsPlanfinal.pdf].
65 Center on Education Policy, From the Capital to the Classroom: Year 3 of the No Child
Left Behind Act
(Washington, DC: March 2005), p. 185.

CRS-39
States are required to demonstrate that their LEAs have conducted annual
assessments of LEP students’ English language proficiency beginning with the 2002-
2003 school year. These assessments must include measures of students’ proficiency
in speaking, reading, writing, listening, and comprehension of English. Assessments
used for this purpose need not be uniform across all LEAs in a state. If a state allows
multiple English language proficiency assessments, the SEA should (1) set technical
criteria for the assessments; (2) ensure the assessments are equivalent to one another
in their content, difficulty, and quality; (3) review and approve each assessment; (4)
ensure that the data from all assessments can be aggregated for comparison and
reporting purposes, as well as disaggregated by English language proficiency levels
and grade levels; and (5) ensure that the assessments are aligned with the state
English language proficiency standards. SEAs and LEAs may use Title III funds as
well as funds they receive by formula under ESEA section 6111 (Grants for State
Assessments Program) or competitively under section 6112 (Grants for Enhanced
Assessment Instruments Program) for developing English language assessments.
(These programs were discussed in Section 1 of this report).
States must establish annual measurable achievement objectives (AMAOs) for
LEP students’ development and attainment of English language proficiency.
AMAOs must include goals for increases in the number or percentage of children in
an LEA making progress in learning English and achieving English proficiency.
States must also establish an AMAO that specifies AYP targets for LEP student
achievement. Each state must ensure that all LEAs in the state meet the AMAO
requirements. If an LEA repeatedly misses its AMAOs, the LEA must develop an
improvement plan, and the SEA must provide relevant technical assistance to the
LEA. Continued failure to meet AMAOs will lead to state intervention, and can
result in the loss of funds to the LEA.
States must also include all LEP students in their state academic assessment
system.66 Inclusion of LEP students may involve providing appropriate linguistic
accommodations and/or using an assessment in the student’s native language that is
aligned to the state content and achievement standards. However, after three years
of attending a school in the United States (except for those residing in Puerto Rico),
students must be assessed for reading/language arts achievement in English. LEAs
can, on an individual basis, continue to provide accommodations for up to two
additional years for students who have not yet reached a level of English proficiency
sufficient to yield valid and reliable information on what the student knows and can
do on a reading/language arts assessment written in English.
Implementation Status. ED reports that states have made “significant
progress” in implementing programs to support LEAs in providing English language
assessments. Although many states had already developed language assessment
systems prior to enactment of the NCLB, many other states had not, as they had
historically enrolled low numbers of LEP students. During 2003 and 2004, 35 states
used grants under ESEA Title VI to develop English language standards and
66 However, as discussed in a subsequent section, LEP student participation varies based on
how long the student has been enrolled in a U.S. school.

CRS-40
assessments.67 By June 2004, 40 states reported having English language
assessments, although at different stages of readiness.68 Some states developed their
own English language assessments, some contracted with commercial assessment
developers, and others joined multi-state consortia that have developed assessments.
ED also reported that, by February 2005, all states had developed their AMAOs.
With the flexibility to extend language accommodations for academic
assessments an additional two years, 2007-2008 is the first school year in which LEP
students (who began receiving Title III services during the 2002-2003 school year)
will be required to take assessments in English. ED indicated that academic
assessments were implemented relatively smoothly; however, “in many cases,
[states] were not able to implement the data system fully” to report the results of
these assessments.69 For example, some states reported that they either did not have
data for LEP students available at the time of the report or that data on LEP students
served by Title III could not be separated from data on all LEP students.
In response to concerns that the transient nature of the LEP subgroup would
prevent states from reaching AYP goals, the Secretary issued guidance in June 2004
that provided additional flexibility in the treatment of the LEP subgroup.
Specifically, students determined to be LEP would not have their scores counted in
AYP calculations until after they have been in school in the United States for at least
10 months. In addition, LEP students that have attained proficiency in English may
continue to be counted in the LEP subgroup for another two years. In September
2006, ED issued final regulations that formalized and modified this guidance. States
are not required to include the scores of recently arrived LEP students on
reading/language arts assessments for AYP purposes.70 The regulations define a
recently arrived LEP student as an LEP student who has been enrolled in U.S. schools71
for less than 12 months. The regulations also clarify how the provision allowing
former LEP students to be counted with the LEP subgroup for two additional years
is to be implemented with respect to data reporting.
In July 2006, ED announced a new initiative to assist 20 states in their
development of reading and math assessments for LEP students. In the press release
announcing the initiative, the Secretary stated that “These states submitted evidence
for the Department’s 2005-2006 peer review of state assessment systems, focused on
tests tailored to LEP students. In most cases the tests designed for LEP students have
not yet met with full approval under NCLB.” To comply with the law, these states
must agree to a Plan for Improvement, negotiated with the Department, that will
67 U.S. Department of Education, Biennial Evaluation Report to Congress on the
Implementation of Title III, Part A of ESEA
(Washington, DC: 2005), p. 6.
68 Center on Education Policy, From the Capital to the Classroom: Year 3 of the No Child
Left Behind Act
(Washington, DC: March 2005), p. 188.
69 U.S. Department of Education, Biennial Evaluation Report to Congress on the
Implementation of Title III, Part A of ESEA
(Washington, DC: 2005).
70 Federal Register, September 13, 2006, vol. 71, no. 177, 54187-54194.
71 This includes schools in the 50 states and the District of Columbia. It does not include
schools in Puerto Rico, the outlying areas, or the freely associated states.

CRS-41
result in the implementation of LEP content assessments and accommodations by the
time of the 2006-2007 administration of state assessments in reading and math.
The Government Accountability Office (GAO) recently completed a study of
how states are measuring the progress of LEP students.72 The study found that many
states have not ensured the validity or the reliability of their tests for LEP students.
The study also found that only a few states used native language or alternative
assessments for LEP students. GAO noted, however, that the use of these
assessments is expensive and may not be appropriate for all LEP students. As a
result of the findings, GAO recommended that the Secretary of Education support
research on effective accommodations and identify and provide technical support that
states need to ensure the validity of the academic assessments they use. GAO also
recommended that the Secretary of Education publish additional guidance on
requirements for assessing English language proficiency, and explore ways to provide
additional flexibility in measuring annual progress for LEP students.
In October 2007, ED released a draft version of a Framework for High-Quality
English Language Proficiency Standards and Assessments.73 The framework is
designed to provide states with information to assist them in evaluating and
improving the quality of their current English language proficiency (ELP) standards
and assessments, including the establishment of “rigorous, valid, and accessible State
ELP standards and assessments that support effective instruction” (p. 3). More
specifically, the framework addresses the validity and reliability of ELP standards
and assessments for placement and proficiency determinations. Adherence to the
framework by states is not required. However, in December 2007, ED sent a letter
to all Chief State School Officers asking them to conduct an independent and
voluntary self-directed review of ELP standards and assessments.74 Through the LEP
Partnership,75 ED offered to provide technical assistance for four to six months in
2008 to states interested in engaging in the self-review process. The review activities
conducted by peer experts and technical assistance providers to assist states will not
be reported to ED. State engagement in the self-review process is not considered to
be a formal review of state ELP standards and assessments by ED.
Implementation Issues. The requirement that LEP students no longer
receive accommodations when taking academic assessments after their third year in
U.S. schools may not be appropriate at all grade levels. Many analysts point to
research suggesting that non-English speakers’ ability to attain English proficiency
diminishes with age. Older LEP students may continue to have difficulty taking
academic assessments at their grade level in English even after three years in U.S.
72 Government Accountability Office. (2006). Assistance from Education Could Help
States Better Measure Progress of Students with Limited English Proficiency
(GAO-06-
815). Available online at [http://www.gao.gov/cgi-bin/getrpt?GAO-06-815].
73 The draft framework is available at [http://www.ed.gov/about/inits/ed/lep-partnership/
framework.doc].
74 The letter is available at [http://www.ed.gov/policy/elsec/guid/secletter/071217.html].
75 The LEP partnership is an ED initiative to improve assessments of ELP, reading, and
mathematics for LEP students.

CRS-42
schools. Some argue that the two additional years of flexibility in determining when
LEP students should be tested in English is insufficient.
The group size issue, discussed earlier in this report, is particularly relevant in
the application of AYP requirements to LEP students. In some instances, the state
may set a minimum group size at a level so high that few LEAs or schools are held
accountable for properly serving LEP students. On the other hand, setting group size
too low may make it difficult for LEAs, particularly those with diverse student
populations, to meet NCLB’s annual requirements for improvement.
States face additional difficulty implementing AYP requirements for LEP
students because these students are likely to fall into several subgroups. There exists
a high correlation between membership in the LEP subgroup and membership in the
Hispanic and the economically disadvantaged groups. In light of this, some have
advocated that students classified in two or more categories should be counted only
in one category or weighted proportionally in each of the categories in which they are
classified. For example, students who are LEP, Hispanic, and poor could have their
scores given one-third weight in each subgroup.
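The proportional-weighting idea can be made concrete with a small, purely illustrative
sketch. The subgroup labels and the equal-fraction rule below are hypothetical examples
of the proposal described above, not a method prescribed by the NCLB or ED.

```python
def proportional_weights(subgroups):
    """Give a student's score equal fractional weight in each subgroup to
    which the student belongs (e.g., a student who is LEP, Hispanic, and
    economically disadvantaged counts one-third in each)."""
    return {group: 1.0 / len(subgroups) for group in subgroups}


def subgroup_proficiency_rates(students):
    """Weighted proficiency rate per subgroup; `students` is a list of
    (subgroups, is_proficient) pairs, with is_proficient equal to 0 or 1."""
    weights, proficient = {}, {}
    for groups, score in students:
        for group, w in proportional_weights(groups).items():
            weights[group] = weights.get(group, 0.0) + w
            proficient[group] = proficient.get(group, 0.0) + w * score
    return {g: proficient[g] / weights[g] for g in weights}


# Example: one student in all three subgroups who is proficient, and one
# Hispanic-only student who is not.
rates = subgroup_proficiency_rates([
    ({"LEP", "Hispanic", "Economically disadvantaged"}, 1),
    ({"Hispanic"}, 0),
])
```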

CRS-43
Section 6. Teacher Quality76
The No Child Left Behind Act of 2001 (NCLBA) made two major amendments
to the teacher quality provisions of the Elementary and Secondary Education Act of
1965 (ESEA). First, the NCLB established a requirement that all teachers be highly
qualified by the end of the 2005-2006 school year. Second, the NCLB replaced the
ESEA Eisenhower Professional Development and Class Size Reduction programs
with a new Teacher and Principal Training and Recruiting Fund.
Requirement That All Teachers Be Highly Qualified
Each state educational agency (SEA) receiving ESEA Title I-A funding must
have a plan to ensure that no later than the end of the 2005-2006 school year, all
teachers teaching in core academic subjects within the state will meet the definition
of a highly qualified teacher (HQT).
As defined in the ESEA, Section 9101(23), to be highly qualified, a public
elementary or secondary school teacher must meet the following requirements:
! Every public elementary or secondary school teacher, regardless of
whether he or she is new or experienced, (1) must have full state
certification (a charter school teacher must meet the requirements in
the state charter school law), (2) must not have had any certification
requirements waived on an emergency, temporary, or provisional
basis, and (3) must have at least a baccalaureate degree;
! Each new public elementary school teacher must pass a rigorous
state test demonstrating subject knowledge and teaching skills in
reading, writing, math, and other basic elementary school curricular
areas (such tests may include state certification exams in these
areas);
! Each new public middle or secondary school teacher must
demonstrate a high level of competency in all subjects taught by: (1)
passing rigorous state academic tests in those subjects (may include
state certification exams in those subjects), or (2) completing an
academic major (or equivalent course work), graduate degree, or
advanced certification in each subject taught;
! Each experienced public elementary, middle, or secondary school
teacher must either (1) meet the requirements just described for a new
teacher (depending upon his or her level of instruction), or (2)
76 This section was written by Jeffrey J. Kuenzi. For a discussion of teacher quality issues
in general, see CRS Report RL33333, A Highly Qualified Teacher in Every Classroom:
Implementation of the No Child Left Behind Act
, by Jeffrey J. Kuenzi.

CRS-44
demonstrate competency in all subjects taught using a “high
objective uniform state standard of evaluation” (HOUSSE).77
Implementation Status. The NCLB required each state to submit its plan
to meet the HQT deadline along with its Consolidated State Application for State
Grants on July 12, 2002.78 This plan must establish annual measurable objectives for
each local educational agency (LEA) and school that, at a minimum, include annual
increases in the percentage of HQTs at each LEA and school to ensure that the 2005-
2006 deadline is met, and an annual increase in the percentage of teachers receiving
high quality professional development. In turn, each LEA must also have a plan to
meet this deadline. In addition, beginning with the first day of the 2002-2003 school
year, any LEA receiving Title I funding must ensure that all teachers hired after that
date who are teaching in Title I-supported programs are highly qualified.
States and LEAs must also submit annual reports to ED describing progress on
the state-set annual objectives. The most recent of these Consolidated State
Performance Reports (CSPR), for the 2003-2004 school year, consisted of two parts.
The first part, providing information on the status of the HQT requirement, was to
be submitted to ED by January 31, 2005. Although states appear to have met the
reporting deadline, significant problems with the detailed data requirements of the
CSPR have prompted ED to delay enforcement of the 2005-2006 deadline.
In an October 21, 2005 policy letter to chief state school officers, ED reported
widespread problems in state data systems and offered a series of regional data
workshops to support states in collecting data.79 The letter also announced additional
flexibility in meeting the HQT deadline. The Secretary stated that the letter’s
purpose was “to assure you that States that do not quite reach the 100% goal by the
end of the 2005-2006 school year will not lose federal funds if they are implementing
the law and making a good-faith effort to reach the HQT goal in NCLB as soon as
possible.” Instead, states that “meet the law’s requirements and the Department’s
expectations in these areas but fall short of having highly qualified teachers in every
classroom” would be given an additional year to reach the 100% goal. No state is
expected to meet that goal, and as many as nine states may not be considered as
having made a “good-faith” effort.80
The most recently published data on meeting the HQT goal became available
in January 2008. As part of the congressionally mandated assessment of NCLB, the
77 Among other requirements, the state-set HOUSSE must provide objective information
about teachers’ content knowledge in all subjects taught; be aligned with challenging state
academic and student achievement standards; be applied uniformly statewide to all teachers
in the same subjects and grade levels; and consider, but not be based primarily on, time
teaching those subjects. It may use multiple measures of teacher competency.
78 Although some states have made their plans available to the public, the Secretary has yet
to release the plans of any state.
79 The Secretary’s letter is available online at [http://www.ed.gov/policy/elsec/guid/secletter/
051021.html].
80 Education Week, No State Meeting Teacher Provision of ‘No Child’ Law, May 24, 2006.

CRS-45
Department’s Institute of Education Sciences found that “91 percent of classes
were taught by highly qualified teachers in 2004-05.” Two additional findings from
the study are (1) students in schools that have been identified for improvement were
more likely to be taught by teachers who said they were not highly qualified than
were students in non-identified schools and (2) among teachers who said they were
highly qualified under NCLB, those in high-poverty schools had less experience and
were more likely to be teaching out-of-field, compared with their peers in low-
poverty schools.81
Implementation Issues. Questions have been raised about the scope and
application of the HQT requirements, the meaning of some of the requirements, and
the ability of different kinds of LEAs to meet them. ED has sought to address some
of these concerns through regulation, non-regulatory guidance, and other means.
Early in the implementation of these provisions, ED was asked whether they apply to
all teachers, including vocational education teachers, special education teachers, or
others not teaching core academic subjects. Final regulations for the Title I program
published December 2, 2002, in the Federal Register clarify that these requirements
only apply to core academic subject teachers. However, these requirements would
apply to a vocational education teacher or a special education teacher providing
instruction in a core academic subject.82
The final regulations also clarify that a teacher in an alternative certification
program will have a maximum of three years in which to become fully certified
without being in violation of the highly qualified requirements regarding
certification. This allowance is made only for a teacher in an alternative certification
program who is receiving high quality professional development and intensive
supervision and is making satisfactory progress toward full certification.
In March of 2004, ED announced that additional flexibility could be applied in
the implementation of the HQT requirements with regard to teachers in small rural
school districts, science teachers, and teachers teaching multiple subjects.83 In small
rural districts, ED provided that teachers teaching core academic subjects who meet
the highly qualified requirements in at least one of the subject areas they teach may
have an additional three years to meet these requirements in the other subjects they
might teach. For current teachers, this three-year grace period began with the 2004-
2005 school year, meaning that rather than facing a deadline of the end of the 2005-
2006 school year to be highly qualified in all core subjects taught, current rural
teachers may have until the end of the 2006-2007 school year. For newly hired
teachers, a full three-year grace period can be provided from the date of hiring. But
81 U.S. Department of Education, National Assessment of Title I: Final Report, January 3,
2008, available at [http://ies.ed.gov/ncee/pubs/20084012/].
82 According to ESEA Section 9101(11), “The term ‘core academic subjects’ means English,
reading or language arts, mathematics, science, foreign languages, civics and government,
economics, arts, history, and geography.”
83 A two-page fact sheet on these policies is available on the web at [http://www.ed.
gov/nclb/methods/teachers/hqtflexibility.html]. A more detailed letter to each of the chief
state school officers, dated March 31, 2004, is available at [http://www.ed.gov/
policy/elsec/guid/secletter/040331.html].

CRS-46
those newly hired teachers will have to be highly qualified in one of their core subject
areas when hired. States decide whether to offer this flexibility to eligible rural
districts.
The flexibility announced in March 2004 modified earlier non-regulatory
guidance (issued in January 2004) which stated that science teachers teaching more
than one field of science (e.g., biology and chemistry) would have to be highly
qualified in each of the fields being taught. Under the new flexibility, states
determine whether science teachers need to be highly qualified in each science field
they teach or highly qualified in science in general, based on how the state currently
certifies teachers in these subject areas.
Finally, ED allowed states to design their HOUSSE procedures to allow a
teacher to go through the process a single time to demonstrate competency in
multiple subjects. This new flexibility, along with other changes, was incorporated
into revised non-regulatory guidance issued on August 3, 2005.84
Teacher and Principal Training and Recruiting Fund
This new formula grant program replaced the former Eisenhower Professional
Development and Class Size Reduction formula grant programs. The allocation
formula provides each state with a base guarantee of funding equal to the amount it
received for FY2001 under the Eisenhower and Class Size Reduction programs. Any
excess funding is allocated by formula among the states based 35% on school-aged
population (5-17), and 65% on school-aged population in poverty. The allocation of
subgrants to LEAs follows the same procedure except that the excess is distributed
by a formula based 20% on school-aged population, and 80% on school-aged
population in poverty. Additional grants under this program are awarded
competitively to partnerships led by State Agencies of Higher Education (SAHEs)
that must include a higher education institution and its division preparing teachers
and principals; a higher education school of arts and sciences; and a high-need
LEA.85
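A rough sketch of the state allocation mechanics just described follows. The function
and variable names are hypothetical, and the many statutory details (hold-harmless
adjustments, within-state reservations, the SAHE partnership set-aside) are omitted;
the LEA subgrant calculation follows the same pattern with 20%/80% weights.

```python
def title_ii_a_allocations(appropriation, fy2001_base, school_age_pop,
                           school_age_poverty, pop_weight=0.35,
                           poverty_weight=0.65):
    """Approximate Title II-A state formula: each state first receives its
    FY2001 Eisenhower/Class Size Reduction base amount; any excess is then
    divided 35% by share of school-age population (ages 5-17) and 65% by
    share of school-age children in poverty. For subgrants to LEAs, the
    same pattern applies with weights of 0.20 and 0.80."""
    excess = max(appropriation - sum(fy2001_base.values()), 0)
    total_pop = sum(school_age_pop.values())
    total_poor = sum(school_age_poverty.values())
    allocations = {}
    for state, base in fy2001_base.items():
        share = (pop_weight * school_age_pop[state] / total_pop
                 + poverty_weight * school_age_poverty[state] / total_poor)
        allocations[state] = base + excess * share
    return allocations
```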
Partnerships must use their funds for professional development in the core
academic subjects for teachers, highly qualified paraprofessionals, and principals.
LEAs are authorized to use their funding for one or more of various specified
activities. Among the authorized activities are the following:
! assistance to schools in the recruitment and retention of highly
qualified teachers, principals, and, under certain conditions, pupil
services personnel;
84 The most recent version of the ESEA Title II non-regulatory guidance is available on the
Web at [http://www.ed.gov/programs/teacherqual/guidance.doc].
85 In this program, a high-need LEA is defined as one with at least 10,000 poor children or
a child poverty rate of at least 20%, and in addition, has either a high percentage of
out-of-field teachers or a high percentage of teachers with emergency, provisional, or
temporary certificates.

CRS-47
! assistance in recruiting and hiring highly qualified teachers through
such means as scholarships and signing bonuses; use of these
teachers to reduce class sizes;
! initiatives to increase retention of highly qualified teachers and
principals, particularly in schools with high percentages of
low-achieving students, through mentoring, induction services
during the initial three years of service, and financial incentives for
those effectively serving all students;
! professional development, including professional development that
involves technology in teaching and curriculum and professional
development delivered through technology;
! improvement of the quality of the teaching force through such
activities as tenure reform, merit pay, and teacher testing in their
subject areas; and
! professional development for principals and superintendents.
Implementation Status. Since the grant allocation procedures of the new
program are largely based on those of the Eisenhower and Class Size Reduction
programs, the distribution of funds to SEAs and LEAs has occurred smoothly and
according to the requirements of the law. That is, a higher proportion of funds have
been directed to large and high-poverty districts. Although the new program allows
for a much wider set of activities, ED reports that most of the Title II funds have been
spent on class size reduction and professional development.86 Districts reported
spending 58% of their Title II funds on teacher salaries to reduce class size and 25%
on professional development. Preliminary evidence from a more recent ED survey
of grantees indicates that funds are being increasingly shifted toward the latter.
Implementation Issues. In a departure from the Eisenhower program, which
targeted professional development primarily toward math and science, districts are
using their Title II funds to support professional development in a number of areas.
ED reported that LEAs are spending 39% of their Title II funds for professional
development in math and science, an equal share (39%) in reading and English, 8%
in history, and 7% in technology. ED officials have stated that this shift is likely a
response to the HQT requirements and the need to reduce the number of out-of-field
teachers in other fields.87
86 U.S. Department of Education, Policy and Program Studies Service, Improving Teacher
Quality in U.S. School Districts: Districts’ Use of Title II, Part A, Funds in 2002-2003,
Policy and Program brief, February 6, 2004, available at [http://www.ed.gov/programs/
teacherqual/uof.doc].
87 November 28, 2005, phone conversation with Robert Stonehill, Deputy Director,
Academic Improvement and Teacher Quality Programs, Office of Elementary and Secondary
Education, U.S. Department of Education.

CRS-48
A shift away from math and science professional development in the Title II
program might concern some observers who see serious deficits in the U.S.
educational system in these areas. The National Academy of Sciences recently
released a report on this issue in which upgrading current math and science teachers’
skills was among its top recommendations.88 The flexibility provided by ED which
eases the HQT requirements for science teachers (mentioned earlier) may be seen by
some as a loophole that will allow districts to hire and retain sub-par teachers.
Section 7. Reading Skills Improvement Grants89
Reading First90
The Reading First program was authorized as part of the Reading Skills
Improvement Grants, Title I-B of the No Child Left Behind Act of 2001 (NCLB).
Reading First was drafted with the intent of incorporating the latest scientific
understanding of what works in teaching reading in order to improve and expand K-3
reading programs, address concerns about student reading achievement, and reach
children at younger ages.
The Reading First program includes both formula grants (states are allocated
funds in proportion to the number of children, aged 5 to 17, who reside within the
state from families with incomes below the poverty line) and targeted assistance
grants to states.91 States then competitively award grants to eligible local educational
agencies (LEAs). LEAs that receive Reading First grants shall use those funds for
the following purposes:
! selecting and administering screening, diagnostic, and classroom-
based instructional reading assessments;
! selecting and implementing a learning system or program of reading
instruction based on scientifically based reading research that
includes the essential components of reading instruction;
88 “Rising Above the Gathering Storm: Energizing and Employing America for a Brighter
Economic Future,” a report by the National Academies Committee on Science, Engineering,
and Public Policy (Washington, DC: National Academies Press, October 2005).
89 This section was written by Gail McCallion. For more information on program
implementation issues, see CRS Report RL33246, Reading First: Implementation Issues
and Controversies, by Gail McCallion.
90 For more information on the structure of the Reading First Program, see CRS Report
RL31241, Reading First and Early Reading First: Background and Funding, by Gail
McCallion.
91 The NCLB specifies that beginning with FY2004, 10% of funds in excess of the FY2003
appropriation or $90 million, whichever is less, is to be reserved for targeted assistance state
grants. Targeted assistance grants are intended to reward schools that are achieving the
goals of increasing the percentage of 3rd graders who are proficient readers and improving
the reading skills of 1st and 2nd graders.

CRS-49
! procuring and implementing classroom instructional materials based
on scientifically based reading research;
! providing professional development for teachers of grades K-3, and
special education teachers of grades K-12;
! collecting and summarizing data to document the effectiveness of
these programs, and to accelerate improvement of reading
instruction by identifying successful schools;
! reporting student progress by detailed demographic characteristics;
and
! promoting reading and library programs that provide access to
stimulating reading material.
Implementation Status. The Reading First program required significant
startup time on the part of states. Because the program is complex and many of its
requirements are new, it took time for states and LEAs to put together the necessary
staff, curriculum, assessment, and evaluation components for the program. By the
end of October 2003, all states and the District of Columbia had received their
FY2002 and FY2003 Reading First awards. The Virgin Islands received its first
Reading First funds in September 2004. Reading First state grants are awarded for
a six-year period, pending a satisfactory midterm review. Puerto Rico’s situation is
unique because it did not spend the first Reading First funds it received (for
FY2003), and it declined funds for FY2004 because of disagreements with ED over
instruction and methods to be employed. Puerto Rico’s application for FY2005 was
not found acceptable by ED. Puerto Rico reapplied for FY2006 funds; however, its
application was not approved. Puerto Rico received the Reading First Advisory
Committee’s comments on its application in November of 2007.92 ED has notified
Puerto Rico that it may revise its application to incorporate responses to the
Committee’s comments and resubmit it for FY2007 funds.
The awarding of the first targeted assistance grants was delayed so that there
would be more states meeting the requirement of having one year of baseline data
and two years of follow-up data showing improvement. The first Reading First
targeted assistance award of $3 million was awarded to Massachusetts in September
of 2005 (out of FY2004 funds). Tennessee was the only state to receive a FY2005
targeted assistance grant; it received $4.81 million. FY2006 awards were given to
Massachusetts ($950,000), Tennessee ($1.4 million), and Virginia ($1.2
million).
92 ED published a notice in the Federal Register on March 1, 2007, announcing the
establishment of the Reading First Advisory Committee. The panel evaluates state Reading
First applications and mid-term progress reports. The committee consists of individuals
selected from each of the following agencies: ED, the National Institute for Literacy, the
National Research Council of the National Academy of Sciences, and the National Institute
of Child Health and Human Development. The committee members will serve for three
years or until the date of reauthorization of the ESEA, whichever comes first.

CRS-50
The NCLB specifies that a mid-term peer review of states’ performance in the
Reading First program be conducted after the completion of the program’s third grant
period (which would have meant a review in the fall of 2005). Because of the time
involved in initial implementation of the program, ED made adjustments to the time
line to provide states with sufficient time to have participated in three grant cycles,
as envisioned by the statute, before undergoing a midterm peer review. ED
established November 2006 as the deadline for states’ submission of their midterm
progress reports.
The Reading First program is required to meet relatively extensive evaluation
and reporting standards. In addition to midterm reviews of states’ performance,
districts are required to track the progress of individual students, and states are
required to submit annual evaluations to ED with data on overall school, district, and
state progress. ED has also contracted to have both an impact and implementation
study of the program conducted. It is anticipated that the first report from the impact
study, which is being conducted by Abt Associates and MDRC, will be available
sometime in 2008. The implementation study is also being conducted by Abt
Associates. The interim implementation report was issued in July of 2006; the final
implementation report is expected to be issued in the summer of 2008.
Implementation Issues. Information from ED’s April 2007 report on state
performance data; the 2006 Reading First Implementation Evaluation: Interim
Report; an October 2007 Center on Education Policy report, Reading First: Locally
Appreciated, Nationally Troubled; and a February 2007 Government Accountability
Office (GAO) report have all provided relatively positive information about states’
and local school districts’ opinions of the impact of the Reading First program on
student achievement.93 ED’s report, The Reading First Annual Performance Report
Data, provided encouraging news, based on state data, about the impact of RF on
student achievement. According to these data, on average, between 2004 and 2006,
the 26 states with baseline data increased the percentage of students meeting or
exceeding proficiency on fluency outcome measures by 16% for 1st graders, 14% for
2nd graders, and 15% for 3rd graders. In addition, these 26 states also increased the
percentage of students meeting or exceeding proficiency on comprehension
outcome measures — by 15% for 1st graders, 6% for 2nd graders, and 12% for 3rd
graders.94 However, state assessment measures and cut-off scores for determining
reading proficiency vary from state to state, making it difficult to draw definitive
conclusions on Reading First’s performance from these data.
93 For more information see CRS Report RL33246, Reading First: Implementation Issues
and Controversies, by Gail McCallion.
94 The Reading First Annual Performance Report Data is available online at
[http://www.ed.gov]. ED has also issued a report providing profiles of state implementation
of Reading First, including data on the level of funding and the numbers of LEAs, schools,
students, and teachers who have participated in the program. This report, titled The Reading
First State Data Profiles, is also available on ED’s website.

CRS-51
Scientifically Based Research Requirements
in the No Child Left Behind Act

The NCLB endorses the use of scientifically based research (SBR) in funded
activities; the act contains over 100 references to the use of SBR in choosing
instructional and assessment programs, professional training programs, and other
NCLB-funded activities. The emphasis is on experimental research,
particularly randomized controlled trials (RCTs).95
Programs in the NCLB affected by the requirement that funded educational
interventions be based on SBR include Title I, Part A grants for the education of the
disadvantaged, Reading First, Early Reading First, Even Start, Literacy Through
School Libraries, Comprehensive School Reform, Improving Teacher Quality State
Grants, Mathematics and Science Partnerships, English Language Acquisition State
Grants, and Safe and Drug-Free Schools and Communities. This discussion focuses
on the application of SBR to the Reading First program.
The NCLB language authorizing Reading First makes clear that the intent of the
program is to require recipients of Reading First funds to implement programs which
are based on scientifically based reading research (SBRR). ED’s application of
SBRR to the Reading First program draws extensively on the work conducted by the
National Reading Panel (NRP). In 2000, the NRP issued a report titled Teaching
Children to Read.
The NRP was convened by the National Institute of Child Health
and Human Development (NICHD) in consultation with ED in response to a
congressional charge to review the literature on reading and use it to assess the
effectiveness of different techniques for teaching reading, and whether these
techniques were ready to be applied to classroom settings. Based on the NRP’s
research, the NCLB incorporated five essential components of reading as
requirements for reading instruction funded under the Reading First program. These
essential components are defined in the NCLB as “explicit and systematic instruction
in — (A) phonemic awareness; (B) phonics; (C) vocabulary development; (D)
reading fluency, including oral reading skills; and (E) reading comprehension
strategies.”96
95 For an in-depth discussion of RCTs, see CRS Report RL33301, Congress and Program
Evaluation: An Overview of Randomized Controlled Trials (RCTs) and Related Issues, by
Clinton Brass, Blas Nunez-Neto, and Erin D. Williams. Some authors argue that in the
context of encouraging basic educational research, SBR must be interpreted more broadly,
in contrast to the more prescriptive definition of SBR contained in the NCLB, “narrowly
conceived for service providers trying to justify their use of federal dollars.” Margaret
Eisenhart and Lisa Towne, “Contestation and Change in National Policy on ‘Scientifically
Based’ Education Research,” Educational Researcher, vol. 32, October 2003.
96 P.L. 107-110, Section 1207. [20 U.S.C. 6367]. CRS Report RL32145, Early Intervention
in Reading: An Overview of Research and Policy Issues, by Gail McCallion.

CRS-52
Implementation Issues.
Criticisms of the Application of Scientifically Based Reading
Research to the Reading First Program. Some criticisms have been raised
regarding ED’s application of SBRR to the Reading First Program. For example,
Robert Slavin, of the Success for All Program, has argued that the NCLB’s
requirement that interventions be based on SBR does not differentiate between
programs that have themselves been rigorously evaluated and those programs that
have not been rigorously evaluated for efficacy, but can cite SBR that supports their
interventions. The Success for All Foundation argued in a letter to the Office of the
Inspector General of the U.S. Department of Education (OIG), that ED has
inappropriately narrowed the definition of scientifically based research in its
implementation of the Reading First program:
In essence, through the implementation of Reading First, the U.S. Department of
Education has narrowed the definition of SBRR to the five “essential
components” of reading as identified by the National Reading Panel. Research
on program efficacy has been ignored. Because Reading First was so closely
managed by the U.S. Department of Education, and because it contains such a
strong focus on the use of scientifically based research, it is paving the way for
how states, districts and schools are coming to understand the meaning of SBR,
and how they will apply it to other Federal programs.97
As a consequence of the alleged “narrowing” of the definition of SBRR, states
have been unnecessarily limited in their choices of reading programs, assessments
and professional development packages, according to critics of ED’s implementation
of Reading First.
Limitations of Existing Research. Some of the controversies that have
surrounded implementation of SBRR in the Reading First program reflect the current
state of SBRR and the difficulties of applying existing research to concrete
educational interventions. Some observers have noted that there are many areas of
education research with few if any RCT studies to draw upon.98
Some have argued that navigating the existing array of resources is difficult for
states and LEAs because much of the research is academic. In addition, although
there is more user-friendly material available than ever before, evaluations of the
application of SBRR to concrete educational interventions are still limited, and there
is no single federal website or resource that currently catalogs and evaluates all the
available user-friendly resources. The following discussion summarizes some of the
resources that are currently available.
Identifying Relevant Resources. A variety of federally funded offices and
resources provide information or technical assistance to states and LEAs, offering
guidance on SBR more broadly, including SBRR. There are also guides intended to
97 Robert Slavin, Letter to U.S. Department of Education, The Success for All Foundation,
May 27, 2005.
98 Lynn Olson, “Law Mandates Scientific Basis for Research,” Education Week, January 30,
2002.

CRS-53
provide user-friendly information on SBR that states and LEAs can access through
ED websites and publications. Online resources include an NCLB website with
information on SBR and related resources, a searchable ERIC database on education
research, and access to educational statistics and National Assessment of Educational
Progress (NAEP) data on ED’s National Center for Education Statistics website.99
The Institute of Education Sciences (IES) has made publications and other resources
available on SBR. In December of 2003 IES published a report titled “Identifying
and Implementing Educational Practices Supported by Rigorous Evidence: A User
Friendly Guide.”
In addition, ED has awarded 20 five-year grants to comprehensive centers to
provide advice to states and LEAs on meeting the requirements of the NCLB. There
are also ten regional centers with functions defined in the Education Sciences Reform
Act of 2002.100
These resources are, however, not all centralized in one location, and relatively
few provide analysis of specific educational instruction or assessment packages that
might meet the SBR requirements of the NCLB. It can be difficult for states and
LEAs to sift through the volume of information that is available and find what they
need to choose effective curriculum and assessment programs.
ED’s IES created the What Works Clearinghouse (WWC) to address this need for
clear, user-friendly information on SBR, including evaluations of specific educational
interventions.101 The WWC publishes reviews of educational interventions that have
SBR to back up efficacy claims on education topics that the WWC has identified as
priorities. Initially, the WWC intended to issue only topic reports, but in May of
2006, the WWC modified its website to include new intervention reports. These
intervention reports have been introduced so that potentially useful information can
be made available as quickly as possible. After an intervention that meets WWC
standards is reviewed, an intervention report is posted on the website. After all such
interventions on a specific topic have been reviewed, a topic report will be posted on
the website. The information provided in intervention reports includes program
descriptions, costs of implementing the programs, and ratings of program
effectiveness — including a category of “potentially positive” for promising results.
Local Control. Perhaps in part because of the difficulties in finding specific
information on SBRR-based educational interventions that meet the requirements of
the NCLB, many states have chosen to rely upon a limited number of instructional,
assessment, and professional training programs. This has raised concerns among some about
what they call the “overprescriptiveness” of ED’s application of SBRR to Reading
First and the potential infringement on states’ and LEAs’ ability to choose curricula.
99 See [http://www.ed.gov/nclb], [http://www.ed.gov/about/pubs/intro/pubdb.html], and
[http://www.nces.ed.gov].
100 The mission of the regional centers includes serving regional needs, disseminating SBR,
providing professional training and technical assistance, and responding to the needs of
stakeholders to ensure the academic success of all students. Responding to Regional Needs
and National Priorities,
Regional Educational Laboratories, 2004 Annual Report.
101 See [http://www.whatworks.ed.gov].

CRS-54
Some argue that this “overprescriptiveness” is not consistent with section 9527 of the
No Child Left Behind Act.102
A 2005 CEP report examined ED’s administration of the state application
process for Reading First grants, among other things.103 The CEP study found that
states were “remarkably consistent” in their choice of programs. It noted that many
states were required to revise their initial application for Reading First grants one or
more times before ultimately having their applications accepted. It found that in their
final accepted applications, almost all states included DIBELS on their approved
assessments list,104 and used the same publication (the Consumer’s Guide) for evaluating
and choosing a reading curriculum.105 A CEP analysis of a sample of original and final
applications from 10 states found that 4 of the 10 switched to DIBELS and the
Consumer’s Guide after the initial review of their applications.106 Additionally, the
CEP study found that state recommendations of specific reading programs appeared
to have influenced districts’ choice of reading programs. The survey of districts
receiving Reading First funds found that half had changed the reading programs they
used in order to qualify for a grant from their state.
In addition, the Reading Recovery Council of North America submitted a
complaint to ED contending, among other things, that ED has infringed on state and
local control over the selection of curricula through its nonregulatory guidance and
discussions with state officials, which, the Council argues, effectively exclude LEAs
from using Reading First funds for one-to-one tutoring interventions such as Reading
Recovery.107
Advocates of Reading First believe that the program needs to be prescriptive in
order to produce significant results. They argue that the success of Reading First will
102 This section states: “(a) GENERAL PROHIBITION — Nothing in this Act shall be
construed to authorize an officer or employee of the Federal Government to mandate, direct,
or control a State, local educational agency, or school’s curriculum, program of instruction,
or allocation of State or local resources, or mandate a State or any subdivision thereof to
spend any funds or incur any costs not paid for under this Act. (b) PROHIBITION ON
ENDORSEMENT OF CURRICULUM — Notwithstanding any other prohibition of Federal
law, no funds provided to the Department under this Act may be used by the Department to
endorse, approve, or sanction any curriculum designed to be used in an elementary or
secondary school.” (Elementary and Secondary Education Act of 1965, Section 9527.)
103 The CEP report is based on state and district surveys and case studies conducted for its
2005 study on the No Child Left Behind Act, an overview of all state Reading First
applications, an in-depth review of 15 randomly selected state applications, and a review of
revisions to state applications based on 10 representative states.
104 DIBELS is the Dynamic Indicators of Basic Early Literacy Skills.
105 Both publications were produced by the University of Oregon.
106 Caitlin Scott and Tom Fagan, Ensuring Academic Rigor or Inducing Rigor Mortis?
Issues to Watch in Reading First, Center on Education Policy (Washington, DC: 2005).
107 The Reading Recovery Council of North America, “Evidence Ignored, Learning Denied:
The Attack on Reading Recovery”, submitted to the Inspector General’s Office of the U.S.
Department of Education, (March 2006), available at [http://www.readingrecovery.com].

be in large part attributable to its strict requirement that programs implemented with
Reading First funds be supported by scientifically based research: “Advocates have
long argued that ‘entitlement’ programs like Title I failed to improve reading scores
because of a lack of quality control on how the money was spent.”108
Three groups representing different reading programs filed separate complaints
with ED’s Office of Inspector General (OIG), asking that the Reading First program
be investigated. The three groups that filed complaints are Dr. Cupp’s Readers and
Journal Writers, Success For All, and the Reading Recovery Council of North
America. In response, the OIG conducted several audits of the Reading First
program. It issued its first report on the federal Reading First program, specifically
on the program’s grant application process, in September 2006. In addition, several
audits of state Reading First programs have been issued, and audits have been
conducted of ED’s administration of the Reading First program and of the RMC
Research Corporation’s Reading First contract.109 Together, these three reports on the
federal Reading First program essentially validated many of the concerns that had been
raised in complaints filed with the OIG. ED concurred with the OIG’s
recommendations in all three reports and has addressed all of them.
In addition, the House Committee on Education and Labor has held oversight
hearings on Reading First, and the Senate Committee on Health, Education, Labor,
and Pensions has issued a report on Reading First technical assistance directors who
had financial ties to publishers.
Section 8. Parental Involvement Requirements110
Requiring or encouraging parents to be involved in decisions affecting their
children’s education, and in that education itself, has been a long-standing goal
of the Elementary and Secondary Education Act (ESEA). One of the purposes of the
No Child Left Behind Act (NCLB) was to continue and expand certain aspects of the
ESEA’s parental involvement provisions, for example by requiring that parents receive
information on school performance. As the House committee report on H.R. 1 (the
bill in which the NCLB originated in the House) points out,
... the No Child Left Behind Act of 2001 expands upon current provisions of the
Elementary and Secondary Education Act, which require schools to collect and
report to the public information on the academic quality of Title I schools, in
order to empower parents with information about their schools. Reporting this
information is crucial to empowering parents to hold schools accountable and
108 Andrew Brownstein and Travis Hicks, “Reading First Under Fire,” Title I Monitor,
Education Funding Research Council, (Thompson Publishing Group, September 2005), p.
4.
109 The state audits were issued on October 3, 2005 (Alabama), October 20, 2006
(Wisconsin), November 3, 2006 (New York), and January 18, 2007 (Georgia).
110 This section was written by Richard N. Apling and Wayne C. Riddle.

getting them involved, and helping fix schools that fail and choose another public
school if their child’s school fails.111
Most of the ESEA parental involvement requirements are contained in Title I,
Part A and are linked to local educational agencies (LEAs) and schools receiving
Title I, Part A funding.112 As a result, most of this section focuses on Title I, Part A
parental involvement requirements; however, examples of other, non-Title I-A
requirements are also mentioned.
ESEA Title I, Part A Requirements
Increasing the involvement of parents in the education of their educationally
disadvantaged children has been a stated goal of ESEA Title I, Part A, since the
beginning of the program in 1965. For many years, representative advisory
committees of parents at the school and LEA level were a major, concrete aspect of
these parental involvement requirements. The statutory requirement for these
committees was dropped under the Education Consolidation and Improvement Act
of 1981, although some schools and LEAs have continued to support such
committees at their own discretion. The parental advisory committee requirement
was dropped, in part, in response to program studies that found the role and authority
of the committees to be ambiguous, leading to occasional tension between parent
groups and school administrators.113
The relatively numerous current ESEA Title I-A statutory requirements for
parental involvement114 are summarized below. They include many broad statements
about the importance of parental involvement activities and their effective
implementation, but comparatively few concrete requirements. Many of the specific
requirements deal more with notifying parents than with more active forms of
involvement or with specific authorities or rights for parents. Of the latter, several apply
only to parents of limited English proficient pupils assigned to language instruction
programs funded under Title I-A.
Section 1118, “Parental Involvement”. Many, but by no means all, of the
relevant Title I-A requirements may be found in Section 1118. Section 1118(a)
requires all LEAs receiving grants under Title I-A to have a written policy on parental
involvement, prepared jointly with parents of pupils participating in the program.
Among other provisions, the policy must: describe how the LEA will involve parents
in the development of the overall LEA plan for Title I-A (Section 1112), and in
school identification and improvement procedures (Section 1116); support school-
111 H.Rept. 107-63, p. 275.
112 The vast majority of LEAs qualify for Title I-A funding. Only those with very few poor
children (fewer than 10) or very low poverty rates (under 2%) fail to qualify. However, only
about 60% of all public schools receive Title I-A funding.
113 See U.S. Office of Education, Compensatory Education Study, A Final Report from the
National Institute of Education, Chapter IV, 1978.
114 Current non-regulatory guidance on the Title I-A parental involvement requirements may
be found at [http://www.ed.gov/programs/titleiparta/parentinvguid.pdf].

level efforts to implement parental involvement activities; coordinate parental
involvement activities under Title I-A with those of other relevant federal
programs;115 and conduct an annual evaluation of the effectiveness of the LEA’s
parental involvement activities in improving the quality of schools receiving Title I-A
funds.
Similarly, each school participating in Title I-A must have a written policy on
parental involvement. Schools as well as LEAs may substitute parental involvement
policies applicable to all parents of pupils in the school for specific Title I-A policies
in meeting these requirements.
Each LEA participating in Title I-A is required to reserve at least 1% of its grant
for parental involvement activities, with at least 95% of these funds distributed to
individual Title I-A schools. However, if 1% of the LEA’s Title I-A grant would be
equal to $5,000 or less, then this requirement does not apply.
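The reservation rule described above can be restated arithmetically. The following is a minimal sketch, in Python, of how the Section 1118 reservation and the school-level pass-through might be computed; the grant amount is hypothetical, and the sketch is illustrative only, not an official allocation formula.

    # Illustrative sketch of the Section 1118 parental involvement reservation.
    # The grant amount below is hypothetical; actual allocations follow ED rules.
    def parental_involvement_reservation(lea_title_ia_grant: float) -> dict:
        """Return the minimum reservation and school pass-through for an LEA."""
        reservation = 0.01 * lea_title_ia_grant   # at least 1% of the Title I-A grant
        if reservation <= 5000:                   # the $5,000-or-less exception
            return {"reservation": 0.0, "to_schools": 0.0}
        to_schools = 0.95 * reservation           # at least 95% passes through to schools
        return {"reservation": reservation, "to_schools": to_schools}

    # Example: an LEA with a hypothetical $2 million Title I-A grant must reserve at
    # least $20,000, of which at least $19,000 is distributed to its Title I-A schools.
    print(parental_involvement_reservation(2_000_000))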
Schools participating in Title I-A are required to convene at least one annual
meeting to which parents of participating pupils are to be invited, to explain the
program’s requirements and the right of parents to be involved. Participating schools
must also offer to parents a “flexible number” of additional meetings, including
“regular meetings” to participate in decisions relating to the education of their
children, if requested by parents. Participating schools must involve parents in
planning, improvement, and review of Title I-A programs, and provide to them
information on the curricula and assessments used at the school.
More specifically, each participating school is to develop a school-parent
compact of shared responsibilities to improve student achievement. The compact is
to: describe the responsibilities of the school and parents for activities to support
children’s learning; and provide for communication through parent-teacher
conferences held at least annually for elementary school pupils, frequent reports to
parents on their children’s progress, and “reasonable” access for parents to school
staff and opportunities for classroom observation.
Participating schools and LEAs are further required to “build capacity” for
parental involvement through activities such as: helping parents understand state
academic content and pupil performance standards and how to help their children
meet them; providing materials and training to parents to help them work with their
children; educating teachers and other school staff in the value of parental
involvement activities; providing literacy training to parents using Title I-A funds,
if necessary and other sources of funding are unavailable; and providing
transportation and child care in order to facilitate participation in parent involvement
activities. In addition, LEAs may establish a districtwide parental advisory council.
All of the parental involvement activities supported or required under Section
1118 are to be provided, “to the extent practicable,” in a format and language that is
115 These include the Reading First program and the Even Start program (see below for
discussions of the parental involvement requirements for these ESEA programs) as well as
Head Start, which is not an ESEA program.

accessible to parents who have disabilities, who have limited English proficiency, or who
are migratory. In states where Parental Information and Resource Centers116 are located,
LEAs and schools are to inform the parents of participating pupils about these
Centers and the services they provide. State educational agencies (SEAs) are to
review LEA parental involvement policies to assure that they meet the requirements
of Section 1118.
Significant Title I-A Parental Involvement Requirements Outside
Section 1118. There are a number of important parental involvement provisions
in portions of ESEA Title I-A other than Section 1118. First, under Section 1111(d),
state plans for Title I-A must include information on how the SEA will collect and
disseminate information on effective parental involvement practices, based on the
“most current research.”
Second, as discussed in Section 4 of this report, under Section 1111(h)(2), states
and LEAs participating in Title I-A must report assessment results and certain other
data to parents and the public through report cards. States are to publish report cards
for the state overall, and LEAs (including charter schools which are treated under
state law as individual LEAs) are to publish report cards for the LEA and individual
schools. The report cards must generally include information on pupils’ academic
performance disaggregated by race, ethnicity, and gender, as well as disability,
migrant, English proficiency, and economic disadvantage status. The report cards
must also include information on pupil progress toward meeting any other
educational indicators included in the state’s adequate yearly progress (AYP)
standards, plus secondary school student graduation rates, the number and identity
of any schools failing to meet AYP standards, and aggregate information on the
qualifications of teachers. The report cards may include additional information, such
as the extent and type of parental involvement in schools, average class size, or the
incidence of school violence. LEA and school report cards are to be disseminated to
parents of public school pupils and to the public at large; there are no specific
provisions regarding dissemination of the state report cards.
Third, under Section 1111(h)(6), “Parents Right-To-Know,” the parents of any
pupil attending a school participating in Title I-A must be provided, upon request,
with information on the professional qualifications of their child’s teachers. The
information provided must include whether the teacher meets state licensing criteria
for the grades and subject areas they teach; whether any such criteria have been
waived for the teacher; and the postsecondary degree(s) held by the teacher, including
their major area(s) of study. The qualifications of any paraprofessionals who serve
their child must be provided to parents, upon request, as well. In addition,
participating schools are required to provide to each parent information on the
performance of their child on state academic assessments, and to notify parents if
their child is taught for four or more consecutive weeks by a teacher who is not
“highly qualified.”117
116 Parental Information and Resource Centers are funded by IDEA grants and run by parent
organizations to provide training and information to parents of children with disabilities.
117 For the definition of “highly qualified” teacher under the ESEA, see Section 6 of this
(continued...)

Fourth, under Section 1112(g), Local Educational Agency Plans, LEAs using
Title I-A funds to provide a language instruction program for limited English
proficient (LEP) pupils must notify the parents of the pupils served by this program
within 30 days of the beginning of the school year (or within two weeks if
identification occurs during the school year).118 The parental notification must
include the basis for identifying their child as LEP, including the assessment method
and the child’s level of English proficiency; the instructional methods that will be
used in the language instruction program, as well as other programs that might be
available; the exit requirements of the language instruction program; how the
program meets the objectives of the individualized education program of the child
(if the child has a disability); and information on the rights of the parents to remove
their child from the program, and to receive guidance on the selection of alternative
language instruction programs. In addition, a school that is using Title I-A funds to
provide a language instruction program for LEP pupils, and that fails to meet the
annual measurable achievement objectives specified under ESEA Title III, Section
3122,119 must separately notify the parents of participating pupils within 30 days of
such failure.
Section 1112(g)(4) also includes a separate series of parental participation
requirements applicable specifically to parents of pupils receiving language
instruction for LEP pupils funded under Title I-A. These essentially duplicate some
of the provisions in Section 1118 applicable to the parents of all participating pupils
(e.g., outreach to inform parents how they may help their children meet state
academic content and achievement standards, or holding meetings with parents),
though with a specific emphasis on helping pupils attain proficiency in English. In
addition, Section 1112(g)(5) prohibits the admission to, or exclusion from, any
federally-assisted education program on the basis of a pupil’s surname or language
minority status.
Fifth, under Section 1116(b)(6), LEAs are required to inform parents of all
pupils attending a school that has been identified for improvement, corrective action,
or restructuring under Section 1116; that is, the school has failed to meet AYP
standards for two or more consecutive years.120 The notice is to include the reasons
for and an explanation of the identification; how the school’s performance compares
to that of other schools in the LEA and state; an explanation of actions being taken
in response to the identification, and how parents can become involved in these
activities; and an explanation of the parents’ right to transfer their child to another
public school or, where relevant, to obtain supplemental educational services for their
117 (...continued)
report.
118 There are virtually identical requirements for LEAs and other eligible entities using funds
under ESEA Title III (Language Instruction for Limited English Proficient and Immigrant
Students) “to provide a language instruction educational program” (§3302).
119 For information on these objectives, see Section 6 of this report as well as CRS Report
RL31315, Education of Limited English Proficient and Recent Immigrant Students:
Provisions in the No Child Left Behind Act of 2001, by Jeffrey J. Kuenzi.
120 For more information, see Section 4 of this report.

child. Similarly, under Section 1116(c)(6), SEAs must inform parents when the LEA
serving their child has been identified for improvement (that is, it has failed to meet
AYP standards for LEAs for two or more consecutive years), the reasons for the
identification, and how parents can become involved in improving the LEA’s
instructional programs. Likewise, under Section 1116(c)(10)(E), SEAs must inform
parents of all children attending schools of an LEA that has been identified for
corrective action (which may be taken at any time after an LEA is identified for
improvement, but must be taken after four consecutive years of failure to meet AYP
standards for LEAs).
Examples of Other ESEA Parental Involvement Requirements
As noted above, Title I-A has the most extensive ESEA requirements for
parental involvement. At the same time, other programs authorized under the ESEA
require some form of parental involvement or permit funds to be spent on parental
involvement activities. The following are some examples.
! The Reading First program (Subpart 1 of ESEA Title I-B). As
discussed in Section 7 of this report, the Reading First program supports
programs “based on scientifically based reading research” for pupils
in kindergarten through 3rd grade. Section 1202(c)(7)(B) permits
LEAs to use funds for certain family literacy programs and to
provide training and assistance to parents to encourage their children
to read.
! The William F. Goodling Even Start Family Literacy Programs
(Subpart 3 of ESEA Title I-B). This program supports efforts to
integrate “early childhood education, adult literacy or adult basic
education, and parenting education into a unified family literacy
program...” (§ 1231(1)) In general, Even Start funds support
“intensive family literacy services that involve parents and children,
from birth through age seven, in a cooperative effort to help parents
become full partners in the education of their children and to assist
children in reaching their full potential as learners.” (§ 1234(a))
! Parental notification requirements under Language Instruction for
Limited English Proficient and Immigrant Students (§3302). As
noted above, LEAs and other eligible entities using funds under
ESEA Title III must notify parents of limited English proficient
pupils of certain information and rights. Among these are the
reasons why their child has been identified as needing a language
instruction educational program and the parents’ rights to decline
enrolling their child in such program and to withdraw their child
from the program if services have already started.
! Parental consent regarding armed forces recruiter access to student
information (ESEA §9528(a)(2)). The ESEA, as amended by the
NCLB, requires LEAs receiving assistance under the ESEA to provide
secondary school students’ names, addresses, and telephone numbers

to military recruiters.121 This section further provides that the
student or the parent of the student may request that this information
“not be released without prior written parental consent,” that the
LEA notify parents of this option, and that the LEA comply with the
parent’s written request.122
Implementation Issues
National studies on the implementation of current parental involvement
provisions are not yet available. Such studies may find issues
similar to those found by past national studies. For example, the most recent
published ESEA Title I-A evaluation that focused, in part, on the impact of the
parental involvement provisions was “Promising Results, Continuing Challenges:
The Final Report of the National Assessment of Title I,” published by the
Department of Education’s Planning and Evaluation Service in 1999.123 Chapter 7
of this report reviews the parental involvement provisions of Title I-A, particularly
those that were newly adopted in the 1994 reauthorization of the ESEA (the
Improving America’s Schools Act). These provisions included the requirements
(described above) for school-parent compacts and, in most cases, the reservation of
at least 1% of LEA Title I-A grants for parental involvement activities. This report
chapter also provides a review of research on the importance and effects of increased
parental involvement in relatively high poverty schools.
Key findings of this study included the following:
! The staff at a high proportion of schools, especially schools with the
highest poverty rates among pupils’ families, found the school-
parent compacts to be helpful in promoting desired behaviors among
pupils, such as homework completion. However, only 75% of a
representative sample of Title I-A schools had actually implemented
this provision.
! Overall, parents remained less involved with their children’s Title I-
A schools than is desirable. Major obstacles to increased
involvement included a failure of many schools to offer outreach and
assistance to parents, lack of time on the part of both school staff and
parents, and lack of education on the part of parents of children
attending high poverty schools.
! School staff found there to be substantial overlaps and duplication
in the parental involvement requirements of multiple federal
programs.
121 Without this provision, LEAs would be prevented from providing such information under
§444 of the General Education Provisions Act (GEPA).
122 For more information, see Section 9 of this report.
123 Available at [http://www.ed.gov/rschstat/eval/disadv/promisingresults/natirpt.pdf].

! In many cases, parents were not receiving the desired level or types
of information from school “report cards.”124
Information from some education advocacy groups suggests that similar
implementation issues are still relevant. In a press release announcing an open letter
to President Bush and Secretary of Education Spellings, the Public Education
Network (PEN) maintained that
[i]n most school districts parents reported having met resistance from school
officials when they tried to get involved, and 75 percent of survey respondents
had not been involved in any NCLB-related activity. By enforcing provisions
already in the law, the federal government can send a strong signal to states and
school districts that parents can and should be active partners in school
improvement, and can build public ownership of the schools.125
Another press release, issued jointly by PEN and Campaign for Fiscal Equity, Inc.
(CFE), reported on testimony by parents in the New York City area. Among other
concerns, PEN and CFE reported that
[p]arents expressed frustration over the lack of communication and timely
information from schools about school performance and services available to
students. While NCLB requires reporting of school and student performance
through annual report cards, parents testified that school information rarely
reaches parents, and when it does, it is often late and difficult to understand. In
particular, parents expressed the need for timely information about supplemental
educational services and recommended that school data be made available in
multiple languages. Parents also testified that they often feel unwelcome by
administrators at the school and district levels, making it difficult for them to get
involved.126
Since one rationale for current parental involvement provisions is to “empower”
parents so that they can influence school improvement, a central implementation
issue is the degree to which parents’ involvement improves schools and student
achievement. The Washington Post reported on a recent study based on data
from 257 California elementary schools with high numbers of low-income pupils.
The study found that parent involvement is positively correlated with student
achievement. However, the study also found that other factors (most of which are
also emphasized in current law), such as emphasizing student achievement, aligning
124 It should be noted that this report was published before the enactment (under the No
Child Left Behind Act of 2001) or implementation of the school and LEA report card
requirements discussed earlier in this report.
125 “American Public Calls on Bush Administration, Congress To Strengthen Public
Information & Involvement in NCLB & To Hold States Accountable for Progress,” press
release from the Public Education Network, March 16, 2005, available at
[http://www.publiceducation.org/doc/press_releases/march16_2005.doc].
126 “New Yorkers Speak Out on No Child Left Behind Act Bring Community Voice to
Federal Law,” press release, Public Education Network and Campaign for Fiscal Equity,
Inc., September 30, 2005, available at [http://www.publiceducation.org/pdf/NCLB/hearings/
NY_News_Release.pdf].

curriculum with state academic standards, and providing experienced teachers and
principals, are more highly correlated with achievement test scores.127
Section 9. Military Recruitment
at Secondary Schools128
The Elementary and Secondary Education Act (ESEA), as amended by the No
Child Left Behind Act of 2001 (NCLBA; P.L. 107-110), contains several provisions
requiring local educational agencies (LEAs) receiving funding under the ESEA to
provide military recruiters with the same access to secondary school students as they
provide to institutions of higher education (IHEs) or prospective employers (Section
9528).129 For example, if postsecondary institutions are permitted to come to the
school to provide information to students about their programs, military recruiters
must be afforded the same opportunities. In addition, upon request by military
recruiters or IHEs, LEAs receiving ESEA funding must provide access to secondary
school students’ names, addresses, and telephone numbers, unless a parent or the
student has opted out of providing this information. Failure to meet these
requirements may result in the loss of ESEA funding. An exception to these
requirements may be granted, however, to a private secondary school that maintains
religious objections to military service.
Under the Family Educational Rights and Privacy Act (FERPA), LEAs are
generally required to obtain written consent from parents prior to disclosing
personally identifiable information from a child’s education records. However, LEAs
are permitted to disclose data designated as “directory information” without prior
written consent, unless a parent or the student has specifically asked the LEA not to
release the information.130 Under FERPA, directory information, such as students’
names and addresses, may be provided to outside organizations, such as yearbook
publishers and military recruiters, if the LEA chooses to do so. FERPA does require
127 The study was led by the nonprofit EdSource group in Mountain View, CA. (“Parents’
Effect on Achievement Shaky,” Washington Post, November 22, 2005, p. A10.)
128 This section was written by Rebecca R. Skinner.
129 Similar requirements are also contained in 10 U.S.C. § 503, as amended by Section 544
of the National Defense Authorization Act for Fiscal Year 2002 (P.L. 107-107). While there
are some differences between this provision and those contained in the ESEA, both statutes
are designed to accomplish the same goal. The provisions authorized by NCLBA and 10
U.S.C. § 503 are unrelated to similar provisions at the postsecondary level which require
colleges and universities that receive federal funds to allow military recruiters on campus.
The requirements at the postsecondary level are commonly referred to as the Solomon
Amendment. For more information on the Solomon Amendment, see CRS Report RS22405,
Military Recruiting and the Solomon Amendment: The Supreme Court Ruling in Rumsfeld
v. FAIR, by Charles V. Dale.
130 Directory information is information that is generally not considered harmful or an
invasion of privacy if released. (U.S. Department of Education, October 9, 2002, “Family
Educational Rights and Privacy Act (FERPA) Model Notice for Directory Information,”
available online at [https://www.ed.gov/policy/gen/guid/fpco/hottopics/ht10-09-02a.html].)

an LEA to notify parents about the types of student information the LEA releases
publicly.131 This notice must include an explanation of a parent’s right to request that
directory information not be disclosed to a third party without prior written consent.
LEAs must also notify parents that they routinely disclose the names, addresses, and
telephone numbers of secondary school students to military recruiters, unless a parent
has requested that such information not be distributed without written consent.
Parents must be notified about how to opt out of the public disclosure of directory
information, including the process for doing so and associated deadlines.132 It should
be noted that even if an LEA does not disclose any directory information to any third
parties, it is still required to provide military recruiters with access to secondary
students’ names, addresses, and telephone numbers.133 Parents must be notified of
their option to opt out of this disclosure of information, as well.
Implementation Status. The Department of Defense (DOD) has developed
a national high school database to track whether military recruiters are provided with
access. As of fall 2002, 95% of the approximately 22,000 secondary schools in the
United States were providing military recruiters with student access in compliance
with statutory requirements.134 In a spring 2003 interview about military recruiters
and high school access, a DOD spokeswoman indicated that the Department was aware
of only six high schools that were not providing access to military recruiters.135 Even
prior to the passage of the NCLBA, military recruiters generally had access to high
schools, with only 12% of high schools denying recruiters access to students’ directory
information.136
DOD has also developed a database that collects personal information (e.g., contact
information) on individuals who meet age and minimum school requirements for
military service, in order to support DOD’s efforts to recruit for the nation’s all-
volunteer military. Although DOD has been able to populate its database, the
database has raised controversy and been the subject of a lawsuit.137 In May 2005, DOD
131 For more information about parental notification requirements, see [http://www.ed.gov/
policy/gen/guid/fpco/pdf/ht100902b.pdf].
132 An LEA may provide a single notice that informs parents about the types of information
that are publicly released, public disclosure of student information to military recruiters, and
process for requesting that information not be disclosed without prior written consent. (U.S.
Department of Education, October 9, 2002, “Policy Guidance - Access to High School
Students and Information on Students by Military Recruiters,” available at
[http://www.ed.gov/policy/gen/guid/fpco/hottopics/ht-10-09-02a.html]. Hereafter referred
to as ED, Policy Guidance.)
133 ESEA, § 9528.
134 ED, Policy Guidance.
135 Ibid.
136 “Military Recruiters Meet Pockets of Resistance,” Education Week, April 23, 2003. Also
see “Military Faces Parental Counterattack: High School Recruitment, a Longtime
Tradition, Raises Worries in Wartime,” Washington Post, November 1, 2005, p. B01.
137 The New York Civil Liberties Union (NYCLU) sued the DOD on behalf of several high
school students after the JAMRS database began collecting, maintaining, and distributing
(continued...)

issued a Federal Register announcement regarding the proposed database referred to
as the Joint Advertising and Market Research Recruiting (JAMRS) Database.138 The
system was designed “to provide a single central facility within the Department of
Defense to compile, process and distribute files of individuals who meet age and
minimum school requirements for military service. The information will be provided
to the Services to assist them in their direct marketing recruiting efforts.”139
The announcement indicated that the JAMRS Database would include data on
high school students ages 16-18, current college students, and Selective Service
System registrants, as well as individuals who had taken the Armed Services
Vocational Aptitude Battery (ASVAB) test and individuals who had responded to
various paid/non-paid advertising campaigns seeking enlistment information since
July 1992. Also included in the database were military personnel on active duty or
in the reserves, individuals in the process of enlisting, and individuals who have
asked to be eliminated from any future recruiting lists. The database was planned
to include, among other items, students’ contact information, social security numbers,
ethnicity, high school name, and grade point averages. Individual data were to be
maintained for five years.
In January 2007, DOD issued a second Federal Register announcement to
provide “further explanation and clarification of the manner in which the JAMRS
Database is maintained” in response to comments received during the comment
period for the May 2005 Federal Register announcement.140 The following are
examples of changes or clarifications that were made regarding the database.
! Student data will only be stored for three years.
! Social security numbers will only be collected through the Selective
Service System and not from any other governmental or private
database.
! Student information will be used for military recruiting purposes and
will not be disseminated to law enforcement, intelligence, or other
agencies.
! A process will be available for students to have their names removed
from the JAMRS list provided to the Services for recruiting
purposes.141
137 (...continued)
individuals’ personal information.
138 Federal Register, vol. 70, no. 98, 29486-29487. A subsequent Federal Register notice
(vol. 72, no. 5, 952-956) changed the name of the database to the Joint Advertising, Market
Research & Studies Recruiting Database. The acronym, JAMRS, was not changed.
139 Federal Register, vol. 70, no. 98, 29486.
140 The NYCLU notes that DOD made changes to the JAMRS database to settle the lawsuit
it filed on behalf of several high school students after the JAMRS database began collecting,
maintaining, and distributing individuals’ personal information. For more information, see
[http://milrec.nyclu.org/archive/00000016.html].
141 Specific information on how to opt out of the JAMRS database is available at
(continued...)

Implementation Issues. Although there has been general compliance with
requirements that high schools grant access to military recruiters, there has also been
confusion and controversy over the implementation of the requirements.142
According to the U.S. Department of Education (ED), two areas of concern have
focused on the application of 10 U.S.C. § 503 and the “opt out” provision in the
ESEA.143 Requirements similar to those included in the ESEA for the release of
student information are included in 10 U.S.C. § 503, as amended by Section 544 of
the National Defense Authorization Act for Fiscal Year 2002 (P.L. 107-107).144 In
addition, 10 U.S.C. § 503 applies only to LEAs (including private secondary schools)
that receive funds under the ESEA. Under 10 U.S.C. § 503, the governing body of
an LEA (e.g., school board) could vote to have a policy to deny military recruiters
access to students or students’ “directory information.” The ESEA requirements on
providing access to military recruiters do not include this policy exemption. Thus,
while denying access to military recruiters would be permitted under 10 U.S.C. § 503
and not subject the LEA to any sanctions,145 the action would violate the statutory
requirements of the ESEA and the LEA could be subject to sanctions by ED.
The second major issue, according to ED, focuses on compliance with the
provision specifying that a secondary school student or the parent of the student may
request that directory information not be released without prior written parental
consent.146 According to ED, this provision has been “misapplied” by LEAs that
require written parental consent before they will provide information to military
recruiters, thereby creating an “opt in” rather than an “opt out” policy. Statutory
requirements do not permit LEAs to establish an “opt in” policy.
LEAs have interpreted the “opt out” provision in different ways regarding what
type of notification is provided, how a parent’s response to notification is interpreted,
and whether they have actually implemented an “opt in” policy rather than an “opt
out” policy. For example, some schools have provided a general notice to parents
141 (...continued)
[http://www.defenselink.mil/sites/jamrs_optout.html].
142 There are also related legal concerns. For example, critics of the policy have questioned
whether the provisions violate a student’s right to privacy. For more information on legal
issues related to the military recruitment policy, see CRS Report RS22362, Military
Recruitment Provisions Under the No Child Left Behind Act: A Legal Analysis, by Jody
Feder.
143 U.S. Department of Education, July 2, 2003, “Key Policy Letters Signed by the
Education Secretary or Deputy Secretary,” available at [http://www.ed.gov/print/policy/
gen/guid/secletter/030702.html].
144 For more information about 10 U.S.C. § 503, see [http://www.ed.gov/policy/gen/guid/
fpco/pdf/ht100902b.pdf].
145 Under 10 U.S.C. § 503, within 120 days a senior military officer must visit an LEA
failing to comply with the statutory requirements. If the access problem cannot be resolved,
the state governor is notified. If after a year, the Secretary of Defense determines that the
LEA is denying recruiting access to at least two of the armed forces, the issue is reported
to Congress.
146 ESEA, Section 9528.

about their rights to opt out of the release of student directory information without
mentioning how the information will be used, while other schools have issued more
explicit notices informing parents that student information may be shared with
military recruiters unless they opt out. In some cases, a lack of response to the
notification of the opportunity to opt out is interpreted as a willingness to have
information released, while other LEAs interpret a lack of response as an indication
that parents do not want to have their children’s information released.147 The latter
interpretation requires parents to “opt in” to have information released.
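To make the two readings concrete, the following minimal sketch, in Python, contrasts the interpretations of parental non-response described above: under an "opt out" reading, information is releasable unless a parent affirmatively objects, while under an "opt in" reading, silence is treated as refusal. The field names and values are hypothetical and are not drawn from any LEA's actual procedures.

    # Hypothetical illustration of how LEAs have interpreted parental non-response.
    # parent_response is None when the parent never returned the notification form.
    def releasable_opt_out(parent_response):
        # "Opt out" reading (consistent with the statute): release unless the
        # parent affirmatively asked that the information be withheld.
        return parent_response != "withhold"

    def releasable_opt_in(parent_response):
        # "Opt in" reading (which ED considers a misapplication): release only
        # if the parent affirmatively consented.
        return parent_response == "consent"

    print(releasable_opt_out(None))  # True:  silence treated as permission
    print(releasable_opt_in(None))   # False: silence treated as refusal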
Some parents and organizations have criticized schools for failing to make the
opt out policy clearer to parents. For example, the National Parent-Teacher
Association (PTA) is asking that statutory language be changed to require that
parents provide explicit permission for military recruiters to access students’
information.148 Some schools and school districts are examining their current
policies on military recruiters.149 For example, Tucson Unified School District in
Arizona established a policy limiting military recruiters to one visit per month at each
school in response to complaints received from parents about the number of days
military recruiters were spending at schools.150 The Seattle school district is also re-
evaluating its policy with respect to military recruiters after a parent-teacher-student
association at a local high school passed a resolution stating that military recruiters
were not welcome at the school.151
Section 10. Participation of Children
Enrolled in Private Schools152
Under the ESEA, services are provided to private school students according to
the “child benefit” model. Accordingly, children enrolled in private schools may
benefit from publicly-funded services, yet funding for and the provision of these
services remain under public control. Children enrolled in private elementary and
secondary schools have been eligible to be served under the ESEA in some capacity
since its inception in 1965.153 The NCLB made a number of amendments to the
ESEA requirements applicable to the participation of children enrolled in private
schools. The most significant changes address how services to eligible children must
147 “Schools and Military Face Off; Privacy Rights Clash With Required Release of Student
Information,” Washington Post, June 19, 2005.
148 “Military Faces Parental Counterattack: High School Recruitment, a Longtime Tradition,
Raises Worries in Wartime,” Washington Post, November 1, 2005.
149 “Schools and Military Face Off; Privacy Rights Clash With Required Release of Student
Information,” Washington Post, June 19, 2005.
150 “Recruiting in Schools, a Priority for Military, Is Targeted by Critics,” Education Week,
June 22, 2005.
151 Ibid. Also see “Schools and Military Face Off; Privacy Rights Clash With Required
Release of Student Information,” Washington Post, June 19, 2005.
152 This section was written by David P. Smole.
153 P.L. 89-10, § 205(a)(2).

be arranged between LEAs and the private schools in which eligible children are
enrolled; the specific programs under which services must be provided; and how the
effectiveness of these services must be assessed. Also, in response to the U.S.
Supreme Court’s ruling in Agostini v. Felton,154 which permits ESEA-funded services
to be provided on the premises of private religious schools, ESEA Title I-A funding
for capital expenses to support the provision of equitable services to private school
students on neutral sites was terminated.
Private school students are eligible to be served under the following ESEA
programs: Title I-A (Education for the Disadvantaged), Title I-B-1 (Reading First),
Title I-B-3 (Even Start Family Literacy), Title I-C (Migrant Education), Title II-A
(Teacher and Principal Training and Recruiting Fund), Title II-B (Mathematics and
Science Partnerships), Title II-D (Enhancing Education Through Technology), Title
III-A (English Language Acquisition, Language Enhancement and Academic
Achievement), Title IV-A (Safe and Drug-Free Schools and Communities), Title IV-
B (21st Century Community Learning Centers), Title V-A (Innovative Programs), and
Title V-D-6 (Gifted and Talented Students). Under these programs, services typically
are provided to private school students either directly by the LEA responsible for the
geographic area in which a private school student resides, or by a third-party
contractor. LEAs must consult with private school officials as they establish the
terms according to which private school students will be served.
Key changes under the NCLB to the requirements for the participation of
children enrolled in private schools in ESEA programs include the following:
! Expanding the equitable participation requirement to apply to the
teachers and families of eligible private school children with respect
to services and activities developed pursuant to ESEA § 1118
(Parental Involvement) and § 1119 (Qualifications for Teachers and
Paraprofessionals);155
! Expanding the topics and strengthening the requirements according
to which LEAs must consult with private school officials during the
design and development of the ESEA programs under which they
will serve private school children;
! Requiring documentation of private school officials’ affirmation that
required consultation with the LEA has taken place, and establishing
procedures through which private school officials may raise
procedural complaints through the SEA;
! Modifying the assessment provisions for Title I-A services to private
school pupils to require that services be academically assessed, and
that the results of these assessments be used to improve services; and
154 521 U.S. 203 (1997). Agostini v. Felton overturned a previous ruling by the U.S.
Supreme Court, Aguilar v. Felton, 473 U.S. 402 (1985), which had held that the provision
of ESEA services to students in private religious schools by public school teachers
necessitated an excessive entanglement between church and state in violation of the
Establishment Clause of the First Amendment to the Constitution.
155 This does not mean that the “highly qualified” requirements for teachers and
paraprofessionals apply to private school staff.

! Adding new requirements specifying how eligible private school
children may be counted to ensure the provision of equitable
services.
Implementation Status. Following the enactment of the NCLBA, the
Department of Education issued ESEA Title I regulations in December 2002,156 and
non-regulatory guidance in October 2003,157 that address the provision of Title I
services to private school children. ED also issued non-regulatory guidance in
August 2005158 on the provision of services to private school children under ESEA
programs covered under Title IX; and in August 2002 on the provision of services
under Title V-A.159 In implementing the ESEA, ED has taken steps to ensure that
states and LEAs are adhering to the requirements to serve private school students.
For example, ED’s Student Achievement and School Accountability (SASA)
Programs Team has conducted compliance reviews of SEA implementation of
NCLBA requirements — including those pertaining to the equitable participation of
private school children.160 Also, ED’s Office of Inspector General has conducted
audits of LEA compliance with NCLBA requirements to provide equitable services
to private school students.161 According to data from Consolidated State
Performance Reports, in school year 2004-2005, of the more than 20 million students
served under Title I, 187,951 (0.94%) were students enrolled in private schools.162
156 34 CFR §§ 200.64-200.67.
157 U.S. Department of Education, Office of Innovation and Improvement, Office of Non-
Public Education, Title I Services to Private School Children: Non-Regulatory Guidance,
October 17, 2003.
158 U.S. Department of Education, Office of Innovation and Improvement, Office of Non-
Public Education, Equitable Services to Eligible Private School Students, Teachers, and
Other Educational Personnel: Non-Regulatory Guidance, August 2005.
159 U.S. Department of Education, Office of Innovation and Improvement, Guidance for Title
V, Part A of the Elementary and Secondary Education Act, as reauthorized by the No Child
Left Behind (NCLB) Act (State Grants for Innovative Programs), August 2002.
160 U.S. Department of Education, Office of Elementary and Secondary Education, Student
Achievement and School Accountability Program (SASA) Monitoring Plan for Formula
Grant Programs for October 1, 2005 to September 30, 2006 (see item 3.3, Within District
Allocation Procedures), available at
[http://www.ed.gov/admins/lead/account/monitoring/indicators0506.pdf].
161 See, for example, U.S. Department of Education, Office of Inspector General, 2004 Audit
Reports, Office of Elementary and Secondary Education, Detroit City School District’s
administration of Title I, Part A of the Elementary and Secondary Education Act of 1965,
as amended (the Act), Set-Aside programs for the period July 1, 2002, through May 31,
2003 (ACN# A05D0021), available at
[http://www.ed.gov/about/offices/list/oig/areports2004.html].
162 U.S. Department of Education, Institute of Education Sciences, National Center for
Education Evaluation and Regional Assistance, National Assessment of Title I, Final
Report: Volume I: Implementation, 2007, p. 153.

CRS-70
Implementation Issues. Participating LEAs have been required to provide
equitable services to eligible private school students under the ESEA since 1965.
However, despite both the long history of the requirement and the recent
strengthening of the provisions for LEAs to consult with private school officials, it
appears that timely consultation continues to remain a concern. In response, ED and
many SEAs have highlighted the equitable participation requirements to address
concerns about consultation and the availability of services.163 The amount of
funding available to serve eligible private school students under Title I-A also has
been raised as a concern. This is due to the statutory requirement (addressed earlier)
that LEAs reserve up to 20% of their Title I-A allocation to cover the costs of
transportation for public school choice and supplemental educational services for
students enrolled in public schools identified for improvement, corrective action, or
restructuring. Depending on how LEAs pay for school choice transportation and
supplemental educational services, this requirement may result in proportionally less
funding being available for services to private school students than before enactment
of the NCLBA, because private school students are not eligible for these services.
Several provisions of the ESEA provide that in instances where equitable services are
not provided to eligible private school students — either because of state
constitutional prohibitions or the failure of LEAs to comply — the LEA may be
“bypassed” and services provided through a third party.164 Bypass arrangements for
certain ESEA programs are currently used in Missouri, Nebraska, and Virginia.
Section 11. Unsafe School Choice Option165
The NCLBA established a new Unsafe School Choice Option (USCO) policy
under ESEA Title IX-E-2, § 9532. The USCO policy is administered by the U.S.
Department of Education (ED), Office of Safe and Drug-Free Schools. Under the
USCO policy, in order to be eligible to receive ESEA funding, states are required to
establish statewide policies under which students who attend persistently dangerous
public elementary or secondary schools, or who become victims of a violent crime
while in or on the grounds of the public elementary or secondary schools they attend,
must be offered the opportunity to transfer to another public school within the same
LEA. Each year, states must certify their compliance with USCO requirements prior
to receiving ESEA funding for the next year.
ED has issued non-regulatory guidance outlining the steps that states must take
to comply with the USCO policy. These steps include the following:
! Establish a state USCO policy,
! Identify persistently dangerous schools,
! Identify types of offenses that are considered to be violent criminal
offenses,
163 “Private School Consultation Takes More Than Two Days’ Notice, ED Says,” Title I
Monitor, vol. 8, no. 9 (September 2003), pp. 1, 9-12.
164 See, for example, ESEA §§ 1120(e), 1307, 5142(i), and 9502.
165 This section was written by David P. Smole.

! Provide a safe public school choice option, and
! Certify compliance with USCO.166
States are required to develop their USCO policies in consultation with a
representative sample of LEAs within the state.
Implementation Status. States were required to implement the USCO
beginning with the 2002-2003 school year. Each state’s USCO policy is somewhat
different. Most states establish some threshold number of violent offenses, relative
to school enrollment, that must be exceeded for either two or three consecutive years
in order for a school to be designated as persistently dangerous. However, the crimes
or types of crimes that count as violent offenses vary considerably from state to state.
Some state policies reference primarily felony offenses (e.g., homicide, manslaughter,
aggravated assault, or sexual assault). Others also reference violations of weapon
possession laws (e.g., the Gun-Free Schools Act) or drug possession laws. Some state
policies also consider student expulsions for offenses such as drug or alcohol
possession, or violence.167
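Because each state sets its own criteria, any computational illustration must be hypothetical. The sketch below, in Python, shows how one common pattern noted above, a threshold rate of violent offenses relative to enrollment that must be exceeded for consecutive years, might be applied; the rate, the offense counts, and the two-year requirement are illustrative assumptions, not any state's actual policy.

    # Hypothetical "persistently dangerous" test: more than a threshold rate of
    # violent offenses per 100 enrolled students in each of two consecutive years.
    # The threshold and year requirement are illustrative; real criteria vary by state.
    def persistently_dangerous(yearly_offenses, yearly_enrollment,
                               rate_per_100=3.0, consecutive_years=2):
        rates = [100 * o / e for o, e in zip(yearly_offenses, yearly_enrollment)]
        recent = rates[-consecutive_years:]
        return len(recent) == consecutive_years and all(r > rate_per_100 for r in recent)

    # A hypothetical school of 500 students with 20 and then 25 violent offenses in its
    # two most recent years (rates of 4.0 and 5.0 per 100 students) would meet this test.
    print(persistently_dangerous([10, 20, 25], [500, 500, 500]))  # True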
Limited information is available on both the schools determined to be
persistently dangerous and the students transferring to different schools under the
USCO policy. However, it does appear that differences in the criteria used by states
to identify schools as persistently dangerous may be a factor that has led to
considerable variation among states in the number of schools determined to be
persistently dangerous. The U.S. Department of Education, Office of Safe and Drug-
Free Schools reports that for 2003-2004, 47 schools were identified as persistently
dangerous (5 states); for 2004-2005, 39 schools were identified as persistently
dangerous (4 states); and for 2005-2006, according to preliminary reports, there were
41 schools identified (7 states).168 A review of the implementation of the USCO
policy found that in Pennsylvania (under whose USCO policy there has been a
comparatively large number of determinations), during the 2003-2004 school year, 75
students in the Philadelphia school district transferred from the 27 schools identified
as persistently dangerous, and 58 students transferred from the one school so
identified in the Chester-Upland school district.
Implementation Issues. An examination by the Education Commission of
the States (ECS) of states’ implementation of USCO requirements reveals that by
March 2004, nearly all states had established criteria for identifying unsafe schools,
166 U.S. Department of Education, Unsafe School Choice Option: Non-Regulatory Guidance,
May 2004, p. 6, available at [http://www.ed.gov/policy/elsec/guid/unsafeschoolchoice.pdf].
167 For a compilation of criteria used by the various states to identify persistently dangerous
schools, see Education Commission of the States, Persistently Dangerous School Criteria,
compiled by Gloria Zradicka, September 2004, available at
[http://www.ecs.org/clearinghouse/52/98/5298.pdf].
168 U.S. Department of Education, Safe and Drug-Free Schools and Communities Advisory
Committee, Unsafe School Choice Option Report, 2006, at [http://www.ed.gov/about/
bdscomm/list/sdfscac/topics.html#Unsafe].

and had implemented policies under which students in unsafe schools and students
who were victims of violent crimes could transfer to other public schools within the
same LEA.169 The U.S. Department of Education, Office of Inspector General (OIG),
has examined five states for compliance with the USCO requirements and has found
that, in general, four of these states (California, Georgia, Iowa, and New Jersey) are
complying with the requirements of the law. (A common weakness identified in the
audit findings of these four states was inconsistency in the implementation of the
USCO by LEAs.) The OIG found that Texas had not adequately implemented the
USCO policy at the state and LEA levels.170
While, as noted above, relatively few schools have been identified as unsafe,
school safety remains an important concern for students and their families. For
example, according to the School Survey on Crime and Safety (SSOCS), the
percentage of schools experiencing one or more violent incidents (e.g., rape, sexual
battery other than rape, physical attacks or fights with or without a weapon, threat of
physical attack with or without a weapon, and robbery with or without a weapon)
increased from 71% in 1999-2000 to 81% in 2003-2004.171 In 2003-2004, 18% of
schools reported at least one serious violent offense (e.g., rape, sexual battery other
than rape, physical attack or fight with a weapon, threat of physical attack with a
weapon, and robbery with or without a weapon).172 The SSOCS survey found that
on a per-student basis, middle schools had the highest rates of crime, with 53 violent
crimes per 1,000 students, versus 28 violent crimes per 1,000 students in both
elementary schools and high schools.173 Given these
findings, it appears that the small number of schools identified as persistently
dangerous may be a result in part of the high thresholds for identification set by many
states, which often must be met for two or three consecutive years.
169 Education Commission of the States, ECS Report to The Nation: State Implementation
of the No Child Left Behind Act, Indicator 4 — Safe Schools (2004), available at
[http://www.ecs.org/html/Special/NCLB/ReportToTheNation/docs/Indicator_4.pdf].
170 U.S. Department of Education, Office of Inspector General, 2005 Audit Reports, Safe
and Drug-Free Schools, (ACN#s: A09E0025, A04E0007, A07E0027, A03E0008, and
A06E0028), available at [http://www.ed.gov/about/offices/list/oig/areports.html].
171 U.S. Departments of Education and Justice, Indicators of School Crime and Safety: 2006
(NCES 2007-003/NCJ 214262), by R. Dinkes, E.F. Cataldi, G. Kena, and K. Baum
(2006), p. 24, at [http://www.nces.ed.gov/pubs2007/2007003.pdf].
172 Ibid.
173 Ibid.