Opinion
Civil Action No. 6:71-CV-5281 WWJ.
July 24, 2008.
Andrew Ryan Cogar, U.S. Dept. of Justice, Washington, DC, for Plaintiff.
MEMORANDUM OPINION
Before the Court for consideration is Intervenors', GI Forum's and the League of United Latin American Citizens' (LULAC), Motion to Amend Findings of Fact and Conclusions of Law and to Alter or Amend Judgment in the above numbered and styled civil action pursuant to Federal Rules of Civil Procedure 52(b) and 59(e). (Docket No. 730.) The Court entered its Memorandum Opinion and Judgment on July 30, 2007. (Docket No. 729.) Intervenors timely filed their motion on August 13, 2007. Intervenors claim that the Court committed manifest errors of law and fact by "(1) concluding that, under the EEOA, the failure of language programs for LEP students at the secondary level can be ignored . . . if language programs at the elementary level demonstrate success . . . (2) determining that [Intervenors] bear the burden of identifying [alternative] evaluation. . . .," and (3) the blanket claim that the Court committed manifest errors of law and fact by "denying all relief entitled to Plaintiff-Intervenors under the EEOA. . . ." (Intvs.' Mot. Amend 4.)
Federal Rule of Civil Procedure 52(b) provides that "[o]n a party's motion filed no later than 10 days after the entry of judgment, the court may amend its findings — or make additional findings — and may amend the judgment accordingly." Rule 59(e) provides that "[a] motion to alter or amend a judgment must be filed no later than 10 days after the entry of the judgment." The purpose of both Rule 52(b) and Rule 59(e) is to allow courts to "correct manifest errors of law or fact." Templet v. Hydrochem, Inc., 367 F.3d 473, 479 (5th Cir. 2004) (addressing Rule 59(e)); Fontenot v. Mesa Petroleum Co., 791 F.2d 1207, 1219 (5th Cir. 1986) (addressing Rule 52(b)).
Under Rule 52(b), rulings on motions to amend findings are committed to the sound discretion of the district court. 9 James Wm. Moore et al., Moore's Federal Practice § 52.60[2] (3d ed. 2000); 9C Charles Alan Wright et al., Federal Practice and Procedure § 2582 (3d ed. 1998). "[A] party may move to amend the findings of fact even if the modified or additional findings in effect reverse the judgment. 'If the trial court has entered an erroneous judgment, it should correct it.'" Fontenot, 791 F.2d at 1219 (quoting 5A James Wm. Moore et al., Moore's Federal Practice ¶ 52.11 (2d ed. 1985)). This directive to correct erroneous judgments appears particularly clear where, as here, the parties have not contributed to the court's error. See Templet, 367 F.3d at 479 (cautioning against granting motions to amend based upon evidence available at trial but not proffered, relitigation of old issues, or to secure a rehearing on the merits); Fontenot, 791 F.2d at 1219 (same). The Court GRANTS the motion in order to correct its erroneous judgment, which was based upon manifest errors of law and fact. In the exercise of its discretion and for purposes of judicial economy, the Court also reviews and amends the clear and manifest errors in its findings of fact and conclusions of law that relate to the Court's Modified Order, which were not challenged in Intervenors' Motion to Amend. Golden Blount, Inc. v. Robert H. Peterson Co., 438 F.3d 1354, 1358 (Fed. Cir. 2006) (holding that "a Rule 52(b) motion provides the district court discretion to amend any of its own findings"); 9 James Wm. Moore et al., Moore's Federal Practice § 52.60[2] (3d ed. 2000) ("The court, in the exercise of its discretion, may also review and amend, any of its own findings and conclusions.").
Regarding Rule 59(e), a district court has considerable discretion to alter or amend a judgment but not limitless discretion. Templet, 367 F.3d at 479. In determining whether to grant a Rule 59(e) motion, a court must strike the proper balance between the need to bring litigation to an end and the need to render just decisions on the basis of all the facts. Id. As new persuasive authority demonstrates a clear and manifest error of law in integral conclusions of the Court, and as the Court committed other clear and manifest errors in its conclusions of law and findings of fact, the Court finds that the need to render a just decision on the basis of all the facts vastly outweighs the momentary delay in concluding the litigation before this Court. Accordingly, the Court GRANTS the motion in order to correct clear and manifest errors of law and fact upon which the judgment is based. 11 Charles Alan Wright et al., Federal Practice and Procedure § 2810.1 (stating that one ground on which a Rule 59(e) motion may be granted is if it is "necessary to correct manifest errors of law or fact upon which the judgment is based").
To conform to the opinion set out below, the Court amends its findings of fact and conclusions of law and alters its judgment. For the sake of clarity, no portion of the previous July 30, 2007 Memorandum Opinion or the Order attached thereto has been retained; the previous opinion and order are, in effect, vacated in full. After reconsidering all of the evidence, arguments, and briefs, the Intervenors' Motion for Further Relief and the United States of America's request for relief are GRANTED in part and DENIED in part.
I. Procedural Posture
The complex factual and procedural background of this case begins thirty-seven years ago, with a suit filed in the United States District Court for the Eastern District of Texas. That action involved nine all-black school districts located in northeastern Texas and resulted in a comprehensive order directed to the Texas Education Agency ("TEA"), concerning its responsibilities with regard to all Texas school districts. The Court entered a permanent injunctive order and retained jurisdiction over TEA and thereby, indirectly, over the Texas public education system. See United States v. Texas, 321 F. Supp. 1043 (E.D. Tex. 1970), aff'd as modified, 447 F.2d 441 (5th Cir. 1971).
The Court crafted the injunctive order to ensure that "no child w[ould] be effectively denied equal educational opportunities on account of race, color or national origin." Id. at 1056. The original injunctive order was modified by this Court, United States v. Texas, 330 F. Supp. 235 (E.D. Tex. 1971), and later by the United States Court of Appeals for the Fifth Circuit, United States v. Texas, 447 F.2d 441 (5th Cir. 1971). The original injunctive order as modified will be referred to herein as the "Modified Order."
Section G of the Modified Order, entitled "Curriculum and Compensatory Education," provides that the State of Texas, the TEA, its officers, agents, and employees:
(1) . . . shall [ensure] that school districts are providing equal education opportunities in all schools. The [TEA] through its consulting facilities and personnel, shall assist school districts in achieving a comprehensive balance[d] curriculum on all school campuses. . . .
***
(2) [TEA] shall institute a study of the educational needs of minority children in order to [ensure] equal educational opportunities of all students. The [TEA] shall request the assistance of the United States Office of Education and any other educational experts whom they choose to consult in making this study. . . . [A] report on this study shall be filed by the [TEA] with the Court including:
(A) Recommendations of specific curricular offerings and programs which will [ensure] equal educational opportunities for all students regardless of race, color, or national origin. These curricular offerings and programs shall include specific educational programs designed to compensate minority group children for unequal educational opportunities resulting from past or present racial and ethnic isolation, as well as programs and curriculum designed to meet the special educational needs of students whose primary language is other than English. . . .
A. 1981 Intervention
In 1975, Plaintiff-Intervenors GI Forum and LULAC filed a Motion to Enforce Decree and for Supplemental Relief under the Modified Order, seeking to address denials of equal education opportunity to Mexican-American students in Texas public schools. United States v. Texas (LULAC), 506 F. Supp. 405, 410 (E.D. Tex. 1981). That motion asserted the following bases for relief: Section G of the Modified Order, Title VI of the 1964 Civil Rights Act, the Equal Protection Clause of the Fourteenth Amendment to the United States Constitution, and the Equal Educational Opportunities Act ("EEOA"), 20 U.S.C. § 1703(f). Id. In their demand for relief, Intervenors called for TEA to implement a plan that would provide all limited English proficiency ("LEP") students with bilingual instruction and compensatory programs to overcome the effects of past discrimination. Id. The United States also moved for enforcement of section G and for similar, but not identical, supplemental relief. Id.
The Court held that the Defendants had violated the Equal Protection Clause and section 1703(f) of the EEOA by failing to take appropriate action to address the language barriers of LEP students and by failing to remove the disabling vestiges of past de jure discrimination against Mexican-American students. Texas (LULAC), 506 F. Supp. at 428-34. The Court issued a remedial decree compelling Texas to take affirmative steps to remedy the EEOA and equal protection violations. Id.
However, the Court found that because no evidence of purposeful discrimination was present, Defendants had not violated Title VI. Id. at 431. The Court also found no violation of Section G of the Modified Order, explaining that the comprehensive bilingual program sought by Intervenors was not inherent in Section G. Id. Therefore, the Defendants were not bound, res judicata, to implement such a program under that section. Id.
Soon thereafter, Defendants moved to vacate the remedial decree. Defendants argued that a recently enacted state law, the Texas 1981 Bilingual and Special Language Programs Act ("S.B. 477"), created a new program for addressing the learning difficulties of LEP students and "must be given a chance to work before it can be evaluated for success or failure." See United States v. Texas, 523 F. Supp. 703, 736 (E.D. Tex. 1981). This Court denied Defendants' motion, and Defendants appealed.
In 1982, the United States Court of Appeals for the Fifth Circuit reversed the Court's decision. United States v. Texas (LULAC), 680 F.2d 356 (5th Cir. 1982). First, the Fifth Circuit held that there was insufficient factual support for the Court's equal protection findings. Id. at 370-71. Regarding the EEOA findings, the Fifth Circuit explained that the case below was tried and decided prior to the Fifth Circuit's decision in Castaneda v. Pickard, 648 F.2d 989 (5th Cir. 1981), which laid down a three-step test for compliance with section 1703(f) of the EEOA. Id. at 371. Relying upon this new standard, the Fifth Circuit concluded that this Court's EEOA findings were moot. Id. The Fifth Circuit explained that
where the court erred . . . was in its denial of the state's post-trial motion to vacate the injunctive order on the ground of mootness. . . . The court's refusal to reconsider its injunctive order in light of the 1981 Act imposed a judicial gloss on the new legislative scheme without testing that scheme against the requirements of section 1703(f) as elaborated by Castaneda. In these circumstances, the court's judgment may not legitimately be sustained upon the section 1703(f) ground. Id. The Fifth Circuit remanded.
B. 2006 Intervention
On February 9, 2006, LULAC and GI Forum filed a Motion for Further Relief under the Modified Order. The instant action is a successive motion in Intervenors' original 1981 intervention, lineally descending from the Fifth Circuit's remand in United States v. Texas (LULAC), 680 F.2d 356. Intervenors assert that TEA's actions deny LEP students equal educational opportunity and therefore violate section 1703(f) of the EEOA and the Modified Order. (Mot. Further Relief ¶ 63.) Intervenors claim (1) that in the years since Texas enacted S.B. 477, TEA has abandoned monitoring, enforcing, and supervising school districts to ensure compliance with Texas's bilingual education program and (2) that TEA has failed to provide equal educational opportunity to LEP students above the elementary level. Id. at ¶¶ 64, 65. On February 28, 2006, the United States intervened in a limited capacity. The United States reserved its position on the Intervenors' allegations, awaiting the Defendants' response as well as future factual developments.
Defendants immediately moved to dismiss the Intervenors' motion, asserting Eleventh Amendment immunity. They also argued that Intervenors had improperly invoked the forum of this Court by filing this action as a Motion for Further Relief under the Modified Order — directly challenging Intervenors' "successive motion" theory.
It is important, at this juncture, to clarify that although Defendants' motion to dismiss framed the issue as one of "jurisdiction," it is more accurately termed an objection to an allegedly factitious forum. It is undisputed that this Court has subject matter jurisdiction over Intervenors' EEOA claim. 20 U.S.C. § 1708. Instead, Defendants object that Intervenors improperly filed their Motion for Further Relief under the Modified Order when Intervenors' claim is exclusively an EEOA claim.
On August 11, 2006, the Court issued a lengthy written opinion denying Defendants' motion to dismiss. Therein, the Court rejected Defendants' Eleventh Amendment claim. (April 11, 2006 Order 33.) Regarding Defendants' forum objection, the Court explained that Intervenors had pled not only a violation of the EEOA, section 1703(f), but also had pled separate violations of the Modified Order as a "source of law." Id. at 4-6. Intervenors invoked section G(1) of the Modified Order, which requires Defendants to ensure "that school districts are providing equal educational opportunities in all schools," and invoked the enforcement provision of Section J(1), which provides that "[t]his Court retains jurisdiction for all purposes, and especially for the purpose of entering any and all further orders which may become necessary to enforce or modify this decree." Id. The Court applied notice pleading principles to find that the facts and allegations in the Intervenors' Motion for Further Relief, if proven, could establish a violation of the Modified Order. Id.
In a subsequent motion to dismiss, the Defendants argued that Intervenors lacked associational standing because no LEP students are members of their organizations, and no member of either LULAC or GI Forum would have standing to bring this action in their own right. Intervenors requested, and this Court granted, leave to amend their Motion for Further Relief in order to identify Texas LEP students who are members of their organizations. That same afternoon, Intervenors amended their complaint, identifying fourteen parents of LEP students enrolled in Texas schools who are members of Texas LULAC or whose children are members of Texas LULAC. The Court considered Intervenors' amended submissions and arguments and rejected Defendants' standing challenge by written order. (Mem. Op. and Order Den. Defs.' Mot. Dismiss, October 23, 2006.)
II. Findings of Fact
To the extent that these findings of fact are also deemed to be conclusions of law, they are hereby incorporated into the conclusions of law that follow.
A. TEA's monitoring program
1. LEP student population in Texas
The State of Texas seeks to educate one of the largest populations of LEP students in the country, and that population grows steadily. In 1979, TEA reported that LEP students made up 6.9% of total public school enrollment (198,618 of 2,872,719 students). (Intvs.' Ex. 1 # 1.) By the 2004-2005 school year, LEP students made up 15.5% of enrollment (684,007 of 4,400,644 students). Id. at # 2. Of these 684,007 LEP students, 637,239 (93%) are Hispanic. Id. at # 43. LEP students are present in nearly every school district in Texas — 57% of the state's 1,227 school districts serve 20 or more LEP students. (Intvs.' Ex. 99 at 3.) At least one LEP student is enrolled in 1,070 districts and charter schools statewide, approximately 87% of the total. Id.
Contrary to conventional wisdom, in 2005-2006 only 13.1% of LEP students were classified as immigrants — TEA classifies as immigrants those students not born in the United States who have not attended school in the United States for at least three years. (Intvs.' Ex. 1 # 7.) Therefore, 86.9% of Texas's LEP students are not immigrants as defined by TEA. Id.
2. State Administration of LEP Education
To educate Texas students, including LEP students, Texas has legislated a system of shared responsibilities between state and local educational entities. Under the Texas Education Code, the school districts shoulder "the primary responsibility for implementing the state's system of public education and ensuring student performance in accordance with this code." Tex. Educ. Code § 11.002 (Vernon 2008). Texas law enumerates TEA and the state board of education's powers and reserves all other functions for the school districts: "[a]n educational function not specifically delegated to the agency or the board under this code is reserved to and shall be performed by school districts. . . ." Id. at § 7.003. TEA's enumerated powers regarding LEP students are summarized below.
Under Texas law, TEA has fourteen enumerated "educational functions" that broadly apply to all public education in Texas, including education of LEP students. Id. at § 7.021. Directly relevant here, the agency "shall administer and monitor compliance with education programs required by federal or state law. . . ." Id. at § 7.021(b)(1).
In terms of LEP student education, Chapter 29 of the Texas Education Code mandates bilingual education and English as a second language ("ESL") programs in all Texas schools. Id. at § 29.051. The state policy controlling these programs unequivocally states:
English is the basic language of this state. Public schools are responsible for providing a full opportunity for all students to become competent in speaking, reading, writing, and comprehending the English language. Large numbers of students in the state come from environments in which the primary language is other than English. Experience has shown that public school classes in which instruction is given only in English are often inadequate for the education of those students. The mastery of basic English language skills is a prerequisite for effective participation in the state's educational program. Bilingual education and special language programs can meet the needs of those students and facilitate their integration into the regular school curriculum. Therefore, in accordance with the policy of the state to ensure equal educational opportunity to every student, and in recognition of the educational needs of students of limited English proficiency, this subchapter provides for the establishment of bilingual education and special language programs in the public schools and provides supplemental financial assistance to help school districts meet the extra costs of the programs. Id.
Texas law requires TEA to evaluate and monitor multiple aspects of the state's bilingual and special language programs. In 2003, the Texas legislature passed House Bill 3459 ("H.B. 3459"). Among other alterations to the Texas Education Code, H.B. 3459 altered the language of § 29.062(a) — which previously required TEA to monitor and inspect each school district's bilingual and special language programs — to establish a performance-based monitoring system. Id. at § 29.062(a). Section 29.062(a) now mandates that TEA "evaluate the effectiveness of programs . . . based on the academic excellence indicators adopted under Section 39.051(a)." Id. "Performance on the [academic excellence] indicators . . . shall be compared to state established standards. . . . and must include:" inter alia, drop-out rates, graduation rates, and standardized test (now TAKS) passing rates. Id. at § 39.051(b)(1)-(3). In line with the shift from onsite inspections to performance-based evaluations, H.B. 3459 also limited TEA's compliance monitoring function, stating that TEA "may monitor compliance with requirements applicable to a process or program provided by a school district . . . only as necessary to ensure: (1) compliance with federal law and regulations." Id. at § 7.028 (emphasis added). Contrary to Defendants' assertion (Defs.' Post-Trial Br. 8), H.B. 3459 did not alter section 29.062(b); that section still requires TEA to monitor bilingual education and special language programs in the areas of "program content and design," "program coverage," "identification procedures," "classification procedures," "staffing," "learning materials," "testing materials," and "reclassification of students," and to monitor language proficiency assessment committees ("LPACs"), which undertake the initial classification of LEP students. Id. at § 29.062(b)(1)-(9). "If a school district . . . fails to satisfy appropriate standards[, including the academic excellence indicators,] . . . [TEA] shall apply sanctions. . . ." Id. at § 29.062(e).
Defendants contend that H.B. 3459 modified § 29.062(b) by enacting § 7.028. (Defs.' Post-Trial Br. 8.) However, no language in H.B. 3459 indicates that § 7.028 modified § 29.062(b). See e.g., H.B. 3459 Sec. 4 (adding § 7.027, later recodified as § 7.028 but not amending any other portion of the Texas Education Code). Moreover, the context of the bill makes clear that the legislature did not modify § 29.062(b) because in H.B. 3459, the Legislature amended § 29.062(a) and (e) but did not disturb § 29.062(b). H.B. 3459 Sec. 19.
In terms of educating LEP students, TEA must also establish a procedure for identifying school districts that are required to offer a bilingual or ESL program. Id. at § 29.053(a). By law, every Texas school district with twenty or more LEP students in the same grade level in the district must offer a bilingual education or ESL program. Id. at § 29.053(c). The agency also must "establish standardized criteria for the identification, assessment and classification of students of limited English proficiency eligible for entry into the program or exit from the program." Id. at § 29.056.
Bilingual programs are distinct from ESL programs. Bilingual education programs use "both the students' native language and English to teach content material while students are mastering English, with the ultimate goal of transition to all-English instruction." Erica Higgs, Specialized High Schools for Immigrant Students: A Promising New Idea, 34 J.L. & Educ. 331, 335 (2005); see also Tex. Educ. Code § 29.055(a)-(b) (defining bilingual and ESL programs). In contrast, "ESL instruction teaches all courses in modified English that is easier for [LEP students] to comprehend." Id.
Texas's bilingual program is also implemented differently than its ESL program. In school districts with twenty or more LEP students in the same grade level, the statute mandates bilingual education in kindergarten through sixth grade; bilingual, ESL, or other approved language instruction in post-elementary through eighth grade; and ESL in secondary school. Id. at § 29.053(d). In practice, in grades seven through twelve, school districts use ESL instruction for all LEP students enrolled in a language program who are not in special education and do not have parental denials. (1 Tr. 33; 3 Tr. 46; Intvs.' Ex. 57.) For the 2005-2006 academic year, there were 376,170 LEP students in bilingual education programs and 280,324 students in ESL programs, together 92% of the total 711,396 LEP population for that year. (Intvs.' Ex. 57.) If LEP students in special education only (15,717 students), LEP students with parental denials (34,971 students), and LEP students in no program (4,214 students) are added to the bilingual and ESL totals, then all 711,396 LEP students are accounted for. Id.
In addition, Texas law mandates standards for bilingual-ESL program content and instructional methods, id. at § 29.055; facilities and class sizes, id. at § 29.057; and bilingual-ESL teacher certification, id. at § 29.061.
3. PBMAS: Compliance Monitoring
a. The Previous District Effectiveness and Compliance ("DEC") Monitoring System
Texas law requires TEA to evaluate the effectiveness of school districts' compliance with requirements of Chapter 29 of the Texas Education Code, Texas's bilingual-ESL statute. Id. at § 29.062. From 1995 to 2003, TEA monitored Chapter 29 compliance via the District Effectiveness and Compliance ("DEC") system. (Defs.' Ex. 3 at 1-2.) Under DEC, TEA conducted onsite monitoring of special programs, such as bilingual-ESL programs, in every school district within a five-year cycle and subsequently within a six-year cycle. Id. at 1. Once onsite to monitor a bilingual-ESL program, TEA officials had to evaluate several substantive components (e.g., design, materials, and in-class implementation) of the district's bilingual-ESL program. Tex. Educ. Code § 29.062 (Vernon 2002) (current version at Tex. Educ. Code § 29.062 (Vernon 2008)). The DEC program was never effectively implemented. By 1996, the Texas State Auditor's Office found that TEA had performed onsite monitoring in only 186 districts (18%) from 1991 through 1994, and 202 districts had not been visited in over six years. (Intvs.' Ex. 26 at 15-16.) The auditor found that "[n]onperformance of these monitoring visits reduces the Agency's ability to ensure that Bilingual Education Program funds are spent appropriately, that districts are properly classifying students, and that districts are providing equal educational opportunities for bilingual students." Id. at 16. The auditor made similar findings in 1998 and in 2002. (Intvs.' Ex. 27 at 5-6; Intvs. Ex. 28 at 6-7.)
b. PBMAS System
In 2003, TEA replaced the DEC with the Performance Based Monitoring Analysis System ("PBMAS"). (3 Tr. 160-61.) The 2003-2004 school year was a transition year from old to new monitoring systems, and 2004-2005 marked the first year of PBMAS in Texas schools. Id. The 2006-2007 school year was the third year of full implementation of PBMAS.
i. PBMAS Monitoring
Unlike DEC, PBMAS does not use onsite visits as the primary monitoring tool to evaluate Chapter 29 compliance. Instead, PBMAS is essentially a results- and data-driven system that evaluates performance in four program areas: bilingual education and ESL; career and technology education; the No Child Left Behind Act of 2001 ("NCLB"), 20 U.S.C. § 6301, et seq.; and special education. Id.
PBMAS annually generates a set of performance indicators on a district level. Most of the indicators are based upon student passage rates on a statewide achievement test, the Texas Assessment of Knowledge and Skills ("TAKS"). (Defs.' Ex. 3 at 6-7.) Each year the state sets standards for student passage rates in five subject areas and compares districts' performance to these standards to generate indicators. (Defs.' Ex. 2 at 7; Defs.' Ex. 3 at 7.) For instance, in the 2006-2007 academic year, the state target passage rate was 40% for mathematics, 60% for reading and English language arts, 35% for science, 60% for social studies, and 60% for writing. (Defs.' Ex. 3 at 7; see also Defs.' Ex. 2 at 7 (listing target rates for the 2005-2006 academic year).)
TEA assigns each district a performance level based upon the deviation of the district's passage rates from these passage rate standards. If a school district meets or exceeds the accountability standard for a given subject, its performance level is zero; the performance level is one for results that are between 0.1 and 5 percentage points below the accountability standard; two for results that are between 5.1 and 10 percentage points below the standard; and three for results that are more than 10 percentage points below the standard. (Defs.' Ex. 3 at 7.)
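This assignment rule reduces to simple threshold arithmetic. The following sketch illustrates it; the function name and signature are illustrative only, and the thresholds are those recited above.

```python
def performance_level(passage_rate: float, standard: float) -> int:
    """Assign a PBMAS performance level from a district's passage rate
    and the state accountability standard for a subject, using the
    thresholds recited above (values in percentage points)."""
    shortfall = standard - passage_rate
    if shortfall <= 0:
        return 0  # meets or exceeds the accountability standard
    elif shortfall <= 5:
        return 1  # 0.1 to 5 points below the standard
    elif shortfall <= 10:
        return 2  # 5.1 to 10 points below the standard
    else:
        return 3  # more than 10 points below the standard

# Illustration with the 2006-2007 mathematics target of 40%: a district
# passing 28% of tested students falls 12 points short, so
# performance_level(28.0, 40.0) returns 3.
```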
In terms of LEP students, PBMAS evaluates the bilingual and ESL program area through nine bilingual-ESL indicators divided into twenty-seven subparts. (Defs.' Ex. 3 at 23-38.) Twenty-three of these subparts are derived from district TAKS passage rates categorized by bilingual or ESL programs, subject areas, and the languages in which TAKS was administered. Id. at 23-32. The remaining subparts are derived from the percentage of LEP students taking TAKS or special education tests, the LEP annual drop-out rate, the LEP graduation rate, and LEP reading proficiency. Id. at 32-38.
ii. Gaps and Masking in PBMAS data

(a) Under-identifying LEP students
TEA does not verify data or monitor school districts where significant statistical data indicates that school districts are likely under-identifying LEP students. This under-identification impacts the veracity of the data at the core of the PBMAS monitoring system. In 2005-2006, on a statewide basis, 4.9% of LEP students were reported as receiving parental denials of participation in bilingual and ESL programs. (Intvs.' Ex. 1 # 46.) However, in 2005-2006, some school districts reported denial rates five or more times the statewide average. (Intvs.' Ex. 4.) For instance, Bastrop ISD reported 198 denials, a 23.21% rate of denial of bilingual-ESL education; Brazosport ISD reported 148 denials, a 12.44% rate of denial; Borger ISD reported 50 denials, a 21.65% rate of denial; Crowley ISD reported 200 denials, an 18.62% rate of denial; Crosby ISD reported 74 denials, a 31.76% rate of denial; Clint ISD reported 699 denials, a 17.11% rate of denial; El Paso ISD reported 5,094 denials, a 26.64% rate of denial; Garland ISD reported 1,860 denials, a 14.13% rate of denial; Fairfield ISD reported 206 denials, a 19.79% rate of denial; Harlandale ISD reported 314 denials, a 14.75% rate of denial; Grand Prairie ISD reported 772 denials, a 15.93% rate of denial; Kingsville ISD reported 109 denials, a 27.11% rate of denial; Killeen ISD reported 803 denials, a 32.79% rate of denial; Lewisville ISD reported 589 denials, an 11.82% rate of denial; Lubbock ISD reported 155 denials, an 18.49% rate of denial; Los Fresnos CISD reported 424 denials, an 18.49% rate of denial; North East ISD reported 581 denials, a 16.79% rate of denial; Mount Pleasant ISD reported 321 denials, a 16.96% rate of denial; Port Arthur ISD reported 506 denials, a 43.55% rate of denial; Plainview ISD reported 162 denials, a 27.05% rate of denial; Tornillo ISD reported 220 denials, a 38.0% rate of denial; Uvalde CISD reported 100 denials, a 24.51% rate of denial; and Wichita Falls ISD reported 86 LEP denials, an 11.48% rate of parental denials. Id. In contrast, many districts with large numbers of LEP students reported large raw numbers of denials, but the rate of denial fell at or below the state average; Houston ISD, for instance, had 2,232 parental denials of bilingual-ESL education, a 3.80% rate of denial. Id.
Importantly, the numbers in many of these districts are too large to be anomalies, as might occur, for instance, if one family with many children decided not to enroll its children in bilingual-ESL education in a small district. In light of expert testimony and common sense, it appears that in at least some of these districts, parents may not be well informed of the advantages of bilingual-ESL programs or may be subject to coercion. (1 Tr. 45-47.) However, Defendants "do not, independent of the Districts, verify parental denials." (Intvs.' Ex. 1 # 48.) As the director of PBMAS testified, TEA does not verify which students are placed in bilingual-ESL programs, and there is no specific indicator for parental denials. (4 Tr. 6-7.) Though the rate of parental denials may be examined as part of a focused data analysis after an intervention has been initiated, those interventions are triggered by data that under-identification of LEP students would itself distort; outside of such interventions, TEA never reviews under-identification in any district. Id.
Defendants contend that Intervenors offer only "anecdotal hearsay" that districts are under-identifying LEP students and that districts under the PBMAS system have more incentive to include LEP students in bilingual-ESL education. (Defs.' Post-Trial Br. 27-28.) However, the Court finds that the data, the same type of data relied upon by the PBMAS system, are not anecdotal hearsay. Moreover, even an increased incentive to identify students accurately does not conclusively demonstrate that districts have actually done so. The Court finds that some districts are under-identifying LEP students and that TEA has not verified LEP identification in these suspect districts, undercutting the veracity of the data employed by the PBMAS system.
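The comparison underlying these findings can be stated precisely. The sketch below is purely illustrative: the function name and the five-fold threshold are drawn from the comparison above, the figures in the comments come from the findings, and TEA performs no such verification.

```python
# Illustrative screen of the kind suggested by the comparison above:
# flag districts whose parental-denial rate far exceeds the statewide
# average, marking them for verification of LEP identification.
STATEWIDE_DENIAL_RATE = 4.9  # percent, 2005-2006 (Intvs.' Ex. 1 # 46)

def flag_for_verification(district_denial_rate: float,
                          threshold_multiple: float = 5.0) -> bool:
    """Return True if a district's denial rate is at least the given
    multiple of the statewide average rate."""
    return district_denial_rate >= threshold_multiple * STATEWIDE_DENIAL_RATE

# Port Arthur ISD (43.55%) would be flagged; Houston ISD (3.80%),
# whose rate falls below the statewide average, would not.
assert flag_for_verification(43.55) is True
assert flag_for_verification(3.80) is False
```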
(b) Performance standards are not based on equal achievement with non-LEP students
The bilingual-ESL indicators do not compare the performance of LEP students to the performance of English proficient students. (Intvs.' Ex. 1 # 35.) Instead, to determine the performance level for bilingual or ESL students within a district, bilingual-ESL indicators are compared to the target passage rates that are assigned by the state. (Defs.' Ex. 2 at 7; Defs.' Ex. 3 at 7.)
(c) Data combined across multiple grade levels
Under PBMAS, many of the indicators are derived from the sum of TAKS scores across multiple grade levels. For instance, the indicators for TAKS passage rates in mathematics and reading-English language arts for students in bilingual education, ESL, and LEP year-after-exit are derived from the sum of all students' TAKS scores from grades 3-11. (Intvs.' Ex. 1 # 38; Defs.' Ex. 4 at 23-38.) The indicator for science is based upon the combined TAKS scores at grades 5, 10, and 11. Id. The indicator for writing is based upon the combined score of grades 4 and 7. Id. The LEP drop-out rate indicator is based upon combined drop-out rates from the middle school grades 7 and 8 and the high school grades 9-12. (Defs.' Ex. 4 at 35.) However, as Defendants' witnesses testified, students are much more likely to drop out of high school than middle school. For instance, TEA's Associate Commissioner Cloudt testified that "you have very few drop-outs grade 7 through 8 to begin with . . . the vast majority of drop-outs in our state drop-out in grades 9 through 12 versus grades 7 and 8." (Intvs.' Ex. 81 at 71:9-10, 24-25, 72:1, 150:22, 151:6; see also Intvs.' Ex. 1 # 39, # 40.) PBMAS's current data aggregation likely results in masking the drop-out rates in grades 9-12. (1 Tr. 60-61, 115, 139-40, 185-87; Defs.' Ex. 38 at 132, 141; Intvs.' Ex. 6 at 10-11; see also 2 Tr. 200 (testimony from State Board of Education member indicating that including grades 7 and 8 in the drop-out calculation would likely distort the data).)
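The masking effect is simple weighted-average arithmetic. With purely hypothetical figures (say, 100,000 students in grades 7-8 with a 0.5% drop-out rate and 100,000 students in grades 9-12 with a 3.5% rate), the combined indicator is

$$r_{7\text{--}12} = \frac{(100{,}000)(0.5\%) + (100{,}000)(3.5\%)}{200{,}000} = 2.0\%,$$

so a 3.5% high-school drop-out rate would be reported as 2.0%, well below a level that the high-school grades alone would show.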
(d) District-wide aggregated data without information on specific campuses
Because its indicators are based upon district-wide aggregated student data, PBMAS evaluates only district-wide data, not data for individual schools. (Intvs.' Ex. 1 # 37.) As a result, based upon PBMAS performance definitions, 277 schools attended by 54,963 LEP students were performing at a level warranting a higher stage of intervention than the stage triggered by the district-wide data analysis alone. (Intvs.' Ex. 99 at 6; 5 Tr. 170.) Of these 277 schools, 248 were middle or high schools, and 48,069 LEP students attended these middle and high schools. Id.
The Court committed manifest error when it noted in its July 30, 2007 Opinion that Intervenors' expert Roy Johnson had relied on outdated data. (July 30, 2007 Op. 24 n. 31.) The Court committed the error by neglecting to recognize that Johnson had supplemented his original findings with additional data provided by Defendants.
Though some indicators in the other program areas monitored by PBMAS — career and technology education, NCLB, and special education — relate to LEP student performance, interventions based upon unsatisfactory performance in these programs are not intended to ensure bilingual-ESL programs' compliance with state or federal standards. (3 Tr. 143-50; 4 Tr. 63-64.) TEA stages interventions separately for each program; it has neither intervened in districts with substandard achievement across programs nor developed substantive guidelines to identify such districts. (4 Tr. 63-64.)
iii. Intervention
TEA identifies districts for intervention through a formula based upon substandard performance on the various indicators. Each stage of intervention requires an increasing amount of action by the district. (Defs.' Exs. 5, 6; see also U.S. Post-trial Br. 10 (chart of intervention stages).) The Court will briefly define the triggering indicator performance levels for each stage of intervention and then describe the action required of districts at each stage.
(a) Performance level triggers
Intervention stage 1A is triggered if one indicator is at performance level 3 (i.e., student passage rates for that indicator are more than 10 percentage points below the standard) and one to three indicators are at performance level 2. (Defs.' Exs. 5, 6.)
Intervention stage 1B is triggered if one indicator is at performance level 3 and at least four indicators are at performance level 2. (Defs.' Exs. 5, 6.) Alternatively, intervention stage 1B is triggered if two indicators are at performance level 3 and two indicators are at performance level 2. Id.
Intervention stage 2 is triggered if two indicators are at performance level 3 and at least three indicators are at performance level 2. Id. Alternatively, stage 2 intervention can be triggered if three or more indicators are at performance level 3. Id.
The standards that trigger the highest level of intervention, intervention stage 3, have recently changed. Under the criteria adopted for the 2005-2006 school year, onsite visits were necessary if TEA detected "substantial or imminent program effectiveness concerns . . . based on current and/or longitudinal data." (Defs.' Ex. 6.) To make this determination, however, TEA did not employ any quantitative formula or objective standards. See id. Moreover, in the 2004-2005 and 2005-2006 school years, TEA did not identify any district warranting a stage 3 intervention and thus did not conduct any onsite visits. (3 Tr. 185.)
In the middle of trial, TEA indicated for the first time that it added two criteria to trigger stage 3 interventions for the 2006-2007 school year. Under the first new criterion, TEA will conduct an onsite review of a school district if it has been under stage 2 interventions for three continuous years. Id. at 189. Under the second new criterion, TEA will conduct an onsite review if a school district has six or more PBMAS indicators rated at performance level 3. Id. at 190. Using the new criteria, TEA selected 21 districts for stage 3 intervention in the Fall of 2006. Id. at 185.
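Taken together, the stage triggers recited above amount to a mechanical decision rule. The sketch below is illustrative only: the function name is hypothetical, combinations not recited above are mapped to no intervention, and the stage 3 branch reflects the two criteria TEA announced for 2006-2007.

```python
def intervention_stage(levels: list[int], years_at_stage_2: int = 0):
    """Map a district's PBMAS indicator performance levels (0-3) to an
    intervention stage, following the triggers recited above."""
    pl3 = sum(1 for level in levels if level == 3)
    pl2 = sum(1 for level in levels if level == 2)

    # Stage 3 (2006-2007 criteria): six or more indicators at level 3,
    # or three continuous years under stage 2 interventions.
    if pl3 >= 6 or years_at_stage_2 >= 3:
        return "3"
    # Stage 2: two level-3 indicators and at least three at level 2,
    # or three or more level-3 indicators.
    if (pl3 == 2 and pl2 >= 3) or pl3 >= 3:
        return "2"
    # Stage 1B: one level-3 indicator and at least four at level 2,
    # or two level-3 indicators and two at level 2.
    if (pl3 == 1 and pl2 >= 4) or (pl3 == 2 and pl2 == 2):
        return "1B"
    # Stage 1A: one level-3 indicator and one to three at level 2.
    if pl3 == 1 and 1 <= pl2 <= 3:
        return "1A"
    return None  # no stage triggered by the rules recited above
```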
(b) Intervention action
Districts must complete a focused data analysis ("FDA") and continuous improvement plan ("CIP") at intervention stages 1A, 1B, and 2. An FDA is a self-evaluative measure that requires the assembly of a team of district officials and community stakeholders, who "determine possible causes for areas of performance concerns" and "gather information to develop the continuous improvement plan." (Defs.' Ex. 8 at 1-2.) To complete an FDA, districts generally must examine PBMAS indicators rated 2 or 3 but may analyze other information or complete the "BE/ESL Optional Program Effectiveness Review." Id. at 1. Districts develop continuous improvement plans after completing the FDA. (See Defs.' Ex. 11.) TEA provides districts a basic template for all CIPs, which requires the districts to articulate: (1) desired results with respect to identified areas of improvement; (2) anticipated measurable evidence of change; (3) activities to achieve desired results; (4) resources needed to implement activities; (5) a timeline; and (6) follow-up activities if the initial plan does not work. (Defs.' Ex. 9.)
Under intervention stages 1B and 2, districts submit completed FDAs and CIPs to TEA for a desk review. (Defs.' Ex. 4.) The primary purpose of the desk review is to evaluate the effectiveness of the CIP, specifically focusing on whether it adequately targets the issues identified by the FDA. (3 Tr. 175, 181-84.) Once a CIP is approved, TEA monitors the district's progress based on the CIP goals and timeline. (3 Tr. 182-83; Defs.' Ex. 4.) If the CIP is not approved, however, the district is required to revise and resubmit it. (Defs.' Ex. 4.) To that end, districts can solicit technical assistance from regional education service centers or other individuals outside of the school district. Id. If the CIP is not approved a second time, TEA responds with additional "oversight, intervention, and/or sanctions." Id.
Stage 2 interventions require districts to hold a public meeting in addition to completing an FDA and CIP. (Defs.' Exs. 5, 6.) These meetings provide an opportunity to gather information and feedback from the public through comments and testimony. (Defs.' Ex. 10.) Comments and findings from the meeting are then submitted to TEA, which considers the information as part of the desk review process. (Defs.' Ex. 4.)
At the next and highest level of intervention, stage 3, TEA initiates a "targeted on-site review" of the substandard school district. (Defs.' Exs. 5, 6.) As of the date of trial, TEA had never conducted such an onsite review of any school district under PBMAS but planned to do so in the 2007-2008 school year. (3 Tr. 185, 189.) The focus of these onsite investigations will purportedly vary with the identified performance concerns in the district. Id. at 190-91. At trial, TEA indicated that a site visit team will consist of two to six monitors (4 Tr. 37), who can interview district and school officials, conduct focus groups, observe classrooms, and inspect student records, among other possible activities. (3 Tr. 191; 4 Tr. 42-43.) These teams may make inquiries at specific campuses, but the ultimate purpose of the visit will be to address district problems. (4 Tr. 42-43, 63.) A school district's chronic failure to meet TEA's standards can result in a series of sanctions, the most severe of which is the dissolution of that district, accompanied by its annexation to an adjoining district. Tex. Educ. Code §§ 29.062(e), 39.131(a), 39.132.
However, TEA has an inadequate number of bilingual-ESL certified monitors to conduct onsite visits. The director of PBMAS, Dr. Laura Taylor, testified that at the time of trial, though there were eleven monitors available to monitor bilingual-ESL programs, PBMAS had no monitoring staff certified in bilingual-ESL education. (4 Tr. 5.) One bilingual-ESL certified monitor was scheduled to begin within a week of Taylor's October 23, 2006 testimony. (4 Tr. 38.)
The number of interventions at every intervention stage has increased each year since the transition year of 2003-2004, though only 21 interventions have been planned for stage 3 onsite monitoring. (3 Tr. 184-85, 191-92; 4 Tr. 49-50.) In the 2006-2007 school year, TEA staged 328 school districts for intervention based on under-performance in their bilingual-ESL programs. (4 Tr. 32.) Of those 328 school districts, 21 districts were staged for onsite inspection; the other interventions, therefore, consisted of focused data analyses and continuous improvement plans. Id. At the close of trial, TEA had yet to go onsite to these 21 school districts. Id. at 9. However, a TEA official testified that these onsite reviews would occur in the fall of the 2006-2007 school year. Id. 9-10.
4. Other monitoring
Although PBMAS is the primary means, it is not the sole means by which TEA monitors LEP students in Texas schools. TEA monitors, down to the campus level, the adequate yearly progress of LEP students served in bilingual-ESL programs, as required under Title I of the No Child Left Behind Act of 2001, 20 U.S.C. § 6301, et seq. (3 Tr. 145-49.) TEA also monitors annual measurable achievement objectives ("AMAO") as required under Title III of NCLB. Id. However, a school or district's failure to achieve adequate yearly progress under NCLB does not mean that the bilingual-ESL program will be investigated. (5 Tr. 44.)
In addition, TEA monitors LEP student performance under the Texas accountability rating system. Under the accountability rating system, each school is assigned a state accountability rating, in which the data is disaggregated into categories, including socioeconomically disadvantaged students. (3 Tr. 204-05.) However, the accountability rating system does not disaggregate student performance for LEP students, does not initiate further action based upon the failure of LEP students as a disaggregated group, and does not hold schools or districts accountable for failure to comply with standards and regulations governing LEP education. (3 Tr. 199, 205-06, 209-12, 216, 219.)
B. Achievement of LEP Students
1. Drop-out rates
LEP students in Texas dropped out of school at a greater rate than the statewide "all-students" category. (Intvs.' Ex. 1 # 9.) In 2003-2004, the annual drop-out rate for seventh and eighth grade LEP students was 0.5%, two and a half times that of all students, 0.2%. (Intvs.' Ex. 1 # 10.) In the same academic year, for grades seven through twelve, LEP students dropped out at an annual rate of 2.0%, more than twice the rate for all students, 0.9%. (Intvs.' Ex. 1 # 11.) And this percentage likely underrepresents the drop-out rate in grades nine through twelve, where drop-outs are more prevalent than in grades seven and eight. (1 Tr. 60-61, 115, 139-40, 185-87; 2 Tr. 200; Defs.' Ex. 38 at 132, 141; Intvs.' Ex. 6 at 10-11; Intvs.' Ex. 81 at 71:9-10, 24-25, 72:1, 150:22-151:6.) For students who would have graduated with the class of 2004, 16.3% of LEP students dropped out of school statewide compared with 3.9% of all students. (Intvs.' Ex. 1 # 12.) For those originally in the class of 2005, only 55.2% of LEP students graduated with their class, whereas 84% of all students graduated with their class. (Intvs.' Ex. 38 at 145.)
2. Retention rates
Retention rates, the percentage of students who are held back from advancing a grade level, have consistently been higher for LEP students than for other students. In grades kindergarten through six, the percentage margin between LEP students' retention rates and non-LEP students' retention rates gradually increased from 0.9% in 1994-1995 to 2.1% in 2003-2004.
Although not evidence, the Court notes that recent TEA data indicates that for kindergarten through sixth grade for the 2004-2005 academic year 5.3% of LEP students were retained compared to 2.9% of non-LEP students, for a margin of 2.4%, and for the 2005-2006 academic year 5.0% of LEP students were retained compared to 2.8% of non-LEP students, for a margin of 2.2%. Tex. Educ. Agency, 2006 Comprehensive Annual Report on Tex. Pub. Sch. 76 (2006), available at http://www.tea.state.tx.us/research/pdfs/2006_compannual.pdf; Tex. Educ. Agency, 2007 Comprehensive Annual Report on Tex. Pub. Sch. 78 (2007), available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
3. TAKS achievement

The state's standardized achievement test, TAKS, demonstrates marginal success for LEP students in the elementary grades and failure in the secondary grades. LEP test scores are compared to "all students," a category which includes LEP students, rather than to non-LEP students. (3 Tr. 199.) The all-students achievement is generally lowered by LEP students' inclusion, distorting any accurate comparison of LEP students to their non-LEP peers. Id.
a. Elementary LEP student passage rates
The TAKS data on elementary grades, where bilingual education is used exclusively, see supra Section II.A.2, demonstrates that LEP students may be marginally overcoming language barriers.
i. Third grade comparison
For third grade LEP students taking TAKS in English, 63% passed the reading test in 2003 compared with 81% of all students, 77% in 2004 compared with 88% of all students, 78% in 2005 compared with 89% of all students, and 81% in 2006 compared with 89% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 8 percentage points in reading, though it steadily decreased. Id.
In mathematics, 62% of LEP students taking the test in English passed in 2003 compared with 74% of all students, 75% in 2004 compared with 83% of all students, 72% in 2005 compared with 82% of all students, and 75% in 2006 compared with 82% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 7 percentage points in mathematics, though it generally decreased. Id.
Third grade LEP students who took TAKS in Spanish generally fared worse than the all-students category and LEP students taking the test in English. In reading, 67% passed in 2003 (all students 81%, LEP in English 63%), 78% in 2004 (all students 88%, LEP in English 77%), 74% in 2005 (all students 89%, LEP in English 78%), and 76% in 2006 (all students 89%, LEP in English 81%). Id. In mathematics, 57% passed in 2003 (all students 74%, LEP in English 62%), 68% in 2004 (all students 83%, LEP in English 75%), 67% in 2005 (all students 82%, LEP in English 72%), and 69% in 2006 (all students 82%, LEP in English 75%). Id.
The TAKS test in Spanish "is taken by students who are in bilingual education programs receiving a lot of their academic instruction in Spanish while they learn English. . . . [It tests] how well they're progressing in mathematics and science and writing and reading, . . . how well they're gaining those academic skills, those fundamental building blocks that they need as they . . . move from grade to grade in school as they're learning English." (5 Tr. 60-61.)
In 2006, at the third grade level, students who had exited the program two years previously ("non-LEP monitored +2") performed better than students overall, with 95% passing the reading test compared to 89% for all students, and 90% passing the math test compared to 82% for all students. Id.
However, the TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 62% of third grade LEP students taking the test in English passed all the TAKS subject areas and only 54% of those taking the test in Spanish passed, compared with 76% for all students. (Defs.' Ex. 51.) In 2006, only 65% of third grade LEP students taking the test in English passed all the TAKS subject areas and only 55% of those taking the test in Spanish passed, compared with 77% for all students — at best, an achievement gap of 12 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 68% of third grade LEP students taking the test in English passed all TAKS subject areas and 68% of LEP taking the tests in Spanish passed, compared to 78% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 7, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
ii. Fourth grade comparison
For fourth grade LEP students taking TAKS in English, 49% passed the reading test in 2003 compared with 76% of all students, 60% in 2004 compared with 81% of all students, 58% in 2005 compared with 79% of all students, and 63% in 2006 compared with 82% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 19 percentage points in reading, though it decreased. Id.

In mathematics, 49% of LEP students taking the test in English passed in 2003 compared with 70% of all students, 64% in 2004 compared with 76% of all students, 68% in 2005 compared with 81% of all students, and 72% in 2006 compared with 83% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 11 percentage points in mathematics, though it decreased. Id.
In writing, 53% of LEP students taking the test in English passed in 2003 compared with 78% of all students, 73% in 2004 compared with 88% of all students, 80% in 2005 compared with 90% of all students, and 83% in 2006 compared with 92% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 9 percentage points in writing, though it decreased. Id.
Fourth grade LEP students who took TAKS in Spanish had mixed results in comparison to all students and LEP students taking the TAKS in English. In reading, 59% passed in 2003 (all students 76%, LEP in English 49%), 66% in 2004 (all students 81%, LEP in English 60%), 69% in 2005 (all students 79%, LEP in English 58%), and 76% in 2006 (all students 82%, LEP in English 63%). Id. In mathematics, 48% passed in 2003 (all students 70%, LEP in English 49%), 62% in 2004 (all students 78%, LEP in English 64%), 64% in 2005 (all students 81%, LEP in English 68%), and 69% in 2006 (all students 83%, LEP in English 72%). Id. In writing, 82% passed in 2003 (all students 78%, LEP in English 53%), 88% in 2004 (all students 88%, LEP in English 73%), 87% in 2005 (all students 90%, LEP in English 80%), and 90% in 2006 (all students 92%, LEP in English 83%). Id.
In 2006, at the fourth grade level, students who had exited the program two years previously (non-LEP monitored +2) outperformed students overall, with 91% passing the reading test compared to 82% for all students, 93% passing the math test compared to 83% for all students, and 97% passing the writing test compared with 92% overall. Id.
However, the TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 49% of fourth grade LEP students taking the test in English passed all the TAKS subject areas and only 56% of those taking the test in Spanish passed, compared with 70% for all students. (Defs.' Ex. 51.) In 2006, only 55% of fourth grade LEP students taking the test in English passed all the TAKS subject areas and only 63% of those taking the test in Spanish passed, compared with 74% for all students — at best, an achievement gap of 11 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 58% of fourth grade LEP students taking the test in English passed all TAKS subject areas and 65% of LEP taking the tests in Spanish passed, compared to 75% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 7, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
iii. Fifth grade comparison
For fifth grade LEP students taking TAKS in English, 32% passed the reading test in 2003 compared with 67% of all students, 34% in 2004 compared with 73% of all students, 37% in 2005 compared with 75% of all students, and 48% in 2006 compared with 80% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 32 percentage points in reading, though it decreased. Id.
In mathematics, 40% of LEP students taking the test in English passed in 2003 compared with 65% of all students, 47% in 2004 compared with 73% of all students, 58% in 2005 compared with 79% of all students, and 63% in 2006 compared with 81% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 18 percentage points in mathematics, though it decreased. Id.
In science, 10% of LEP students taking the test in English passed in 2003 compared with 39% of all students, 22% in 2004 compared with 55% of all students, 31% in 2005 compared with 64% of all students, and 46% in 2006 compared with 75% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 29 percentage points in science, and the margin did not decline. Id.
Fifth grade LEP students who took TAKS in Spanish had mixed results in comparison to all students and LEP students taking the TAKS in English. In reading, 51% passed in 2003 (all students 67%, LEP in English 32%), 60% in 2004 (all students 73%, LEP in English 34%), 60% in 2005 (all students 75%, LEP in English 37%), and 65% in 2006 (all students 80%, LEP in English 48%). Id. In mathematics, 37% passed in 2003 (all students 65%, LEP in English 40%), 44% in 2004 (all students 73%, LEP in English 47%), 44% in 2005 (all students 79%, LEP in English 58%), and 47% in 2006 (all students 81%, LEP in English 63%). Id. In science, 6% passed in 2003 (all students 39%, LEP in English 10%), 20% in 2004 (all students 55%, LEP in English 22%), 23% in 2005 (all students 64%, LEP in English 31%), and 31% in 2006 (all students 75%, LEP in English 46%). Id.
In 2006, at the fifth grade level, students who had exited the program two years previously (non-LEP monitored +2) performed as well as students overall, with 79% passing the reading test compared to 80% for all students, 83% passing the math test compared to 81% for all students, and 74% passing the science test compared with 75% overall. Id.
However, the TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 19% of fifth grade LEP students taking the test in English passed all the TAKS subject areas and only 13% of those taking the test in Spanish, compared with 55% for all students. (Defs.' Ex. 51.) In 2006, only 28% of fifth grade LEP students taking the test in English passed all the TAKS subject areas and only 16% of those taking the test in Spanish, compared with 64% for all students — at best, an achievement gap of 36 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007, 36% of fifth grade LEP students taking the test in English passed all TAKS subject areas and 44% of LEP students taking the test in Spanish passed, compared to 69% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 8, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
iv. Sixth grade comparison
For sixth grade LEP students taking TAKS in English, 26% passed the reading test in 2003 compared with 71% of all students, 34% in 2004 compared with 79% of all students, 51% in 2005 compared with 85% of all students, and 64% in 2006 compared with 91% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 27 percentage points in reading, though it decreased. Id.
In mathematics, 27% of LEP students taking the test in English passed in 2003 compared with 60% of all students, 35% in 2004 compared with 67% of all students, 41% in 2005 compared with 72% of all students, and 54% in 2006 compared with 79% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 25 percentage points in mathematics, though it decreased. Id.
Sixth grade LEP students who took TAKS in Spanish had mixed results in comparison to all students and LEP students taking the TAKS in English. In reading, 60% passed in 2003 (all students 71%, LEP in English 26%), 58% in 2004 (all students 79%, LEP in English 34%), 59% in 2005 (all students 85%, LEP in English 51%), and 66% in 2006 (all students 91%, LEP in English 64%). Id. In mathematics, 28% passed in 2003 (all students 60%, LEP in English 27%), 36% in 2004 (all students 67%, LEP in English 35%), 44% in 2005 (all students 72%, LEP in English 41%), and 52% in 2006 (all students 79%, LEP in English 54%). Id.
In 2006, at the sixth grade level, students who had exited the program two years previously (non-LEP monitored +2) outperformed students overall, with 94% passing the reading test compared to 91% for all students and 81% passing the math test compared to 79% for all students. Id.
However, the TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 31% of sixth grade LEP students taking the test in English passed all the TAKS subject areas and only 43% of those taking the test in Spanish, compared with 69% for all students. (Defs.' Ex. 51.) In 2006, only 45% of sixth grade LEP students taking the test in English passed all the TAKS subject areas and only 50% of those taking the test in Spanish, compared with 78% for all students — at best, an achievement gap of 28 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007, 48% of sixth grade LEP students taking the test in English passed all TAKS subject areas and 59% of LEP students taking the test in Spanish passed, compared to 78% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 8, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
b. Secondary LEP student failure rates
Elementary LEP students, particularly those who had exited the LEP program, demonstrated marginal though not overwhelming progress. But LEP students in secondary schools, who are educated in ESL programs, fare much worse on the TAKS test in comparison to all students.
See supra Section II.A.2.
i. Seventh grade comparison
For seventh grade LEP students taking TAKS in English, 21% passed the reading test in 2003 compared with 72% of all students, 28% in 2004 compared with 75% of all students, 33% in 2005 compared with 81% of all students, and 29% in 2006 compared with 79% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 47 percentage points in reading, and the gap in 2006 was greater than that in 2004 and 2005. Id.
In mathematics, 15% of LEP students taking the test in English passed in 2003 compared with 51% of all students, 24% in 2004 compared with 60% of all students, 25% in 2005 compared with 64% of all students, and 33% in 2006 compared with 70% for all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 36 percentage points in mathematics, and it increased slightly. Id.
In writing, 26% of LEP students taking the test in English passed in 2003 compared with 76% of all students, 52% in 2004 compared with 89% of all students, 52% in 2005 compared with 88% of all students, and 56% in 2006 compared with 90% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 34 percentage points in writing, though it decreased. Id.
In 2006, at the seventh grade level, students who had exited the program two years previously (non-LEP monitored +2) slightly outperformed students overall, with 80% passing the reading test compared to 79% for all students, 70% passing the math test compared to 70% for all students, and 92% passing the writing test compared with 90% overall. Id.
However, the TAKS passage rates for current students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 16% of seventh grade LEP students passed all the TAKS subject areas, compared with 60% for all students. (Defs.' Ex. 51.) In 2006, only 18% of seventh grade LEP students passed all the TAKS subject areas, compared with 65% for all students — at best, an achievement gap of 44 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 27% of seventh grade LEP students passed all TAKS subject areas, compared to 71% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 9, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
ii. Eighth grade comparison
For eighth grade LEP students taking TAKS in English, 25% passed the reading test in 2003 compared with 77% of all students, 35% in 2004 compared with 83% of all students, 30% in 2005 compared with 83% of all students, and 32% in 2006 compared with 83% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 48 percentage points in reading, and the disparity increased over the last two years. Id.
In mathematics, 15% of LEP students taking the test in English passed in 2003 compared with 51% of all students, 20% in 2004 compared with 57% of all students, 22% in 2005 compared with 61% of all students, and 29% in 2006 compared with 67% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 36 percentage points in mathematics, and the disparity increased. Id.
In social studies, 34% of LEP students taking the test in English passed in 2003 compared with 77% of all students, 42% in 2004 compared with 81% of all students, 50% in 2005 compared with 85% of all students, and 46% in 2006 compared with 83% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 35 percentage points in social studies, but it increased slightly over the final year. Id.
In science, 9% of LEP students taking the test in English passed in 2006 compared with 52% of all students. Id.
In 2006, at the eighth grade level, students who had exited the program two years previously (non-LEP monitored +2) performed worse than students overall, with 78% passing the reading test compared to 83% for all students, 58% passing the math test compared to 67% for all students, 78% passing the social studies test compared with 83% overall, and 33% passing the science test compared with 52% for all students. Id.
The TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are less encouraging. In 2005, only 14% of eighth grade LEP students passed all the TAKS subject areas, compared with 58% for all students. (Defs.' Ex. 51.) In 2006, only 12% of eighth grade LEP students passed all the TAKS subject areas, compared with 58% for all students — an achievement gap of 46 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 15% of eighth grade LEP students passed all TAKS subject areas, compared to 61% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 9, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
iii. Ninth grade comparison
For ninth grade LEP students taking TAKS in English, 14% passed the reading test in 2003 compared with 66% of all students, 24% in 2004 compared with 76% of all students, 30% in 2005 compared with 82% of all students, and 41% in 2006 compared with 87% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 46 percentage points in reading, though it decreased. Id.
In mathematics, 11% of LEP students taking the test in English passed in 2003 compared with 44% of all students, 14% in 2004 compared with 50% of all students, 18% in 2005 compared with 56% of all students, and 19% in 2006 compared with 56% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 33 percentage points in mathematics, and it generally increased over that time. Id.
In 2006, at the ninth grade level, students who had exited the program two years previously (non-LEP monitored +2) performed worse than students overall, with 82% passing the reading test compared to 87% for all students and 43% passing the math test compared to 56% for all students. Id.
The TAKS in English passage rates under the all tests standard for 2005 and 2006 are even less encouraging. In 2005, only 13% of ninth grade LEP students passed all the TAKS subject areas, compared with 56% for all students. (Defs.' Ex. 51.) In 2006, only 16% of ninth grade LEP students passed all the TAKS subject areas, compared with 57% for all students — at best, an achievement gap of 41 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 16% of ninth grade LEP students passed all TAKS subject areas, compared to 60% of all students taking the test in English. Tex. Educ. Agency, 2007 Comprehensive Annual Report 9, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
iv. Tenth grade comparison
For tenth grade LEP students taking TAKS in English, 14% passed the English language arts test in 2003 compared with 66% of all students, 19% in 2004 compared with 72% of all students, 20% in 2005 compared with 67% of all students, and 32% in 2006 compared with 85% of all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 42 percentage points in English language arts, and it generally increased over time. Id.
In mathematics, 17% of LEP students taking the test in English passed in 2003 compared with 48% of all students, 18% in 2004 compared with 52% of all students, 18% in 2005 compared with 58% of all students, and 23% in 2006 compared with 60% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 31 percentage points in mathematics, and it generally increased over time. Id.
In social studies, 29% of LEP students taking the test in English passed in 2003 compared with 71% of all students, 36% in 2004 compared with 80% of all students, 43% in 2005 compared with 84% of all students, and 41% in 2006 compared with 83% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 41 percentage points in social studies, and it stayed relatively constant. Id.
In science, 7% of LEP students taking the test in English passed in 2003 compared with 42% of all students, 10% in 2004 compared with 51% of all students, 11% in 2005 compared with 54% of all students, and 13% in 2006 compared with 60% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 35 percentage points in science, and the disparity increased every year. Id.
In 2006, at the tenth grade level, students who had exited the program two years previously (non-LEP monitored +2) did substantially worse than students overall, with 70% passing the English language arts test compared to 85% for all students, 44% passing the math test compared to 60% for all students, 70% passing the social studies test compared with 83% overall, and 34% passing the science test compared with 60% overall. Id.
The TAKS passage rates for current LEP students under the all tests standard for 2005 and 2006 are even less encouraging. In 2005, only 6% of tenth grade LEP students passed all the TAKS subject areas, compared with 40% for all students. (Defs.' Ex. 51.) In 2006, only 8% of tenth grade LEP students passed all the TAKS subject areas, compared with 50% for all students — at best, an achievement gap of 34 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 9% of tenth grade LEP students passed all TAKS subject areas, compared to 51% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 10, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
v. Eleventh grade comparison
For eleventh grade LEP students taking TAKS in English, 20% passed the English language arts test in 2003 compared with 61% of all students, 32% in 2004 compared with 83% of all students, 34% in 2005 compared with 87% of all students, and 35% in 2006 compared with 88% for all students. (Defs.' Ex. 15.) From 2003 through 2006, the achievement gap between LEP students and all students was never below 41 percentage points in English language arts, and it increased over time. Id.
In mathematics, 15% of LEP students taking the test in English passed in 2003 compared with 44% of all students, 34% in 2004 compared with 67% of all students, 35% in 2005 compared with 72% of all students, and 43% in 2006 compared with 77% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 29 percentage points in mathematics, and it generally increased. Id.
In social studies, 34% of LEP students taking the test in English passed in 2003 compared with 78% of all students, 57% in 2004 compared with 91% of all students, 53% in 2005 compared with 91% of all students, and 64% in 2006 compared with 94% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 30 percentage points in social studies, though it decreased. Id.
In science, 12% of LEP students taking the test in English passed in 2003 compared with 47% of all students, 20% in 2004 compared with 63% of all students, 29% in 2005 compared with 71% of all students, and 30% in 2006 compared with 75% of all students. Id. From 2003 through 2006, the achievement gap between LEP students and all students was never below 35 percentage points in science, and it generally increased. Id.
In 2006, at the eleventh grade level, students who had exited the program two years previously (non-LEP monitored +2) performed worse than students overall, with 75% passing the English language arts test compared to 88% for all students, 66% passing the math test compared to 77% for all students, 87% passing the social studies test compared with 94% overall, and 55% passing the science test compared with 75% overall. Id.
A student must pass all the TAKS subject areas in English before graduating; therefore, the TAKS administered to eleventh grade students is particularly important. (5 Tr. 90.) In 2005, only 13% of eleventh grade LEP students passed all the TAKS subject areas, compared with 60% for all students. (Defs.' Ex. 51.) In 2006, only 16% of eleventh grade LEP students passed all the TAKS subject areas, compared with 66% for all students — at best, an achievement gap of 47 percentage points. Id.
Although not evidence, the Court notes that recent TEA data indicates that in 2007 17% of eleventh grade LEP students passed all TAKS subject areas, compared to 70% of all students. Tex. Educ. Agency, 2007 Comprehensive Annual Report 10, available at http://www.tea.state.tx.us/research/pdfs/2007_comp_annual.pdf.
The failure rate of LEP students who retake the eleventh grade TAKS is even more alarming. All students are "given additional chances to pass [the TAKS exam] at other times in their junior and senior years[.]" (5 Tr. 89.) After the July 2006 readministration to twelfth grade students who had previously failed to pass, only 53% of LEP students passed all the TAKS subject areas, compared with 78% of students who had completed LEP programs one year previously, 82% of students who had completed LEP programs two years previously, and 90% of non-LEP students. (Defs.' Ex. 15-A; 5 Tr. 88-89.) Therefore, in order to graduate, the 47% of LEP twelfth grade students who failed are limited to two options: (1) continuing in high school if they have not completed their coursework and retaking the TAKS, or (2) if they have completed their coursework, "tak[ing] advantage of additional opportunities to better their English[,]" such as taking community college classes, and retaking the TAKS when it is offered. (5 Tr. 90.) Importantly, most of these students were not recent immigrants; only 13.1% of LEP students in 2005-2006 had been in U.S. schools for less than three full academic years. (Intvs.' Ex. 1 # 7.)
Though the 13.1% figure reflects all LEP students in Texas public schools, the data indicates that recent immigrants make up only a small percentage of the twelfth grade LEP population. Twelfth grade students who took the Texas English Language Proficiency Assessment System ("TELPAS") exam serve as a proxy for twelfth grade LEP students. (Defs.' Ex. 17 (Grade 12).) Of the 8,253 students (compared to 9,955 LEP students who retook the TAKS exit exam, (Defs.' Ex. 15-A)) who were rated under the TELPAS composite rating, 6,684 students, or 80.9%, had attended U.S. schools for four or more years. (Defs.' Ex. 17 (Grade 12).) Therefore, according to this proxy, 19.1% of twelfth grade LEP students had attended U.S. schools for less than four academic years.
As two state education officials testified, the statistics for secondary LEP students are undeniably egregious. Dr. Joe Bernal, a member of the Texas State Board of Education, testified that the test scores for the higher grade levels for the 2005-2006 term were "horribly bad." (2 Tr. 196-97.) At her deposition, Dr. Shirley Neeley, Texas's Commissioner of Education, while acknowledging other variables, stated that "[t]here's not anybody in their right mind that would say these [2005 test scores] are good scores." (Intvs.' Ex. 74 at 112-13.)
4. Long term LEP students
Measured against TEA's own timeline, LEP students fail to progress through or exit LEP programs in a reasonable time. Laura Ayala, TEA's director of English language learner assessment, testified that LEP students are expected to progress at least one language level — beginning, intermediate, advanced, and advanced high — per school year. (5 Tr. 131.) At the advanced high language level, students can meet the academic requirements of their grade level with minimal linguistic assistance, and students exit the LEP program. Id. at 103-05. Accordingly, TEA's goal is that all LEP students will leave LEP programs after three years.
In both 2005 and 2006, 32% of the students who took the TELPAS examination rated as advanced high on the TELPAS composite rating. (Defs.' Ex. 16 (Grades 3 through 12); Defs.' Ex. 17 (Grades 3 through 12).) Though nearly a third (i.e., 32%) of LEP students exited LEP programs in each of those years, the data indicates that it takes students longer than the goal set by TEA to progress through and exit the program.
In 2005, of the 32% of total students that rated at the advanced high level (approximately 105,942 of the total 331,069), approximately 60% (approximately 64,041 of the 105,942) had been in U.S. schools for four or more years. (Defs.' Ex. 16 (Grades 3 through 12).) Therefore, though 32% of students rated at the advanced high level in 2005, a significant majority of them, 60%, had been in LEP programs one, two, three, or more years longer than the three year goal set by TEA. Id.
In 2006, of the 32% of total students that rated at the advanced high level (approximately 109,370 of the total 341,780), approximately 64% (approximately 69,977 of the 109,370) had been in U.S. schools for four or more years. (Defs.' Ex. 17 (Grades 3 through 12).) Therefore, though 32% of students rated at the advanced high level in 2006, a significant majority of them, 64%, had been in LEP programs one, two, three, or more years longer than the three year goal set by TEA. Id.
As might be expected from the above data, students also are slow to advance through language levels. In 2005, 63% of LEP students in grades three through twelve who had attended schools in the United States for one year advanced at least one language level, 68% of students who had attended U.S. schools for two years advanced one language level, 64% of students who had attended U.S. schools for three years advanced one language level, 57% of students who had attended U.S. schools for four years advanced one language level, and 47% of students who had attended U.S. schools for five years or more advanced one language level. (Defs.' Ex. 16 (Grades 3 through 12).) At best, 32% (students in U.S. schools for two years) did not advance one level; at worst, 53% (students in U.S. schools for five or more years) did not advance one level. Id.
Students who attended a Texas school in the first semester of the academic year.
In 2006, 47% of LEP students in grades three through twelve who had attended schools in the United States for one year advanced at least one language level, 63% of students who had attended U.S. schools for two years advanced one language level, 70% of students who had attended U.S. schools for three years advanced one language level, 56% of students who had attended U.S. schools for four years advanced one language level, and 47% of students who had attended U.S. schools for five years or more advanced one language level. (Defs.' Ex. 17 (Grades 3 through 12).) At best, 30% (students in U.S. schools for three years) did not advance one level; at worst, 53% (students in U.S. schools for five or more years) did not advance one level. Id.
Students who attended a Texas school in the first semester of the academic year.
5. Exclusion from advanced academic achievement
LEP students also complete advanced academic courses at a far lower rate than all students. In 2002-2003, 7.8% of LEP students and 19.7% of all students completed a dual enrollment course, in which students can take a college course and get credit for both high school and college. (1 Tr. 70; Intvs.' Ex. 14 at I.9.) In 2003-2004, 8.5% of LEP students and 19.9% of all students completed a dual enrollment course. (Intvs.' Ex. 14 at I.9.)
Advanced placement courses are college level courses that are offered in high school for college credit. (1 Tr. 70.) The international baccalaureate is a program that can be offered in Spanish, French, or English and leads to a diploma recognized worldwide; it requires completion of more difficult courses that count as college credit. Id. at 71, 104. In 2003, 16.1% of all students took examinations for advanced placement or international baccalaureates, with 56% of those students receiving passing scores. (Intvs.' Ex. 14 at I.9.) In 2004, 17.4% of all students took examinations for advanced placement or international baccalaureates, with 53.9% of those students receiving passing scores. Id. However, in both 2003 and 2004, the percentage of LEP students who took the exams and of those who received passing scores is listed as "not applicable." Id.
III. Conclusions of Law
To the extent that these conclusions of law are also deemed to be findings of fact, they are hereby incorporated into the preceding findings of fact.
A. Modified Order
Intervenors' present claims are a continuation of Intervenors' 1981 action under the Modified Order. Due to the significant structural flaws in TEA's PBMAS monitoring system, Intervenors successfully prove a violation of the Modified Order, but because Defendants' violations of the EEOA are broader than those of the Modified Order, the Court's remedial action is based upon the statute, not the remedial decree.
The Modified Order directs Defendants to take affirmative steps to eliminate all remaining vestiges of the former de jure segregated school system in Texas, to prevent the recurrence of a segregated system, and to achieve fully integrated schools. United States v. Texas (LULAC), 793 F.2d 636, 642 (5th Cir. 1986). Under the Modified Order, Defendants' duties are twofold: first, Defendants are required to "act at once to eliminate by positive means all vestiges of the dual school structure throughout the state; and second, to compensate for the abiding scars of past discrimination." (Modified Order 8.)
Under these two guiding precepts, section G of the Modified Order outlines Defendants' responsibilities in the realm of curriculum and compensatory education. Section G(1) requires that Defendants "shall [ensure] that school districts are providing equal education opportunities in all schools." Id. at 13. Under § J(1) of the Modified Order, the Court retains jurisdiction "for the purposes of entering any and all further orders [that] may become necessary to enforce or modify this decree."
In Milliken II, the Supreme Court held that in fashioning desegregation decrees federal courts must follow three equitable principles: (1) the remedy must be determined by the scope of the constitutional violation; (2) the remedy must restore the victims to the place they would have occupied without the unconstitutional discrimination; and (3) the court must consider the interests of state authorities in managing their own affairs, consistent with the Constitution. Milliken v. Bradley, 433 U.S. 267, 280-81 (1977); see also Missouri v. Jenkins, 515 U.S. 70, 97-98 (1995) (reaffirming the "bedrock principle" that a district court's remedial order must flow from the constitutional violation (the first equitable principle) and must consider the interests of state and local authorities as long as they are consistent with the Constitution (the third equitable principle)). In regard to the first equitable principle, the Fifth Circuit specifically addressed the breadth of the Modified Order, holding that district courts retain "broad remedial jurisdiction over those facets of school operations [that] represent or flow from an earlier de jure discriminatory system, [though that] federal remedial jurisdiction goes only so far as the correction of the constitutional infirmity." United States v. Texas (Goodrich), 158 F.3d 299, 311 (5th Cir. 1998). Therefore, the continuing remedial breadth of the Modified Order is limited to school operations that "represent or flow" from the previous state mandated discriminatory system. Id.
Segregation on the school campus level and the progeny that flowed from it were integral facts that shaped the 1971 remedial order. The Modified Order "was issued in a suit [that] concerned the elimination of segregation . . . in the State's primary and secondary schools." United States v. Texas (LULAC), 793 F.2d 636, 643 (5th Cir. 1986). In the opinion that served as the basis for the Modified Order, this Court recounted that prior to 1954 the state had drawn district lines often enclosing a single school and often consisting of members of only one race. United States v. Texas, 321 F. Supp. 1043, 1047 (E.D. Tex. 1970). Though the number of districts was reduced by the time of the Court's ruling, the state nevertheless maintained a formula that favored small districts, and Texas failed to consolidate "all-black and educationally inferior districts into adjacent units under their jurisdiction." Id. at 1048.
As this Court found in 1981, Texas has a troubled history in regard to educating non-English speaking students. United States v. Texas (LULAC), 506 F. Supp. 405, 411-20 (E.D. Tex. 1981). In that decision, the Court summarized the three distinct forms of intentional de jure discrimination that the State of Texas had instituted: (1) children were restricted to so-called "Mexican schools;" (2) the resources provided Mexican-American children "were vastly inferior to those of their Anglo counterparts;" and (3) the native language and culture of the Mexican-American children were oppressed, in part as a vehicle to maintain their inferior position. Id. at 414. The Court found that "[i]n determining the presence of a constitutional violation, the remoteness in time of purposeful discrimination is not a viable defense. If a school system engaged in intentional discrimination on the basis of race or national origin at any time in the past, it bears an affirmative duty to eliminate all vestiges of that discrimination, root and branch." Id. at 413 (internal citations omitted). Despite this constitutional imperative, the Court found that Mexican-American children in Texas public schools continued to suffer from lingering discriminatory treatment rooted in the past. Id. at 415. Mexican-American students in Texas schools were severely behind their white counterparts in reading, far more frequently repeated grades, and dropped out of school at a high rate. Id.
Intervenors limit their present claims under the Modified Order to the basic contention that TEA failed to monitor the components of LEP programs and thereby failed to ensure that school districts were offering equal educational opportunities in schools. (Intvs.' Post-Trial Br. 7, 9, 29.) With slight semantic modification, when the Court refers to monitoring, it adopts Intervenors' definition of monitoring: "[T]he obligation of Defendants to ensure that the components of the special language programs for LEP students are actually in place in schools and classrooms and if not, that the enforcement authority of the TEA is used to [ensure] that those components are put into place." Id. at 7 (emphasis added). Intervenors allege that "defendants have functionally abandoned monitoring of the components of their bilingual and ESL program. . . ." (Intvs.' Resp. to Mot. Recons. 25; see also, Mot. Further Relief ¶ 45 (alleging that "[t]he failure to conduct meaningful on-site monitoring as described herein constitutes a violation of . . . the orders of this Court").) Citing expert opinions, Intervenors contend that only onsite monitoring can ensure compliance with the elements of bilingual and ESL programs. (Intvs.' Post-Trial Br. 9; see 1 Tr. 29-30; 2 Tr. 75.) Intervenors also claim that the PBMAS system is structurally flawed because it under-identifies LEP students; the achievement standards used for intervention are arbitrary and not based upon equal education opportunity; it masks drop-out rates of high school students by aggregating them with middle school students; personnel are not qualified to monitor bilingual-ESL programs; and the failure of individual campuses is masked by PBMAS monitoring on the district level. (Intvs.' Post-Trial Br. 29, 30-40.)
As Intervenors base their evidence and argument under G(1) of the Modified Order solely upon monitoring, the Court does not decide if the disproportionately poor academic achievement of LEP students violates section G(1)'s mandate for TEA to "ensure equal education opportunity."
This definition is equally applicable to the EEOA's implementation requirement, which asks whether the programs and practices actually used by a school are "reasonably calculated to implement effectively the educational theory adopted by the school"; that is, whether "the system follows through with practices, resources, and personnel necessary to transform the theory into reality." Castaneda v. Pickard, 648 F.2d 989, 1010 (5th Cir. 1981).
Contrary to Intervenors' contention, the Modified Order does not require onsite monitoring. Section G(1) of the Modified Order directs that TEA "shall [ensure] that school districts are providing equal education opportunities in all schools." Nothing in this language mandates that TEA employ onsite monitoring in order to satisfy the order. This Court must give latitude to the state and its agency to choose the means to effectuate their constitutional responsibilities as long as those responsibilities are satisfied consistent with the Constitution. Jenkins, 515 U.S. at 98. A modified PBMAS system that relies primarily on data collection and review, rather than onsite intervention, can pass constitutional muster.
Nevertheless, the PBMAS system in its current form fails to satisfy the mandate of the Modified Order. As Intervenors argue and the facts illustrate, there are significant and fatal flaws in the PBMAS data collection, analysis, and intervention system. As analyzed in more depth below, the Court finds significant gaps in the PBMAS system: TEA under-identifies LEP students; the achievement standards used for intervention are arbitrary and not based upon equal education opportunity; the failing achievement of higher grades is masked by passing scores of lower grades; and the failure of individual school campuses is masked by only analyzing data on the larger district level. These gaps and masking undermine the veracity of the data on which the PBMAS relies and thereby undermine the effectiveness of TEA's monitoring system. Because PBMAS is based upon this data and because the totality of the data is seriously flawed, TEA, through PBMAS in its present form, violates the directive of the Modified Order to "ensure equal education opportunities in all schools."
See infra Section III.B.3.b.ii.
TEA's other monitoring mechanisms, under NCLB and the Texas accountability rating system, which incidentally track the achievement of LEP students, do not compensate for the flaws of PBMAS. None of those methods is based upon providing equal educational opportunity, and none initiates intervention based upon a failure to provide equal educational opportunity. See Flores v. Arizona, 516 F.3d 1140, 1172 (9th Cir. 2008) (finding that fulfilling requirements of the NCLB did not fulfill the requirements of the EEOA because NCLB is not based upon providing equal education).
Because this Court has already found, and the Fifth Circuit has upheld, that TEA and the State of Texas engaged in intentional discrimination against Spanish speaking Mexican-American students who were represented by the same Intervenors, United States v. Texas (LULAC), 506 F. Supp. at 413-18, recent discriminatory intent is not required to establish liability for violation of the Modified Order. See United States v. Texas (Hearne/Mumford), 457 F.3d 472, 482-84 (5th Cir. 2006) (holding that original parties to the Modified Order whose motivations were "utterly benign" could still be subject to a remedial order, if the remedy was proportionate to the harm, but parties not subject to the Modified Order could not be exposed to liability without a finding of discriminatory intent).
Defendants' recent action and inaction flow from the remedial facts of the past de jure discriminatory system, and by their continued failure, Defendants have not met their affirmative duty to eliminate all vestiges of that discrimination. The Modified Order was issued to eliminate racial discrimination in the state's primary and secondary schools. United States v. Texas (LULAC), 793 F.2d at 643. The recent charges also flow from racial and ethnic discrimination in the state's primary and secondary schools. In the 1981 action brought by Intervenors, the Court found that TEA and the state had intentionally discriminated against Spanish speaking Mexican-Americans. United States v. Texas (LULAC), 506 F. Supp. at 414. As a result of those actions, the Court found that Mexican-American students in Texas schools were severely behind their white counterparts in reading, were far more frequently compelled to repeat grades, and dropped out of school at a high rate. Id. at 415. Today, Defendants have not upheld their affirmative duty, and the same ills — poor achievement, excessive retention rates, and excessive drop-out rates — continue to affect LEP students. The current harm to LEP students, perpetuated by TEA's actions and defaults, flows from the same remedial facts, and injuries identical to those found in 1981 persist.
However, the Court does not base its remedial action upon the enforcement powers of section J(1) of the Modified Order but upon the Equal Education Opportunity Act. As the arguments and evidence before the Court demonstrate that Defendants have violated the EEOA on broader grounds — inclusive of Defendants' violations of the Modified Order — the Court's remedial action is entirely based upon the statute rather than the remedial decree.
B. Equal Education Opportunity Act
The remaining issue is whether Defendants' administration of the state's chosen program for educating LEP students violates the Equal Education Opportunity Act, 20 U.S.C. § 1703(f) (2006). In an argument identical to the one made under the Modified Order, Intervenors first argue that Defendants fail to adequately monitor the components of the LEP program, as required by the EEOA. (Intvs.' Post-Trial Br. 4, 7, 8; Mot. Further Relief ¶¶ 63, 64); see Castaneda, 648 F.2d at 1010 (holding that in examining a violation of § 1703(f) of the EEOA, courts must determine "whether the programs and practices actually used by a school system are reasonably calculated to implement effectively the educational theory adopted by the school"). They argue that Defendants' monitoring fails because TEA does not conduct onsite monitoring and because the PBMAS system is so structurally flawed that it does not fulfill Defendants' monitoring obligations. Id. at 7, 29. In their second argument, Intervenors contend that the poor performance of LEP students in secondary schools demonstrates that Defendants' LEP education policy, though appropriate when adopted, has been unsuccessful in practice. (Intvs.' Post-Trial Br. at 17, 18-29; Mot. Further Relief ¶¶ 63, 65); see Castaneda, 648 F.2d at 1010 (holding that in examining a violation of § 1703(f) of the EEOA, courts also must examine, after an appropriate time period, if "a legitimate educational theory[,] . . . implemented through the use of adequate techniques, fails . . . to produce results indicating that the language barriers confronting students are actually being overcome . . .").
The United States also asserts that Defendants have violated section 1703(f) of the EEOA. The United States contends that Defendants' monitoring efforts are deficient in two respects. First, Defendants do not intervene in low-performing individual campuses that are located within otherwise satisfactory school districts. (U.S. Post-Trial Br. 2.) Second, the United States asserts that because Defendants have abandoned DEC's cyclical onsite visits, Defendants have no mechanism to ensure compliance with state standards for LEP programs. Id.
1. EEOA statutory text and legislative history
The Equal Education Opportunity Act requires that "[n]o State shall deny equal educational opportunity to an individual on account of his or her race, color, sex, or national origin, by . . . the failure by an educational agency to take appropriate action to overcome language barriers that impede equal participation by its students in its instructional programs." 20 U.S.C. § 1703(f).
The EEOA was a floor amendment, and therefore, it has almost no legislative history. Castaneda v. Pickard, 648 F.2d 989, 1001 (5th Cir. 1981). In light of this scarcity of evidence of congressional intent, the Fifth Circuit has held that courts should "adhere closely to the ordinary meaning of the [statute's] language." Id.
2. Intervenors' statewide, rather than district based, claims
Defendants claim that Intervenors have failed to state an EEOA claim against the statewide entities — TEA, the Commissioner of Education, and the State of Texas — as opposed to claims against individual school districts. As the State of Texas has chosen a system of shared responsibilities between state actors and local officials, the statewide entities, and TEA in particular, are subject to the requirements of the EEOA insofar as the entities have failed to fulfill their responsibilities. (See August 11, 2006 Order 7-10.)
In determining the responsibility of TEA under the EEOA, the Court is primarily guided by the Fifth Circuit's opinion reversing this Court's 1981 injunctive relief. United States v. Texas (LULAC), 680 F.2d 356 (5th Cir. 1982). As discussed in full in the Court's August 11, 2006 Order, adopted here by reference, the Fifth Circuit reasoned that by choosing the language "appropriate action" in the EEOA, "Congress left the state and local authorities substantial latitude to select programs and techniques of language remediation suitable to meet their individual goals." Id. at 374. The court found that as language problems will vary by district, "whether the effect of local language program, state-mandated or not, constitutes appropriate action to deal with language barriers faced by the students of a given school district will of necessity be an essentially local question." Id. In accord with the plain language of the EEOA, the Fifth Circuit, in effect, directed that where state law mandates that responsibilities be delegated to a district, a violation of the EEOA will be determined district by district.
However, the EEOA does not limit application of § 1703(f) to local agencies. Texas is prohibited under § 1703(f) from depriving individuals of equal educational opportunity "by . . . the failure by an educational agency to take appropriate action to overcome language barriers that impede equal participation by its students in its instructional programs." 20 U.S.C. § 1703(f) (emphasis added). The EEOA defines "educational agency" as "a local educational agency or a `State educational agency,'" and a state educational agency is "the agency primarily responsible for the State supervision of public elementary schools and secondary schools." 20 U.S.C. §§ 1720(a), 7801(41).
Based upon the Fifth Circuit's decisions in Castaneda v. Pickard, 648 F.2d 989 (5th Cir. 1981) and in Texas (LULAC), 680 F.2d 356, the Seventh Circuit held that under § 1703(f) "the obligation to take `appropriate action' falls on both state and local educational agencies. . . . [Section] 1703(f) requires that state, as well as local, educational agencies ensure that the needs of LEP children are met." Gomez v. Ill. State Bd. of Educ., 811 F.2d 1030, 1042-43 (7th Cir. 1987); see also Idaho Migrant Council v. Bd. of Educ., 647 F.2d 69, 71 (9th Cir. 1981) (holding that the EEOA "imposes requirements on the State Agency to ensure that . . . language deficiencies are addressed"). The Seventh Circuit's Gomez opinion is consistent with the Fifth Circuit's opinion in Texas (LULAC). The Fifth Circuit recognized that if a state delegated LEP responsibilities to local districts, then within those responsibilities, the districts were obligated to take appropriate action to overcome language barriers. The Seventh Circuit recognized that where powers were retained by the state or its educational agency, the state was obligated to take appropriate action to overcome language barriers. However, the Seventh Circuit warned that "State agencies cannot, in the guise of deferring to local conditions, completely delegate in practice their obligations under the EEOA; otherwise, the term `educational agency' no longer includes those at the state level." Gomez, 811 F.2d at 1043.
The intensity of judicial review of a state educational agency's obligations under § 1703(f) is necessarily focused upon the domain of that agency. Id. at 1042. As the Fifth Circuit found, "the State of Texas . . . directly educates no one; this is the work of the school districts." Texas (LULAC), 680 F.2d at 374. When a local district is involved, a court may consider conditions to the level of actual classrooms; however, when only the state agency's obligations are at issue, the court may only consider conditions within the province of the state's supervision. 20 U.S.C. §§ 1720(a), 7801(41); Gomez, 811 F.2d at 1042. In accord with its supervisory domain, a state educational agency must set guidelines for establishing language remediation programs, and it must ensure those guidelines are implemented. 20 U.S.C. §§ 1720(a), 7801(41); Gomez, 811 F.2d at 1042-43. Under the EEOA, "these general measures must constitute `appropriate action'" in order to withstand judicial review. Gomez, 811 F.2d at 1042. Accordingly, in order to determine the breadth of the Court's review of the State of Texas and TEA, the Court must determine what responsibilities are allocated to TEA and what guidelines it has set to establish Texas's language remediation programs.
In this EEOA action against the state entities, the Court must, where sufficient evidence is presented, review TEA's responsibilities where the state has delegated authority to TEA, and the Court also must review the ongoing appropriateness of the guidelines established by the state. As explained below, TEA's monitoring and enforcement functions must be reviewed under the implementation prong established in Castaneda v. Pickard, 648 F.2d 989, and the ongoing appropriateness of the state's guidelines is subject to review pursuant to that opinion's results prong. 648 F.2d at 1009-10.
3. Castaneda three prong test
The seminal case on section 1703(f) is Castaneda v. Pickard, 648 F.2d 989 (5th Cir. 1981). The plaintiffs in Castaneda, Mexican-American school children and their parents, argued that the bilingual-ESL program in the Raymondville, Texas school district violated the EEOA by failing to take "appropriate action to overcome language barriers." Id. at 1006. The plaintiffs contended "that in three areas essential to the adequacy of a bilingual program[,] curriculum, staff and testing[,] Raymondville [fell] short." Id. at 1010.
While acknowledging that Congress had provided little direction for courts interpreting the statute, the court reasoned that by requiring educational agencies to take "appropriate action to overcome language barriers" — rather than "bilingual education" or some other prescriptive measure to overcome the barriers — Congress intended to leave state and local educational authorities "a substantial amount of latitude in choosing the programs and techniques they would use to meet their obligations under the EEOA." Id. at 1009. However, because Congress obligated school systems to overcome language barriers and provided a private right of action in 20 U.S.C. § 1706, the court found that the latitude afforded state and local agencies was circumscribed by Congress's intent "to [ensure] that schools made a genuine and good faith effort, consistent with local circumstances and resources, to remedy language deficiencies of their students. . . ." Id.
The court also reasoned that because the language of § 1703(f) did not include the words "intent" or "discrimination," Congress deliberately excluded an intent requirement from § 1703(f). Id. at 1008. Accordingly, the court concluded that "the failure of an educational agency to undertake appropriate efforts to remedy language deficiencies of its students, regardless of whether such a failure is motivated by intent to discriminate against those students, would violate § 1703(f). . . ." Id.
In accord with these precepts, the court articulated a three prong test to determine the "appropriateness of a particular school system's language remediation program" under § 1703(f). Id. Courts must inquire (1) whether the language remediation program is based upon sound educational theory; (2) whether the school system is making reasonable efforts to implement that theory; and (3) whether, after a legitimate trial period, that implementation has achieved results in overcoming language barriers. Id. at 1009-10.
a. Corrections of previous errors: Flores v. Arizona, 516 F.3d 1140 (9th Cir. 2008)
A persuasive February 22, 2008 decision by the Ninth Circuit Court of Appeals, Flores v. Arizona, 516 F.3d 1140 (9th Cir. 2008), allowed the Court to perceive its previous clear and manifest errors of fact and law regarding the application of NCLB requirements to EEOA implementation and the Court's analysis of the distinct bilingual and ESL programs. Persuaded by the circuit court, this Court adopts conclusions of law from the holdings in Flores. Id.
The district court in Flores denied the motion of the state superintendent of schools and members of the state legislature ("the superintendent and legislative intervenors") to vacate the court's seven year old remedial order. Flores v. Arizona, 480 F.Supp.2d 1157, 1160, 1165-67 (D.Ariz. 2007). The court found that — contrary to the superintendent and legislative intervenors' contention — the proposed state legislation violated federal law and did not adequately fund English Language Learner ("ELL," the equivalent of LEP) instruction in accord with the implementation prong of Castaneda and the court's remedial order. Id. The court also acknowledged that ELL students in the school district were doing well but that the "success is fleeting at best, particularly as it pertains to [the district's] high school students. It is great that children in elementary and middle school are doing better however, that is not sufficient. Success must also include high school students. . . . Currently, this is not being accomplished." Id. at 1160.
The Ninth Circuit upheld the district court on appeal and expanded upon the legal bases for the lower court's ruling. The district court's opinion in Flores was brief in order to accommodate the scheduled adjournment of the state legislature, but the Ninth Circuit further elucidated and analyzed the evidence presented at the hearing. 480 F.Supp.2d at 1160; 516 F.3d at 1154. In order to be released from the district court's order, the superintendent and legislative intervenors had the burden to demonstrate that the basic factual premises had changed — including proving that due to ELL students' improved achievement, additional funding was no longer necessary — or that the legal landscape had been altered to satisfy the requirements of the court's order. Flores, 516 F.3d at 1168, 1169-70.
In a ruling particularly relevant to the instant action, the Ninth Circuit concluded that, as a matter of fact, ELL students' achievement had not changed to the degree necessary to eliminate the need for additional ELL funding. Id. at 1170. The court recognized that it did not "have data that conclusively demonstrate[d] whether ELL programs ultimately succeed — that is, whether children pass through [ELL programs] rapidly and ultimately perform as well as non-ELL students." Id. at 1156; see also id. at 1170 (noting that the "data is limited"). Despite this and other caveats, the court found, based upon the achievement test scores of ELL students, that the superintendent and legislative intervenors had not met their burden of establishing changed circumstances. Id. at 1155-56, 1170; see also id. at 1155, n. 21 (acknowledging that "[s]tandardized test scores do not . . . provide a full measure of a school's successes and failures. . . . But test scores do provide us with at least a rough sense of relative performance, and so are useful here").
In its factual analysis of the evidentiary hearing, the Ninth Circuit thoroughly explicated the failures of ELL students statewide and in the district at issue. In terms of statewide test scores, the court noted that though Arizona students passed the state standardized test at rates between sixty and seventy percent in math and reading, ELL students lagged far behind. Id. at 1156. For ELL third graders statewide in 2005, only 50% passed the math exam and only 40% passed the reading exam. Id. As in Texas, the court found that "the situation grows worse in higher grades"; in 2005, only 33% of ELL tenth graders passed the math exam, only 20% in 2006, and in reading, only 30% passed in 2005, and barely more than 10% passed in 2006. Id. Moreover, the court recounted that, though it did not have data, witness testimony demonstrated that ELL students statewide did not leave the program rapidly; many ELL students required ELL instruction for more than two years, some for more than three. Id.
On the district level, though lacking longitudinal data on ELL students' progress through the program that would demonstrate the success or failure of ELL programs, the court recounted that "test results . . . show the same problems that appear in the statewide data." Id. at 1157. The court summarized the results as "relative success at lower grades (although not equal to that of English speaking students within NUSD), and increasing failures for older students, a significant majority of whom are failing the state's basic achievement tests." Id. at 1158. The court noted one bright spot in the data but added a caveat due to the limitations of the data available:
For all grades in 2005-06, reclassified ELL students were doing about as well as native English speakers. . . . But as the data on such scores does not track individual students, showing when they passed through ELL programs and how long it took them to do so, and because the reclassification methodology continues to shift, this bright spot does not offset the otherwise troubling ELL test data. On the data available, it is possible that some high achievers may rapidly be leaving ELL programs while other students continue to struggle, never achieving at the same levels as non-ELL students. Indeed, [a witness] testified that reclassification . . . takes, on average, four to five years and the district court so found. The encouraging success of reclassified students is therefore of limited significance with regard to the overall impact of [the] ELL program. Id. at 1159.
The superintendent and legislative intervenors also argued that the adoption of NCLB was a legal change that made compliance with NCLB sufficient to satisfy the EEOA. Id. at 1172. The Ninth Circuit was unpersuaded and recognized "the distinct purposes of the EEOA and NCLB: The first is an equality-based civil rights statute, while the second is a program for overall, gradual school improvement." Id. The court continued,
NCLB . . . packages federal grants with discrete, incremental achievement standards as part of a general plan gradually to improve overall performance. It does not deal in the immediate, rights-based framework inherent in civil rights law, although it is intended to ameliorate over the longer haul the conditions that lead to civil rights violations. Perhaps recognizing as much, Title III of NCLB explicitly provides that "[n]othing in this part shall be construed in a manner inconsistent with any Federal law guaranteeing a civil right." 20 U.S.C. § 6847.
The EEOA is just such a rights-enforcing law. It requires states "to ensure that needs of students with limited English language proficiency are addressed," Idaho Migrant Council v. Bd. of Educ., 647 F.2d 69, 71 (9th Cir. 1981), by requiring them to remove barriers to equal participation in educational programs now rather than later, and it provides students with a right of action to enable them to enforce their rights, see 20 U.S.C. § 1706; Los Angeles NAACP v. Los Angeles Unified Sch. Dist., 714 F.2d 946, 950-51 (9th Cir. 1983). The EEOA's concerns, in other words, lie fundamentally with the current rights of individual students, while NCLB seeks gradually to improve their schools. An individual student whose needs are not being met under the EEOA need not wait for help just because, year after year, his school as a whole makes "adequate yearly progress" towards improving academic achievement overall, including for ELL students. Id. at 1173.
As discussed infra, the holdings in Flores persuade the Court that it committed clear and manifest error in its factual finding that it was compelled to consider the "panoptic results" of LEP students in all grades rather than considering the achievement of primary and secondary students separately. (July 30, 2007 Op. 28-29.)
Previously, the Court reasoned that though bilingual and ESL education are distinct educational theories, the secondary ESL program is merely the latter stage of a comprehensive LEP educational effort beginning with a bilingual elementary program. Id. at 29. That finding was clear and manifest error on its face. As Intervenors identified, the Court contradicted itself in its July 30 opinion, at once finding, "[b]ilingual programs are distinct from ESL programs" and that "[b]ilingual programs are also implemented differently from ESL programs under state law" and in the same opinion finding, "[t]he program for secondary students[, which is an ESL program,] is not separate and distinct from the elementary school program[, which is a bilingual program]." (Intvs.' Mot. Amend 13; July 30, 2007 Op. 10, 29.) As the Court stated in its previous findings of fact before misconstruing the facts in its analysis, the Court finds and concludes unequivocally that the bilingual program used in elementary schools and the ESL program used in secondary schools — each employing different educational theories and implemented differently by TEA — are distinct and must be analyzed as such.
Also contrary to the Court's previous conclusion, and irrespective of whether the students are taught under different educational theories, the Court finds that the fact that more LEP students are in primary school rather than secondary school (July 30, 2007 Op. 29) is not dispositive in an action brought under an equal rights statute, designed to protect "the current rights of individual students." Flores, 516 F.3d at 1173; see also Gomez, 811 F.2d at 1034 (finding that 5,185 LEP students out of 38,364 were being denied equal educational opportunities). As the district court in Flores explained, "[i]t is great that children in elementary and middle school are doing better however, that is not sufficient. Success must also include high school students. . . . Currently, this is not being accomplished." Flores, 480 F. Supp. 2d at 1160. The Ninth Circuit did not question this conclusion, describing the conditions as "relative success at lower grades (although not equal to that of English speaking students within NUSD), and increasing failures for older students, a significant majority of whom are failing the state's basic achievement tests." Flores, 516 F.3d at 1158. A majority of the individual LEP students in secondary schools are failing, despite TEA's twenty-five-year trial. As the Flores opinions made clear to this Court, under a statute guaranteeing the civil rights of individual students, the fact that one segment of the LEP population is marginally succeeding does not eviscerate the rights of another segment of LEP students who are failing across the board.
Fortunately, the purpose of both Federal Rules of Civil Procedure 52(b) and 59(e) is to allow courts to correct such clear and manifest errors. Templet, 367 F.3d at 479; Fontenot, 791 F.2d at 1219. This correction of factual findings further alters the Court's application of the second, implementation prong and the third, results-based prong of Castaneda.
Throughout its July 30, 2007 vacated opinion, the Court often intertwined the implementation prong, prong two, and the results-based prong, prong three, of Castaneda. For instance, the Court misconstrued Intervenors' arguments as an amalgamation of prong two and prong three: "Citing `miserable' LEP student academic achievement as proof, Intervenors contend that over the last [twenty-five] years, TEA has abandoned monitoring, enforcing, and supervising school districts to ensure compliance with . . . Texas's bilingual/ESL program." (July 30, 2007 Op. 14; see also id. at 1, 5, 17.) The Court erroneously combined prongs two and three of Intervenors' Castaneda argument; as explained infra, LEP student academic failure is a prong three, results-based inquiry, not a prong two, implementation inquiry, which would include terms such as monitoring, enforcement, and supervision. Despite this confusion in articulating Intervenors' arguments, the Court analyzed Intervenors' claims under the appropriate prongs. (See July 30, 2007 Op. 21, 27, §§ II.C.1, 2 (addressing prong two and prong three respectively).)
Intervenors mildly contributed to the Court's confusion by employing the term "evaluate" — a term seemingly akin to terms such as monitor and supervise, which are integral to the implementation prong — when referring to alleged prong three, results-based violations. (See July 30, 2007 Op. 31 ("It is entirely unclear from Intervenors' argument exactly what they seek regarding `evaluation.'").) Any confusion should have been minimal because upon contextual analysis Intervenors clearly argue that Defendants should have evaluated and modified their failing language remediation program in secondary schools. (See Intvs.' Post-Trial Br. 17 (arguing that Defendants failed to evaluate and change the LEP program in secondary schools and citing prong three of Castaneda).) The prolonged failure of the program for secondary students is a violation of prong three, and evaluation and modification of the program is Intervenors' suggested, albeit premature, remedy. In all other respects, Intervenors' court filings consistently and clearly articulate their prong two and prong three arguments (see Mot. Further Relief 10, ¶¶ 36-62, 64 (raising prong two violations; section entitled "Evisceration of Monitoring and Enforcement" and prayer to implement monitoring system); id. at 8, ¶¶ 26-35, 65 (raising prong three violations; section entitled "Failure of Defendant's Program for LEP students" and prayer for change to LEP program for secondary students); Intvs.' Post Trial Br. 8, §§ III, V (citing prong two of Castaneda and arguing that because of insufficient monitoring Defendants have failed to actually implement their language program); id. at 17, § IV (citing prong three of Castaneda and arguing that Defendants' secondary LEP programs have failed)).
Despite the relative clarity of Intervenors' arguments, Defendants' Post-Trial Brief significantly contributed to the Court's confusion. Defendants erroneously allege that Intervenors complain only "that TEA has ceased to monitor and enforce the [bilingual-ESL] program[,]" a prong two violation. (Defs.' Post-Trial Br. 2; see also id. at 3.) Perhaps most emblematic of Defendants' inarticulate differentiation between the prongs was their claim that despite Intervenors' objection to TEA's focusing its monitoring "on student performance rather than on compliance with procedures. . . . [T]he plaintiffs devoted a major portion of their trial presentation to the performance of LEP students in the upper grades, as a basis for arguing against TEA's current system." Id. at 13. To the contrary, Intervenors first claim that the PBMAS is unlawful because it is based upon performance data rather than onsite monitoring, and Intervenors make a separate argument that LEP performance data demonstrates that the bilingual-ESL program has failed prong three. Nevertheless, Defendants address Intervenors' prong three results-based argument in Sections III.D and E. (Defs.' Post-Trial Br. 15-26.)
b. Prong one: Sound educational theory
Courts must first determine if the language remediation program is based upon sound educational theory: whether a school "is pursuing a program informed by an educational theory recognized as sound by some experts in the field or, at least, deemed a legitimate experimental strategy." Id. at 1009. There is no dispute that Defendants' bilingual and ESL programs are sound in theory.
c. Prong two: Implementation
Courts next must inquire if the school system is making a reasonable effort to implement that theory: whether "the programs and practices actually used . . . are reasonably calculated to implement effectively the educational theory adopted by the school"; that is, whether "the [school] system follows through with practices, resources, and personnel necessary to transform the theory into reality." Id.
Three necessary, but non-exclusive, elements of program implementation are adequate evaluation of LEP student progress, adequate remedial education, and qualified personnel. Id. at 1010. While addressing a school district's failure to test LEP students in their native language, the Castaneda court found that "[P]roper testing and evaluation is essential in determining the progress of students involved in a bilingual program and ultimately, in evaluating the program itself." Id. at 1014. The court also determined that LEP students must have sufficient remedial education to overcome academic deficits incurred while learning English: "[i]f no remedial action is taken to overcome the academic deficits that limited English speaking students may incur during a period of intensive language training, then the language barrier, although itself remedied, might, nevertheless, pose a lingering and indirect impediment to these students' equal participation in the regular instructional program." Id. at 1011. Though § 1703(f) does not delineate whether education in subjects other than English should occur simultaneously with or subsequent to English instruction, the statute does
impose on educational agencies not only an obligation to overcome the direct obstacle to learning[,] which the language barrier itself poses, but also a duty to provide limited English speaking ability students with assistance in other areas of the curriculum where their equal participation may be impaired because of deficits incurred during participation in an agency's language remediation program. Id. The court also concluded that teachers charged with educating LEP students must be qualified. Id. at 1012-13.
Intervenors argue that Defendants fail the implementation prong because TEA does not conduct onsite monitoring and because the PBMAS system is so structurally flawed that it does not fulfill Defendants' monitoring obligations. (Intvs.' Post-Trial Br. 7, 29.) The United States asserts that Defendants fail the implementation prong because TEA does not perform cyclical onsite visits and because TEA does not intervene in low-performing individual campuses that are located within otherwise satisfactory school districts. (U.S. Post-Trial Br. 2.)
Though courts must respect the allocation of power between state and local authorities, TEA, as a state agency, must comply with the EEOA for those responsibilities allocated to it under state law. The state has established guidelines for implementation of its language remediation programs. Texas has adopted a policy to "ensure equal educational opportunity to every student," and to achieve this end, the state has established bilingual-ESL programs in public schools. Tex. Educ. Code §§ 29.051, 29.053(d). In addition, the state established guidelines for bilingual and ESL program content and instructional methods, id. at § 29.055; facilities and class sizes, id. at § 29.057; and bilingual and ESL teacher certification, id. at § 29.061. Under Texas law, TEA is required to "administer and monitor compliance with education programs required by federal or state law" and is required to evaluate and monitor the effectiveness of the state's LEP programs through PBMAS. Id. at §§ 7.021(b)(1), 29.062(a). TEA is also required to monitor bilingual education and special language programs in the areas of "program content and design," "program coverage," "identification procedures," "classification procedures," "staffing," "learning materials," "testing materials," "reclassification of students," and TEA must monitor LPACs. Id. at § 29.062(b)(1)-(9). TEA also has enforcement powers over districts and schools to ensure compliance with state standards. Id. at § 29.056.
See supra Section III.B.2.
i. Onsite monitoring is not required by the EEOA
The EEOA, like the Modified Order, does not require onsite monitoring. Nothing in the language of the EEOA or in Castaneda requires periodic onsite monitoring. All that is required is that the educational agency "follows through with practices, resources, and personnel necessary to transform the [educational] theory into reality." Castaneda, 648 F.2d at 1010. Just as an educational agency can choose when to administer remedial education, Texas and TEA can initially entrust local authorities with implementing the curriculum and use an appropriate data monitoring system to ensure implementation. Id. at 1011. However, once failure on the local level is evident, because of its state-mandated enforcement powers and responsibilities to administer, evaluate, and monitor LEP programs, TEA must take further appropriate action. Tex. Educ. Code §§ 7.021(b)(1), 29.056, 29.062(a). Under the PBMAS system adopted by the state, this action will most likely be onsite intervention to correct failures in implementation. (Defs.' Ex. 5, 6.)
Even if the state had not allocated these responsibilities to TEA, TEA would, nevertheless, be required to take further appropriate action because it cannot abdicate the responsibility to rectify local failures to those same failing local authorities. Gomez, 811 F.2d at 1043.
ii. The PBMAS system is flawed
Though onsite monitoring is not required by the EEOA, TEA must still effectively implement the LEP program. Effective implementation includes effective monitoring of the progress of LEP students and ultimately of the program itself. Castaneda, 648 F.2d at 1014. Since at least 1995, TEA has failed to conduct appropriate monitoring of the state's LEP program. From 1995 to 2003, the DEC cyclical onsite monitoring system repeatedly failed to review LEP programs in numerous school districts. In 2003, PBMAS replaced DEC, and although a data-based monitoring system could constitute appropriate action, PBMAS, in its current form, is fatally flawed in its data collection, data analysis, and intervention systems. PBMAS under-identifies LEP students; the achievement standards used for intervention are arbitrary and not based upon equal education opportunity; monitors are not qualified; the failing achievement of higher grades is masked by passing scores of lower grades; and the failure of individual school campuses is masked because data is analyzed only on the larger district level. In a monitoring system such as PBMAS, the reliability of the data on which the system is based should be paramount. (See Defs.' Ex. 1 at 3 ("The Performance-Based Monitoring System relies on evaluation of performance and program effectiveness data at the state level; therefore, data integrity is critical.").) Actions at every level of PBMAS are based upon data; if the data is seriously flawed, then the actions at every level will also be seriously flawed. Even data that is collected accurately can be distorted if it is analyzed in a manner that overextends its explanatory breadth or if the data's explanatory power is compromised by unreasonable aggregation. Because PBMAS is based upon this data and because the totality of the data is seriously flawed, PBMAS, in its present form, does not constitute appropriate action to transform the educational theory into reality.
See supra Section II.A.3.a.
(a) PBMAS monitoring and intervention is not equality based
The bilingual-ESL indicators do not compare the performance of bilingual, ESL, or LEP students to the performance of English proficient students. (Intvs.' Ex. 1 # 35.) Instead, to determine the performance level for bilingual or ESL students within a district, bilingual-ESL indicators are compared to the target passage rates that are assigned by the state, without reference to equal educational opportunity requirements. (Defs.' Ex. 2 at 7; Defs.' Ex. 3 at 7.) TEA assigns each district a performance level based upon the deviation of the district's passage rates from these target standards. For LEP students, the performance level ranges from 0, if the LEP students in a district meet or exceed the accountability standard, to performance level 3, if LEP students perform 10 percentage points or more below the standard. (Defs.' Ex. 3 at 7.)
See supra Part II.A.3.b.ii.(b).
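By way of illustration only, the mechanics of the performance-level assignment described above can be sketched in a few lines of code. The record supplies only the endpoints — level 0 for meeting or exceeding the accountability standard and level 3 for falling ten or more percentage points below it (Defs.' Ex. 3 at 7) — so the intermediate cut-points, the function name, and the sample figures in this sketch are assumptions, not TEA's actual values.

    # Hypothetical sketch of the PBMAS performance-level assignment described
    # above. Only the level-0 and level-3 thresholds appear in the record
    # (Defs.' Ex. 3 at 7); the level-1 and level-2 cut-points are assumed.
    def performance_level(district_pass_rate: float, target_rate: float) -> int:
        """Map a district's LEP pass rate to a PBMAS performance level (0-3)."""
        shortfall = target_rate - district_pass_rate  # points below the target
        if shortfall <= 0:
            return 0  # meets or exceeds the accountability standard (record)
        if shortfall >= 10:
            return 3  # ten or more points below the standard (record)
        return 1 if shortfall < 5 else 2  # assumed intermediate bands

    # Example: LEP students pass at 52% against an assumed 60% target.
    print(performance_level(52.0, 60.0))  # prints 2 under the assumed bands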
Such indicators may be effective in facilitating compliance with the incremental improvements required for all students for NCLB, but the indicators are inadequate as a basis for intervention to ensure the current rights of individual students as required by the EEOA. Flores, 516 F.3d at 1173. Under the current system, a district could achieve a performance level of 0 where LEP students are barely meeting the passage rate, while non-LEP students exceed the passage rate standard by a significant margin. (See U.S. Post-Trial Br. 9.) Such disparity should be examined under effective implementation. Yet, this scenario would require no intervention by TEA, and LEP students would be denied equal educational opportunity without hope of redress. Intervention, monitoring, and implementation under the EEOA must, instead, be based upon achieving equality. The indicators and interventions must be based upon comparison of the achievement of LEP students and non-LEP students. Defendants appear to recognize this, stating that over a three year period in reading and math, LEP sixth graders had "cut the gap between the LEP students and all test takers almost in half." (Defs.' Post-Trial Br. 20.) Of course, change in LEP achievement will not be immediate. Therefore, the indicators that spur intervention do not have to be based upon LEP students achieving absolute equality with non-LEP students in every year. But the indicators should be based upon decreasing the margin of achievement between LEP and non-LEP students. As an example, not a mandate, a performance level of 0 in a particular subject could be achieved by a narrowing of the achievement gap by 1% a year and 5% over three years.
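The gap-narrowing example above can likewise be reduced to a short, hedged sketch. The one-point-per-year and five-points-over-three-years figures come from the Court's illustration; the data, the function name, and the decision to return only levels 0 and 3 are assumptions made solely for demonstration.

    # Sketch of the illustrative (not mandated) equality-based indicator: level
    # 0 if the LEP/non-LEP achievement gap narrows by at least 1 point in the
    # latest year and at least 5 points over three years. Figures are invented.
    def gap_indicator(gaps: list[float]) -> int:
        """gaps: yearly LEP vs. non-LEP pass-rate gaps in points, oldest first."""
        narrowed_this_year = gaps[-2] - gaps[-1]
        narrowed_three_years = gaps[-4] - gaps[-1]
        if narrowed_this_year >= 1.0 and narrowed_three_years >= 5.0:
            return 0  # the gap is closing at the illustrative pace
        return 3      # assumed: otherwise flag the district for intervention

    # A district whose gap fell from 40 to 34 points over three years passes.
    print(gap_indicator([40.0, 38.0, 36.0, 34.0]))  # prints 0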
Even comparisons of LEP students with all students are inaccurate in a data-based system such as PBMAS because the all-students category includes LEP students. LEP students almost always perform worse than non-LEP students; therefore, aggregating data for LEP students with non-LEP students in the all-students category lowers performance in the all-students category, skewing the margin of achievement lower than the actual margin between LEP students and non-LEP students.
Though better achievement by non-LEP students could increase the margin with LEP students, under the EEOA, LEP students are entitled to the same increase in achievement as non-LEP students, and until the achievement deficit is eliminated, LEP students, over a span of years, must improve their achievement by an even greater percentage than non-LEP students.
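The arithmetic of this skew is easy to see with invented figures: because LEP scores are folded into the all-students average, the LEP-versus-all-students margin understates the true LEP-versus-non-LEP margin. The pass rates and the LEP share below are hypothetical.

    # Worked arithmetic, with invented figures, showing how including LEP
    # students in the all-students category shrinks the apparent margin.
    lep_rate, non_lep_rate = 40.0, 80.0  # assumed pass rates (percent)
    lep_share = 0.15                     # assumed LEP share of all test takers

    all_rate = lep_share * lep_rate + (1 - lep_share) * non_lep_rate
    print(all_rate)                  # 74.0: pulled down by the LEP scores
    print(non_lep_rate - lep_rate)   # 40.0: the true LEP vs. non-LEP margin
    print(all_rate - lep_rate)       # 34.0: the understated LEP vs. all margin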
(b) Monitors are not certified in bilingual-ESL education
At the time of trial, TEA had too few bilingual-ESL certified monitors to conduct any interventions, let alone to conduct onsite visits. The director of PBMAS, Dr. Laura Taylor, testified that at the time of trial, though there were eleven monitors available to monitor bilingual-ESL programs, PBMAS had no monitoring staff certified in bilingual-ESL education. (4 Tr. 5.) One bilingual-ESL certified monitor was scheduled to begin within a week of Taylor's October 23, 2006 testimony. (4 Tr. 38.) The Fifth Circuit in Castaneda held that the school district had not adequately implemented the LEP program because the teachers were not qualified Spanish speakers. 648 F.2d at 1012-13. Similarly, those who monitor failing LEP programs must have the requisite bilingual-ESL qualifications in order both to understand the problems confronted in LEP education and to be able to offer appropriate solutions. In its current form, PBMAS is involved in two levels of intervention: desk audits and onsite monitoring. During desk audits, PBMAS staff review the continuous improvement plans submitted by the districts, specifically focusing on whether the plans adequately target the causes of performance failures identified in the focused data analysis. (3 Tr. 175, 181-84; Defs.' Ex. 8 at 1-2.) Though no onsite visits had occurred by the end of trial, Dr. Taylor testified that monitors will visit individual campuses and observe classroom instruction, and the visits "will be focused on student achievement, on issues related to what might be contributing to performance concerns, such as issues related to instructions and curriculum and those kinds of things." (4 Tr. 42-43.) In order for desk audits to determine if districts' plans adequately target performance failures and in order for onsite monitoring to determine if failures are caused by instruction and curriculum, the PBMAS employees, or others, who undertake these tasks must be certified in bilingual-ESL education. Without certified staff, unqualified monitors would be intervening in failing LEP programs; effectively, the blind would be leading the blind.
(c) School districts are under-identifying LEP students
TEA does not verify data or monitor school districts where significant statistical data indicates that school districts are likely under-identifying LEP students. In 2005-2006, on a statewide basis, 4.9% of LEP students were reported as receiving parental denials for participation in bilingual and ESL programs. (Intvs.' Ex. 1 # 46.) However, in 2005-2006, some school districts reported denial rates five or more times the statewide average. (Intvs.' Ex. 4.) Despite this alarming discrepancy, Defendants "do not, independent of the Districts, verify parental denials." (Intvs.' Ex. 1 # 48.) Prong two was designed to prevent such apathy in implementation. Districts are likely under-identifying LEP students, and TEA has not verified LEP identification in these suspect districts, undercutting the reliability of the data employed by the PBMAS system. Unidentified LEP students are classified as non-LEP, and their achievement on standardized tests and other indicators distorts the achievement of non-LEP students downward, shrinking the apparent gap between non-LEP and LEP students and thereby distorting the indicator. Where, as here, there are discrepancies in the data, TEA, at least, must verify the accuracy of the data by analyzing, and if necessary correcting, the data collection process used by the districts.
See supra Part II.A.3.b.ii.(a).
TEA has a data integrity system that covers ten percent of its districts at a time. (Defs.' Ex. 1 at 3; 5 Tr. 33.) But if TEA fails to act upon obvious discrepancies, such as the extreme rate of parental denials in some districts, the effectiveness of this safeguard must be questioned.
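A minimal sketch suggests how simple a screen for such discrepancies could be. The 4.9% statewide denial rate comes from the record (Intvs.' Ex. 1 # 46); the five-fold flagging multiplier echoes the disparity described above, and the district names and rates are invented for illustration.

    # Sketch of a data-integrity screen for parental-denial outliers. The
    # statewide rate is in the record; the multiplier and district figures
    # below are assumptions for illustration only.
    STATEWIDE_DENIAL_RATE = 4.9  # percent, 2005-2006
    FLAG_MULTIPLIER = 5.0        # some districts reported 5x+ the statewide rate

    denial_rates = {"District A": 3.8, "District B": 26.5, "District C": 5.2}
    flagged = [d for d, rate in denial_rates.items()
               if rate >= FLAG_MULTIPLIER * STATEWIDE_DENIAL_RATE]
    print(flagged)  # ['District B']: verify its identification data independently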
(d) Aggregation across grades distorts data
The manner in which PBMAS aggregates data unreasonably distorts the explanatory function of the indicators and thereby undermines the indicators' evaluative purpose. See Castaneda, 648 F.2d at 1009, 1014 (holding that "the practices actually used" must be "reasonably calculated to implement effectively the educational theory" and that accurate testing and evaluation is necessary to monitor the program). Many of the achievement test indicators are derived from the sum of TAKS scores across multiple grade levels, which masks the suspect performance of secondary LEP students. (Intvs.' Ex. 1 # 38; Defs.' Ex. 4 at 23-38.) Similarly, the LEP dropout rate indicator is based upon combined drop-out rates from middle school grades 7 and 8 and high school grades 9-12, which Defendants' witnesses testified distorted the data because a vast majority of drop-outs are in high school, not middle school. (Intvs.' Ex. 81 at 71:9-10, 24-25; 72:1; 150:22; 151:6; see also Intvs.' Ex. 1 # 39, # 40; 2 Tr. 200.) Both of these aggregations across primary and secondary grades unreasonably distort the data and undermine the integrity of PBMAS in violation of the implementation prong of Castaneda.
(e) Masking on the campus level
Moreover, PBMAS analysis of bilingual-ESL indicators on the district, rather than campus, level masks schools failing to meet the state's goals and directly impedes intervention in these failing schools. Based upon an analysis of individual school campuses using PBMAS performance definitions, 277 schools attended by 54,963 LEP students performed at a level that would require a stage of intervention beyond that triggered by district-wide data analysis. (Intvs.' Ex. 99, at 6; 5 Tr. 170.) Of these 277 schools, 248 were middle or high schools, and 48,069 LEP students attended these middle and high schools. Id. Without intervention, over fifty thousand LEP students are denied equal educational opportunity.

Of course, as Defendants argue, some masking is inevitable in a data-based system: "a high performing campus could still `mask' a low performing class and a high performing class could `mask' a low performing student." (Defs.' Post-Trial Br. 26.) This is a slippery slope on which the Court need not tread because the standard, of course, is reasonableness. See Castaneda, 648 F.2d at 1009 (practices must be "reasonably calculated" to implement the educational theory). A state agency cannot reasonably monitor directly the progress of a class of approximately thirty children or the progress of an individual child. But a state agency can reasonably monitor individual campuses, particularly when the data demonstrates that otherwise the failing campuses would deny the failing students equal educational opportunity. See Gomez, 811 F.2d at 1034 (holding that the state agency had denied equal educational opportunities to 5,185 LEP children who attended school campuses with less than twenty LEP students).
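The same weighted-average arithmetic underlies both the cross-grade aggregation criticized above and the campus-level masking at issue here. In the following sketch — in which the target, enrollments, and pass rates are all invented — a district clears the accountability standard overall even though one of its campuses falls ten or more points below it; substituting grades for campuses yields the cross-grade analogue.

    # Invented figures showing district-level aggregation masking a failing
    # campus; swapping grades for campuses gives the cross-grade analogue.
    target = 60.0  # assumed accountability standard (percent passing)
    campuses = {   # campus: (LEP test takers, LEP pass rate)
        "Elem. A": (800, 68.0),
        "Elem. B": (700, 66.0),
        "H.S. C": (500, 42.0),
    }

    district_rate = (sum(n * r for n, r in campuses.values())
                     / sum(n for n, _ in campuses.values()))
    print(district_rate >= target)  # True: district-level review sees no failure

    masked = [c for c, (_, r) in campuses.items() if r <= target - 10.0]
    print(masked)  # ['H.S. C']: a campus ten-plus points below the standard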
The requirement that TEA must monitor on the school campus, rather than district, level is equally compelling when analyzed through the requirements of the Modified Order. Section G(1) requires that TEA "shall [ensure] that school districts are providing equal education opportunities in all schools." The subsequent sentence in § G(1) clarifies that TEA must ensure equal education on the campus level: "The Texas Education Agency . . . shall assist school districts in achieving a comprehensive balance[d] curriculum on all school campuses. . . ." Moreover, the discrimination that the Modified Order remediated involved segregation on the individual campus level, where originally district lines enclosed a single school and where, later, the state's formula favored small districts that perpetuated all-black inferior schools. Texas, 321 F. Supp. at 1047-48. The Modified Order's remedial power continues to flow from these remedial facts, and therefore, TEA must ensure that constitutional violations do not persist on the campus level. As TEA already collects data on the campus level, though it does not analyze it, and because TEA can retain the core of PBMAS merely by expanding it to the campus level, the Court has deferred to the state's prerogative in administering the system, tailoring its remedial order to rectify only the shortcomings that flow from the original constitutional violations.
Moreover, PBMAS monitors already directly intervene on individual campuses within failing districts. Dr. Taylor, director of PBMAS, testified that after determining which districts require onsite intervention, PBMAS monitors "will focus on [—] based on our disaggregation of data [—] the campuses at which we're able to discern that performance issues exist so that we need to focus our efforts on [improving] student performance in those particular areas." (4 Tr. 42.) Defendants, therefore, willingly intervene in the campuses that they discover are contributing to the failures on the district level, but they resist intervening on campuses that are also failing but that are masked because data is aggregated only on the district level. Such deliberate oversight is exactly the type of implementation failure that prong two of Castaneda was designed to prevent. TEA is liable under the EEOA for its state-mandated responsibilities. TEA establishes the standards to evaluate and monitor the effectiveness of the state's LEP programs. Tex. Educ. Code §§ 7.021(b)(1), 29.062(a). TEA also enforces those standards. Id. at § 29.056. When the responsibility is allocated to TEA, as Texas's statutes and TEA's actions indicate, the EEOA protects the rights of fifty thousand LEP students from the failures of TEA's implementation.
iii. Other monitoring systems and PBMAS's nascent development do not remedy its shortcomings under the EEOA
Other TEA monitoring — through PBMAS, NCLB, and the Texas accountability rating system — that incidentally tracks the achievement of LEP students does not compensate for the flaws of PBMAS. These other monitoring programs are not based upon providing equal education opportunities and do not initiate intervention based upon failure to provide equal education opportunities. See Flores, 516 F.3d at 1172 (recognizing "the distinct purposes of the EEOA and NCLB: The first is an equality-based civil rights statute, while the second is a program for overall, gradual school improvement"). TEA stages interventions separately for each program monitored by PBMAS, and the agency has not intervened in districts that have substandard achievement across programs and has not developed any substantive guidelines to identify such districts. (4 Tr. 63-64.) A school or district's failure to achieve adequate yearly progress under NCLB does not mean that the bilingual-ESL program will be investigated. (5 Tr. 44.) The accountability rating system does not disaggregate student performance for LEP students, does not initiate further action based upon the failure of LEP students as a disaggregated group, and does not hold schools or districts accountable for failure to comply with standards and regulations governing LEP education. (3 Tr. 199, 205-06, 209-12, 216, 219.) The data that is incidentally collected on LEP students as part of larger disaggregated groups or programs through these other monitoring initiatives is not applied to ensuring equal education opportunities for LEP students and is therefore inapposite to the analysis of the implementation prong.

That the PBMAS system is new and is evolving based upon public and agency comment does not absolve the system from meeting the immediate, rights-based demands of the EEOA. The evolutionary nature of the relatively new PBMAS is commendable, but it does not excuse a failure to monitor LEP programs immediately. TEA recognized that even though PBMAS was evolving, "[t]he state's annual oversight responsibility for monitoring must continue while developing a new monitoring system." (Defs.' Ex. 1 at 1.) Defendants' argument that PBMAS, as a new system, must be given time to demonstrate its effectiveness is also misplaced. The results-based third prong of Castaneda permits a program to be "employed for a period of time sufficient to give the plan a legitimate trial" before judging whether the results indicate language barriers are being overcome. Castaneda, 648 F.2d at 1010. The second prong of Castaneda does not permit a delay in implementation; as TEA acknowledges, monitoring and implementation must be ongoing. Id.; (Defs.' Ex. 1 at 1). An educational agency cannot fail to effectively implement its programs merely because one of its methods of implementation is new. Because the EEOA is a statute based upon individual rights, those rights must be protected immediately.
iv. Remedial education
The implementation prong also requires that educational agencies provide remedial education in all subjects. Though § 1703(f) does not delineate whether education in subjects other than English should occur simultaneously with or subsequent to English instruction, the statute does require that educational agencies overcome language barriers and also educate LEP students in the other areas of the curriculum. Castaneda, 648 F.2d at 1011. The evidence does not suggest that Defendants have failed to implement remedial programs. However, the results of student performance indicate failure in all of the core subjects in secondary schools, and therefore, the Court will address the failing results of remedial education under the third, results-based prong of Castaneda.
d. Prong three: Results
i. Results-based inquiry and limitations of standardized tests
Under the third prong, a court must determine whether the program has achieved results: if the program, after a legitimate period of implementation, "fails . . . to produce results indicating that the language barriers confronting students are actually being overcome, that program may, at that point no longer constitute appropriate action. . . ." Id. In a footnote, the Castaneda court recognized the difficulty of employing achievement test scores to judge the success of a language remediation program:
achievement test scores of students should not be considered the only definitive measure of a program's effectiveness in remedying language barriers. Low test scores may well reflect many obstacles to learning other than language. We have no doubt that the process of delineating the causes of differences in performance among students may well be a complicated one. Id. at 1015, n. 14.
The Castaneda court warns only that test scores are often difficult to interpret and that differing performance on standardized test scores could have multiple causes. Other courts have been similarly wary of judging the success or failure of a program based upon achievement scores, but, constrained in part by a scarcity of other data, courts usually rely upon achievement scores despite their limitations. Daunted by the task of measuring the success or failure of educational programs, the court in Teresa P. by T.P. v. Berkeley Unified School District, 724 F. Supp. 698, 715 (N.D. Cal. 1989), reasoned that, as it was "surely beyond the competence of this Court to fashion its own measure of academic achievement, . . . the Court will necessarily defer to the measuring devices already used by the school system." The court examined the classroom grades and achievement scores — which the court noted are often subject to variables such as socio-economic status — and found that LEP students were performing as well or better than their non-LEP counterparts. Id. at 715-16, 716 n. 2. Accordingly, the court held that the district's program had succeeded in removing language barriers. Id. at 716. Recently, the Ninth Circuit relied exclusively on achievement test scores to uphold a district court's finding that factual circumstances had not changed since the issuance of a remedial order. Flores, 516 F.3d at 1155-56, 1170. In so holding, the circuit court acknowledged that "[s]tandardized test scores do not . . . provide a full measure of a school's successes and failures. . . . But test scores do provide us with at least a rough sense of relative performance, and so are useful here." Id. at 1155, n. 21.
As the Castaneda court indicated, there are a multitude of indicators, other than achievement scores, of a program's success or failure. 648 F.2d at 1015, n. 14. For instance, in dicta, one district court noted that "two very significant indicators of failure in achieving the objective of equal educational opportunity for LEP children" were increased drop-out rates after students exited LEP status and the school system's use of simplified English handouts for LEP students instead of more robust English language text books. Keyes v. School District No. 1, Denver, Colo., 576 F. Supp. 1503, 1519 (D. Colo. 1983). In the instant action, in addition to achievement scores, the Court has data on drop-out rates, retention rates, achievement scores of students after they have exited LEP programs, and the length of time LEP students remain in language programs.
ii. The Court's previous causation error
In footnote 14, the Castaneda court cautioned against relying solely upon test scores, implying that other indicators may demonstrate success or failure. Despite the Castaneda court's limited warning concerning only well-known shortcomings in the reliability of test scores, this Court committed clear and manifest error by wandering into the realm of general causation, whose specter was not raised by the Castaneda court or any subsequent court: "The sole evidence Intervenors rely on is aggregate student performance data pertaining to the entire State of Texas. . . . Intervenors have failed to establish that it is the program that bears responsibility, as opposed to a confluence of countless other potential factors." (July 30, 2007 Op. 32; see also Defs.' Post-Trial Br. 23 (arguing "[m]any possible reasons, other than failure to remove language barriers, could account for why the older LEP students currently are not experiencing as much success as those in grade school").) Because reading extraneous causation into Castaneda's prong three requirements carries unintended significance, the Court now addresses and corrects its previous clear and manifest error pursuant to Federal Rule of Civil Procedure 52(b).
Based upon the precedent of Castaneda and other court decisions addressing prong three, the Court's legal reasoning was error. According to the EEOA, educational agencies must take "appropriate action to overcome language barriers that impede equal participation. . . ." 20 U.S.C. § 1703(f). By using the broad term "appropriate action" Congress gave educational agencies leeway to choose how to overcome language barriers, "by including an obligation to address the problem of language barriers . . ., Congress . . . must have intended to [ensure] that schools made a genuine and good faith effort . . . to remedy the language deficiencies of their students and deliberately placed on federal courts the difficult responsibility of determining whether that obligation had been met." Castaneda, 648 F.2d at 1009. Based upon this reading of the plain language of the statute, the Fifth Circuit developed the three prong test, including prong two — ensuring "that schools ma[k]e a genuine and good faith effort . . . to remedy . . . language deficiencies" — and prong three — determining whether the obligation to address the problem of language barriers has been met. Id. at 1009-10. Under prong three, educational agencies' obligation can only be met if the language deficiencies of students are being remedied. Sufficient evidence of student failure therefore can establish that educational agencies have not met their obligation.
As discussed above, other courts have recognized the limitations of standardized test scores, but none have required plaintiffs in a § 1703(f) action to prove or to disprove potential alternative causes. See, e.g., Berkeley Unified School District, 724 F. Supp. at 715, 716 n. 2 (recognizing that performance on standardized tests could be influenced by socio-economic factors but not addressing causation).
Other courts suggest that sufficient evidence of LEP student performance can demonstrate program success or failure. The court in Berkeley Unified School District found that the district had satisfied prong three based upon standardized achievement scores and classroom grades that indicated LEP students were doing as well or better than their non-LEP counterparts. 724 F. Supp. at 715-16. In finding that the program produced results indicating language barriers were being overcome, the court recognized the difficulties of measuring achievement, but found "that the best evidence of a sound and effectively implemented program lies in the results that it achieves." Id. at 716. Though in Berkeley Unified School District the students' results were favorable, the court's finding that the best evidence of a sound program is the results it achieves applies equally to determinations of success and of failure.
Another court's reasoning also suggests that sufficient evidence of student failure can establish program failure. The district court in Keyes found that drop-out rates and simplified text books were "very significant indicators of failure in achieving the objective of equal educational opportunity for LEP children." 576 F. Supp. at 1519. This reasoning suggests that evidence of student failure, e.g., drop-out rates, combined with an unintended admission by the educational agency that LEP students were not overcoming language barriers, e.g., simplified textbooks, can demonstrate failure of the program to provide equal educational opportunity. Id.
In reaching its erroneous conclusion, this Court also misstated the factual record because Intervenors presented much more than "aggregate student performance data." Secondary LEP students across the board not only failed to perform at the level of their non-LEP peers on achievement tests, but also dropped out of school at significantly higher rates; had significantly higher retention rates; and remained in LEP programs for four or more years, without making adequate yearly progress. That the Court did not consider the multitude of indicators in its finding of fact was error in itself; that the Court amplified this error by misstating that Intervenors had presented only a paucity of evidence was undeniably clear and manifest reversible error that is now being corrected through Federal Rule of Civil Procedure 52(b).
Examining such nebulous factors as social and economic background as potential primary causes of LEP student failure is a task fraught with hazard. Too often, apologists pursue such ulterior causes to extenuate prejudice. In contrast, Congress passed the EEOA under the authority of § 5 of the Fourteenth Amendment and enacted it in the shadow of government-endorsed discrimination that frequently perpetuated social inferiority and economic depression toward the end of racial oppression. 20 U.S.C. § 1702; Castaneda, 648 F.2d at 1008, n. 9. With this past and its present incarnations looming, Congress established § 1703(f) to provide educational opportunity to non-English speaking children, requiring educational agencies "to take appropriate action to overcome language barriers that impede equal participation. . . ." 20 U.S.C. § 1703(f). By omitting an intent requirement, Congress focused on the results of actions, undertaken by educational agencies, to overcome language barriers. See Castaneda, 648 F.2d at 1001, 1007-08 (comparing § 1703(f) with § 1703(d) — which requires discriminatory intent, not merely disparate impact — and finding that § 1703(f) does not require intent). Congress included the obligation to overcome language barriers, without limitation; by doing so, "Congress also must have intended to [ensure] that schools made a genuine and good faith effort . . . to remedy the language deficiencies of their students and deliberately placed on federal courts the difficult responsibility of determining whether that obligation had been met." Castaneda, 648 F.2d at 1009 (emphasis added). To overburden § 1703(f) plaintiffs with disproving extraneous causation would pervert this plain language of the statute.
With that precept in mind, the Court will not attempt to broadly define the standard of causation, if any, for failures of LEP programs under prong three of Castaneda. Instead, the Court holds, consistent with precedent, that sufficient evidence of student failure can itself prove program failure. In any event, as discussed infra, the evidence of prolonged failure of secondary LEP students is so overwhelming on a multitude of indicators that it narrows potential causes of student failure to the educational program's failure. Based upon the same evaluative tools used by TEA, the clear failure of secondary LEP students unquestionably demonstrates that, despite its efforts, TEA has not met its obligation to remedy the language deficiencies of Texas students. Castaneda, 648 F.2d at 1009.
iii. The marginal success of primary LEP students in bilingual programs
First, the following data for LEP students in all grades must be understood with the realization that, contrary to conventional wisdom, in 2005-2006 only 13.1% of LEP students had not attended United States schools for at least three years; that is, 86.9% of LEP students had attended United States schools for three or more years. (Intvs.' Ex. 1 # 7.) Second, elementary LEP students are educated in bilingual education, and secondary students are educated in distinct ESL programs. Tex. Educ. Code § 29.053(d); (1 Tr. 33; 3 Tr. 46; Intvs.' Ex. 57.)
This fact directly contradicts Defendants' contention that Texas school personnel have had insufficient time to assist LEP students, particularly in the higher grades. (Defs.' Post-Trial Br. 24.) Moreover, Defendants allege that the task of teaching English "grows more daunting after the early childhood years" without citing expert opinion or authority. Despite popular belief, this concept is subject to significant debate and, at the least, cannot be accepted by the Court without evidence. See, e.g., Barry McLaughlin, Myths and Misconceptions about Second Language Learning: What Every Teacher Needs to Unlearn (Nat'l Ctr. for Research on Cultural Diversity and Second Language Learning, 1992), available at http://eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/13/30/c2.pdf.
In grades kindergarten through six, the percentage margin between LEP retention rates and other students' retention rates gradually increased from 0.9% in 1994-1995 to 2.1% in 2003-2004. (Intvs.' Ex. 41 at 39.) By 2003-2004, LEP students were retained at a rate, 4.6%, more than one and a half times that of other students, 2.5%. Id.
See supra Section II.B.2.
For the sake of clarity and economy, the Court will briefly revisit the standardized test performance of the third and sixth primary grades. Third grade LEP students performed better than any other LEP grade on the TAKS standardized test. From 2003 through 2006, the achievement margin between third grade LEP students taking the TAKS in English and all students steadily decreased from the 2003 margins of 18% in reading and 12% in mathematics to the 2006 margins of 8% in reading and 7% in mathematics. (Defs.' Ex. 15.) The margin between third grade LEP students taking the TAKS in Spanish and all students did not decrease as rapidly, from the 2003 margins of 14% in reading and 17% in mathematics to the 2006 margins of 13% in reading and 13% in mathematics. Id. Under the most pertinent all-tests measurement, the margin between third grade LEP students taking the test in English and all students was 14% in 2005 and decreased slightly to 12% in 2006. (Defs.' Ex. 51.) The margin between all students and LEP students taking the TAKS in Spanish was 22% in 2005 and was unchanged in 2006. In 2006, third grade students who had exited the program two years previously (non-LEP monitored +2) performed better than all students, by a positive 6% margin in reading and by a positive 8% margin in mathematics. (Defs.' Ex. 15.) However, as the Ninth Circuit recognized, this data may not be as impressive as it first appears because, as the long-term LEP data indicates, many students will remain in the program for more than four years; it may be that only the brightest students are exiting the program rapidly while many more languish in LEP education. Flores, 516 F.3d at 1159.
See supra Section II.B.3.a for a description of performance in all elementary grades. See also Appendix.
See supra Section II.B.3.a.i; see also Appendix.
The all tests standard measures the percentage of students passing all subject areas of the TAKS test. (Intvs.' Ex. 1 # 8.) The all tests category for student achievement is the most accurate category to compare the performance of LEP students and all students because to overcome language barriers, LEP students must eventually be on par with their peers in all subject areas. (5 Tr. 103-05.) Moreover, all students must pass all the subject areas on the TAKS test in English in order to graduate. (1 Tr. 64; Defs.' Ex. 3 at 33.)
See supra Section II.B.4.
Sixth grade LEP students performed less well on TAKS. From 2003 through 2006, the achievement margin between sixth grade LEP students taking the TAKS in English and all students decreased from the 2003 margins of 45% in reading and 33% in mathematics to the 2006 margins of 27% in reading and 25% in mathematics. (Defs.' Ex. 15.) The margin between sixth grade LEP students taking the TAKS in Spanish and all students increased in reading and decreased slightly in mathematics, from the 2003 margins of 11% in reading and 32% in mathematics to the 2006 margins of 27% in reading and 27% in mathematics. Id. Under the most pertinent all-tests measurement, the margin between sixth grade LEP students taking the test in English and all students was 38% in 2005 and decreased slightly to 33% in 2006. (Defs.' Ex. 51.) The margin between all students and LEP students taking the TAKS in Spanish was 26% in 2005 and increased slightly to 28% in 2006. In 2006, sixth grade students who had exited the program two years previously (non-LEP monitored +2) performed better than all students, by a positive 3% margin in reading and a positive 2% margin in mathematics. (Defs.' Ex. 15.) Though certainly encouraging, this data should be tempered by the realization that many LEP students remain in LEP programs for over four years. See Flores, 516 F.3d at 1159.
See supra Section II.B.3.a.iv; see also Appendix.
See supra Section II.B.4.
The performance of primary LEP students in bilingual education programs is not overwhelming. LEP students in the primary grades are not advancing on pace with their peers: LEP students are retained at significantly higher rates than their all-student peers, and the disparity in retention rates has gradually increased since 1994. Encouragingly, primary LEP students have started to narrow the margin with all students on TAKS. Former LEP students also have had remarkable success two years after exiting the program, though the data may be distorted by a few high achievers. These mixed results are diminished by the fact that TEA enacted the current program a quarter of a century ago. See Texas (LULAC), 680 F.2d at 372 (recounting the enactment of the 1981 Bilingual and Special Language Programs Act). In that light, the fact that, in 2006, the margin between sixth grade LEP students taking the test in Spanish and all students remains at 28% in the all-tests category and that only 50% of sixth grade LEP students passed all the tests is not an endorsement of the program's success. (Defs.' Ex. 51.) Nevertheless, because of the bilingual program's recent success in decreasing the margin of performance, the Court will defer to the state for the time being. However, the Court recognizes that it has perhaps set the bar unreasonably low in order to defer to the state; if the upward trend, narrowing the performance margin, does not continue, the Court may be inclined to revisit its ruling upon a party's motion.
iv. The failure of secondary LEP students in ESL programs
LEP secondary students drop out of school at a rate at least twice that of the all-students category. In 2003-2004, for students in grades seven through twelve, LEP students dropped out at an annual rate of 2.0%, twice the rate for all students, 0.9%. (Intvs.' Ex. 1 # 11.) For students who would have graduated with the class of 2004, 16.3% of LEP students dropped out of school statewide compared with 3.9% of all students. (Intvs.' Ex. 1 # 12.) For what would have been the class of 2005, only 55.2% of LEP students graduated with their class, whereas 84% of all students graduated with their class. (Intvs.' Ex. 38 at 145.)
For grades seven through twelve, the margin between non-LEP and LEP student retention rates was consistently wide, beginning at 6.2% in 1998-1999 and ending at 7.5% in 2003-2004. By 2003-2004, LEP students were retained at a rate, 13.8%, more than double that of other students, 6.3%. (Intvs.' Ex. 41 at 42.)
See supra Section II.B.2.
Again, for the sake of clarity and economy, the Court will briefly revisit the standardized test performance of the eighth and eleventh secondary grades. As a preface, two state education employees admitted that the performance of secondary students is less than acceptable. (2 Tr. 196-97; Intvs.' Ex. 74 at 112-13.) Eighth grade LEP students perform much worse on the TAKS in English than their all-student peers and show little, if any, improvement over time. From 2003 through 2006, the achievement margin between eighth grade LEP students taking the TAKS in English and all students remained practically stable in reading and mathematics and slightly decreased in social studies, from the 2003 margins of 52% in reading, 36% in mathematics, and 43% in social studies, to the 2006 margins of 51% in reading, 38% in mathematics, and 37% in social studies. (Defs.' Ex. 15.) In the only year it was reported, 9% of LEP students passed the TAKS science exam in English, compared with 52% of all students, a margin of 43%. Id. Under the all-tests measurement, the margin between eighth grade LEP students taking the test in English and all students was 44% in 2005 and increased slightly to 46% in 2006. (Defs.' Ex. 51.) In 2006, eighth grade students who had exited the program two years previously (non-LEP monitored +2) performed worse than all students, by 5% in reading, 9% in mathematics, 5% in social studies, and 19% in science. (Defs.' Ex. 15.) Again, even this data of underperformance for non-LEP monitored +2 students should be tempered by the realization that many students remain in LEP programs for over four years. See Flores, 516 F.3d at 1159.
See supra Section II.B.3.b for a description of performance in all secondary grades. See also Appendix.
See supra Section II.B.3.b.iv.
The TAKS test is only administered in Spanish in grades three through six. (5 Tr. 54.) In later grades, the test is administered to some LEP students through linguistic accommodated testing ("LAT"), which allows students "to have additional linguistic accommodations[, such as a bilingual dictionary,] that helps them to better understand the language on the test so that they can show what their actual knowledge of the skills are." (5 Tr. 55-56.) The Castaneda court found that testing in core courses must be in a student's own language:
The progress of limited English speaking students in these other areas of the curriculum must be measured by means of a standardized test in their own language because no other device is adequate to determine their progress vis-a-vis that of their English speaking counterparts. . . . Only by measuring the actual progress of students in these areas during the language remediation program can it be determined that such irremediable deficiencies are not being incurred. 648 F.2d at 1014. TEA's director of LEP student assessment admitted that for LEP students "it may be hard to get at what your true academic knowledge and skills are if you're taking a test in English." (5 Tr. 56.) Defendants also assert that "no student with limited English proficiency has ever passed the TAKS exam in English and none ever will. [Because] [w]hen a student passes all TAKS test in English[,] . . . it indicates that the `LEP' label is no longer valid." (Defs.' Post-Trial Br. 23.) This sentiment adds credence to the Castaneda court's requirement that in order to analyze the success of remedial education in core classes, students must be examined in their native language, where they can demonstrate their aptitude in the subject matter, not in the English language. Though the Court is skeptical of the effectiveness of LAT, as insufficient evidence has been presented to the contrary, the Court defers to TEA's contention that, through LAT, LEP students can be tested "in a comparable way to other students[.]" (5 Tr. 56.) If effective, as the Court so finds, LAT fulfills the native language testing requirement of Castaneda.
See supra Section II.B.3.b.ii; see also Appendix.
See supra Section II.B.4.
The performance of eleventh grade LEP students on standardized tests is significantly worse than their all-student peers, and the magnitude of this failure is multiplied because students must pass the TAKS in English in order to earn their high school diploma. (5 Tr. 90.) From 2003 through 2006, the achievement margin between eleventh grade LEP students taking the TAKS in English and all students increased in all subjects other than social studies, from the 2003 margins of 41% in language arts, 29% in mathematics, 44% in social studies, and 35% in science, to the 2006 margins of 53% in language arts, 34% in mathematics, 30% in social studies, and 45% in science. (Defs.' Ex. 15.) In 2006, eleventh grade students who had exited the program two years previously (non-LEP monitored +2) performed worse than all students, by 13% in reading, 11% in mathematics, 7% in social studies, and 20% in science. (Defs.' Ex. 15.)
See supra Section II.B.3.b.v; see also Appendix.
The all-tests measurement is particularly important for eleventh grade LEP students because students must pass all subjects on the TAKS in English in order to graduate. (5 Tr. 90.) The margin between eleventh grade LEP students taking the test in English and all students was 47% in 2005 and increased slightly to 50% in 2006. (Defs.' Ex. 51.) After the July 2006 readministration to twelfth grade students who had previously failed, only 53% of LEP students passed all the TAKS subject areas, compared with 78% of students who had completed LEP programs one year previously, 82% of students who had completed LEP programs two years previously, and 90% of non-LEP students. (Defs.' Ex. 15-A; 5 Tr. 88-89.) Importantly, only 13.1% of LEP students in 2005-2006 had been in U.S. schools for less than three full academic years. (Intvs.' Ex. 1 # 7.)
Those among the 47% of LEP students who failed the TAKS (the complement of the 53% who passed) but who have completed their coursework must exit the school system and turn to a community college, for which they may have to pay and which likely does not provide the language education they need in order to earn a degree. For these students, the opportunity to earn a degree is unlikely, and equal educational opportunity has unquestionably been denied.
Students remain in LEP programs longer, and sometimes significantly longer, than TEA's three year recommendation. In 2006, for instance, 64% of students who exited the LEP program by achieving advanced high English proficiency had been in LEP programs for four, five, or more years. (Defs.' Ex. 17 (Grades 3 through 12).) The Ninth Circuit, though lacking statistical data, found that the poor performance of LEP students on standardized tests was exacerbated by the slow progress of students through the LEP program; many LEP students required LEP instruction for more than two years, some for more than three. Flores, 516 F.3d at 1156. The court also cautioned that the success of LEP students after exiting the district's program should be tempered by witness testimony that "reclassification . . . takes, on average, four to five years. . . ." Id. at 1159. This Court, unlike the Ninth Circuit, does not lack longitudinal data; in both 2005 and 2006, at least 60% of the students who were reclassified as non-LEP students had been in the state's bilingual-ESL program for four or more years. (Defs.' Ex. 16 (Grades 3 through 12); Defs.' Ex. 17 (Grades 3 through 12).) This is a failure of the state's LEP program under TEA's own three-year benchmark, and it undercuts some of the success of those students who have exited the program, because most students languish in the program for years before exiting.
See supra Section II.B.4.
Secondary LEP students in the state's ESL program fail terribly under every metric. Secondary LEP students drop out of school at a rate at least twice that of the all-student categories. Secondary LEP students are retained at rates consistently double those of their peers. Secondary LEP students consistently perform worse than their peers by a margin of 40% or more on the TAKS all-tests category, and the performance gap in individual subjects generally increased over time. Even non-LEP monitored +2 students lag behind all students in the secondary grades. As with the primary grades, the prolonged duration of LEP students in LEP programs potentially indicates that the performance of former LEP students represents the failure of the majority and the success of a few. Contrary to Defendants' sentiment, a 47% failure rate for eleventh and twelfth grade LEP students demonstrates that the system is indeed failing to overcome language barriers. (Defs.' Post-Trial Br. 26.) Defendants have had a quarter century to demonstrate they are overcoming language barriers at the secondary level, and the data demonstrates consistent and continued failure to fulfill this difficult, but necessary, responsibility.
v. Exclusion from advanced academic achievement
Contrary to the EEOA, LEP students complete dual enrollment, advanced placement, and international baccalaureate courses at much lower rates than all students. The EEOA prohibits TEA from denying equal educational opportunity through "failure . . . to take appropriate action to overcome language barriers that impede equal participation by its students in its instructional programs." 20 U.S.C. § 1703(f). The plain language of the statute, which the Court must closely follow in light of the scarcity of evidence of congressional intent, Castaneda, 648 F.2d at 1001, indicates that TEA must take appropriate action with regard to its instructional programs. The advanced academic courses are part of TEA's instructional programs, and therefore TEA must take appropriate action to overcome language barriers in those programs.
See supra Section II.B.5.
Under Texas law, TEA "must administer and monitor compliance with education programs required by federal law. . . ." Tex. Educ. Code § 7.021(b)(1). TEA has a legislated goal to increase the percentage of students taking and passing advanced placement and international baccalaureate examinations. (Intvs.' Ex. 59 at III-1.) Even if TEA did not have the responsibility to directly implement advanced academic achievement programs under state law, the agency cannot abdicate its responsibility to rectify the failures of local authorities. Gomez, 811 F.2d at 1043.
The failure of local and state authorities in achieving equal participation in advanced academic achievement courses for LEP students is clear. In 2004, LEP students completed dual enrollment courses at less than half the rate of all students, 8.5% compared to 19.9% of all students. (Intvs.' Ex. 14 at 1.9.) In 2003 and 2004, while all students took advanced placement or international baccalaureate qualifying examinations at rates of 16.1% and 17.4% and passed those examinations at over a 50% rate, the participation of LEP students was listed as "not applicable." Id.
TEA does not design advanced placement examinations, and it does not control what dual enrollment courses are offered at local colleges. The Court cannot impose on TEA requirements beyond its control. However, TEA oversees a comparable alternative, at least for Spanish and French speakers, in the international baccalaureate, which achieves the same results as other advanced academic courses: rigorous course work and college credit. (1 Tr. 71, 104.) Moreover, even though TEA does not design advanced placement tests and therefore cannot develop qualifying examinations in other languages, LEP students may be able to participate in some advanced placement courses that do not require a high level of English proficiency. TEA must take appropriate action to achieve equal participation in advanced academic courses, whether through the international baccalaureate or other appropriate programs.
vi. The totality of data establishes causation
The Court holds that sufficient evidence of student failure can establish that educational agencies have not met their obligation to overcome language barriers. The failure of secondary LEP students under every metric clearly and convincingly demonstrates student failure and, accordingly, the failure of the ESL secondary program in Texas.
Defendants erroneously argue that "[t]he raw numbers of academic success and failure reveal nothing about what steps the school took or failed to take to produce the result." (Defs.' Post-Trial Br. 23.) First, this argument is misapplied to prong three of Castaneda because the results-based prong does not seek to determine what steps an educational agency took or failed to take; the prong merely examines the results of the program. Nevertheless, though the examination is not necessary under Castaneda, the "raw numbers," when compared and contrasted with each other, conclusively demonstrate that the failure of the LEP program caused the failure of the LEP students. Wary of the perilous track on which even the well-intentioned have perpetuated apologies and prejudice, the Court will demonstrate how the available data narrows the cause of the student failure to the failure of the ESL program.
See supra III.B.3.d.ii.
Even without the data of secondary LEP student performance on the TAKS, which may be partially distorted because the tests were not conducted in students' native language, the totality of the data conclusively demonstrates that the failure of the ESL program caused the failure of secondary LEP students. First, the success of non-LEP monitored +2 students in primary schools, in contrast to the continued failure of non-LEP monitored +2 students in secondary schools, diminishes the import of potential social and economic causes of failure. Primary LEP students exist in the same general social and economic conditions as secondary LEP students on a statewide basis; economic and social conditions therefore do not explain the disparity between the success of non-LEP monitored +2 primary students and the continued failure of non-LEP monitored +2 secondary students. As a further control, all non-LEP monitored +2 students from grades three through eleven have had the same time after exiting the LEP program (two years) to develop their English beyond the advanced high level required for exiting. As the non-LEP monitored +2 data suggests, the improvement of primary LEP students and the continued stagnation of secondary LEP students demonstrate that social and economic factors, which are reasonably consistent for LEP students statewide across all grades, are not the culprit. The cause of the difference in achievement is the difference in programs: the bilingual program in the primary grades and the ESL program in the secondary grades.
See supra note 50.
While acknowledging its potential limitations, the secondary LEP TAKS performance data further solidifies the causal link between the ESL program's failure and secondary student failure. Even with linguistically accommodated testing, secondary LEP TAKS performance may not accurately depict the knowledge of secondary LEP students because the students may not be able to demonstrate their knowledge in a language that, by definition, they do not fully understand. However, this sentiment is not entirely supported by the data. Instead of achieving at higher levels, primary LEP students who took the test in Spanish had mixed results in comparison to their LEP counterparts who took the exam in English. In any event, the margins between secondary LEP performance and the performance of their peers are so great that, even accounting for the inaccuracy of the test in English, the substantial gap in achievement demonstrates a significant and continued failure of secondary students in comparison to the gradual, marginal improvement of primary students. These students, who share economic and social backgrounds with their primary counterparts, have failed to diminish the margin of achievement with their peers over time.
See supra note 50.
See supra Section II.B.3.a.i. This discrepancy may be the result of failing remedial education for primary LEP students taking the TAKS in Spanish: those students have less developed English skills than LEP students taking the TAKS in English, may not fully comprehend the portion of daily instruction delivered in English, and are not given adequate remedial education to compensate.
The retention margin between LEP students and their peers, which widens from the primary to the secondary grades, also demonstrates the failure of the ESL program. In 2003-2004, the margin between retention rates for LEP students and non-LEP students was 2.1% in the primary grades and 7.5% in the secondary grades. This is a significant difference, and because students are retained for failure to comprehend and pass their subject material, it further demonstrates that the ESL remedial program is not providing the same educational opportunity as its bilingual counterpart.
See supra Section II.B.3.a; see also Appendix.
If changing social factors during students' high school years contributed to increased rates of failure, the data for all students at the high school level would likely also demonstrate a similar marginal increase in student failure in secondary schools over primary schools. In 2006, 78% of sixth grade students in the all-students category passed all tests, and 66% of eleventh grade students did so. (Defs.' Ex. 51.) This difference of 12 percentage points represents a 15% deviation (12/78) between sixth grade and eleventh grade student achievement. In contrast, 45% of sixth grade LEP students taking the TAKS in English passed all tests in 2006, and 16% of eleventh grade LEP students did so. Id. This difference of 29 percentage points represents a 64% deviation (29/45) between sixth grade and eleventh grade LEP student achievement. Though the reliability of TAKS results for LEP students tested in English is questionable, both sixth grade and eleventh grade LEP students were subject to similar difficulties. The comparison of deviations between all students (15%) and LEP students (64%) indicates that even if some of the performance drop-off is attributable to changed social influences in secondary schools, the degree of the decline cannot be explained by that factor. Instead, the degree of LEP failure in secondary schools further indicates that the change from bilingual education to ESL education is the primary culprit.
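Restated as an illustrative computation (a restatement of the figures above, introduced only for exposition and not as an additional finding), the deviation is the percentage-point drop from sixth to eleventh grade divided by the sixth grade pass rate:

\[
\text{deviation} = \frac{P_{6} - P_{11}}{P_{6}}
\]
\[
\text{all students: } \frac{78\% - 66\%}{78\%} \approx 15\%, \qquad \text{LEP (English): } \frac{45\% - 16\%}{45\%} \approx 64\%
\]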
See supra Sections II.B.3.b.iv, b.v; see also Appendix.
vii. Intervenors do not have the burden to demonstrate an alternative program
Defendants state that, "though they bear the burden of proof, the plaintiffs have failed to demonstrate that any alternative method of monitoring produces superior results." (Defs.' Post-Trial Br. 18.) Castaneda does not require plaintiffs to produce alternative methods under any prong, and such an approach would be antithetical to the Fifth Circuit's admonition to defer to the expertise of educational officials. TEA and Texas, not the Court or Intervenors, have the responsibility to overcome language barriers among LEP students on a statewide basis; the Court must defer to the agency's and the legislature's political solutions. Intervenors need only demonstrate that TEA is failing under the implementation prong or the results prong in order for the Court to mandate a remedy.
Defendants also suggest that Intervenors had the burden to identify more effective programs in other states. (Defs.' Post-Trial Br. 18.) As discussed above, Intervenors do not bear that burden. But even if all other states had results similar to Texas's, Texas would not be excused from modifying its LEP program in order to overcome language barriers. If the opposite were true, no state would ever have had to attempt to overcome language barriers; states could merely point across their borders to the equal failures of their sister states. Instead, through the EEOA, Congress placed the burden on states to correct the barriers often established, and unquestionably perpetuated, by state action. Moreover, Intervenors need not go out of state to demonstrate a successful program, as the bilingual education program employed by Defendants in the lower grades has had marginal success at overcoming language barriers. (Intvs.' Post-Trial Br. 29.) Finally, Intervenors have suggested that more extensive remedial training may be necessary to correct the shortcomings in the ESL program. (Intvs.' Post-Trial Br. 28 n. 16.) Though the ongoing effectiveness of an ESL program has yet to be established, revamped and significantly more substantial remedial efforts may satisfy the Court that TEA has sufficiently changed its failing program.
IV. Conclusion
Recognizing the stagnation of LEP secondary students in comparison to their non-LEP counterparts, Defendants note that "Plaintiffs have not suggested that English proficient students should be held back so that LEP students can catch up." (Defs.' Post-Trial Br. 20, n. 15.) The EEOA is a civil rights statute whose goal is equality. Defendants are correct; it would be unjust to hold non-LEP students back in order to achieve equality. However, Defendants' statement demonstrates what they have failed to consider after a quarter century: under the EEOA, it is equally unjust to perpetually fail to provide the resources and LEP programs necessary to ensure that LEP students "catch up." The palpable injustice is equivalent whether it comes from depriving non-LEP students or from depriving LEP students.
The PBMAS system does not fulfill TEA's requirement to effectively implement the LEP program. This failure does not excuse failing results on the secondary level. After a quarter century of sputtering implementation, Defendants have failed to achieve results that demonstrate they are overcoming language barriers for secondary LEP students. Failed implementation cannot prolong the existence of a failed program in perpetuity.
Defendants must soon rectify the monitoring failures and begin implementing a new language program for secondary LEP students. As a nonbinding option, the secondary LEP program could consist of a variation of the current ESL program with substantially enhanced remedial education. The Court recognizes the difficult position of Defendants and the ongoing nature of this task. The Court will defer to Defendants and their course of action as much as possible, but the Court must ensure the rights of LEP students under the EEOA. With this in mind, demonstrations of good faith by Defendants will be looked upon favorably.
APPENDIX

All Tests (percent passing all TAKS tests taken)

Grade     Year   LEP English   LEP Spanish   All Students   English Gap   Spanish Gap
Three     2005   62%           54%           76%            -14%          -22%
          2006   65%           55%           77%            -12%          -22%
Four      2005   49%           56%           70%            -21%          -14%
          2006   55%           63%           74%            -19%          -11%
Five      2005   19%           13%           55%            -36%          -42%
          2006   28%           16%           64%            -36%          -48%
Six       2005   31%           43%           69%            -38%          -26%
          2006   45%           50%           78%            -33%          -28%
Seven     2005   16%           N/A           60%            -44%          N/A
          2006   18%           N/A           65%            -47%          N/A
Eight     2005   14%           N/A           58%            -44%          N/A
          2006   12%           N/A           58%            -46%          N/A
Nine      2005   13%           N/A           56%            -43%          N/A
          2006   16%           N/A           57%            -41%          N/A
Ten       2005   6%            N/A           40%            -34%          N/A
          2006   8%            N/A           50%            -42%          N/A
Eleven    2005   13%           N/A           60%            -47%          N/A
          2006   16%           N/A           66%            -50%          N/A

Grade Three (percent passing, by subject)

Subject       Year   LEP English   LEP Spanish   Non-LEP +2   All Students
Reading       2003   63%           67%           N/A          81%
              2004   77%           78%           N/A          88%
              2005   78%           74%           95%          89%
              2006   81%           76%           95%          89%
Mathematics   2003   62%           57%           N/A          74%
              2004   75%           68%           N/A          83%
              2005   72%           67%           89%          82%
              2006   75%           69%           90%          82%

Grade Four (percent passing, by subject)

Subject       Year   LEP English   LEP Spanish   Non-LEP +2   All Students
Reading       2003   49%           59%           N/A          76%
              2004   60%           66%           N/A          81%
              2005   58%           69%           88%          79%
              2006   63%           76%           91%          82%
Mathematics   2003   49%           48%           N/A          70%
              2004   64%           62%           N/A          78%
              2005   68%           64%           90%          81%
              2006   72%           69%           93%          83%
Writing       2003   53%           82%           N/A          78%
              2004   73%           88%           N/A          88%
              2005   80%           87%           96%          90%
              2006   83%           90%           97%          92%

Grade Five (percent passing, by subject)

Subject       Year   LEP English   LEP Spanish   Non-LEP +2   All Students
Reading       2003   32%           51%           N/A          67%
              2004   34%           60%           N/A          73%
              2005   37%           60%           70%          75%
              2006   48%           65%           79%          80%
Mathematics   2003   40%           37%           N/A          65%
              2004   47%           44%           N/A          73%
              2005   58%           44%           80%          79%
              2006   63%           47%           83%          81%
Science       2003   10%           6%            N/A          39%
              2004   22%           20%           N/A          55%
              2005   31%           23%           59%          64%
              2006   46%           31%           74%          75%

Grade Six (percent passing, by subject)

Subject       Year   LEP English   LEP Spanish   Non-LEP +2   All Students
Reading       2003   26%           60%           N/A          71%
              2004   34%           58%           N/A          79%
              2005   51%           59%           89%          85%
              2006   64%           66%           94%          91%
Mathematics   2003   27%           28%           N/A          60%
              2004   35%           36%           N/A          67%
              2005   41%           44%           74%          72%
              2006   54%           52%           81%          79%

Grade Seven (percent passing, by subject)

Subject       Year   LEP English   Non-LEP +2   All Students
Reading       2003   21%           N/A          72%
              2004   28%           N/A          75%
              2005   33%           68%          81%
              2006   29%           80%          79%
Mathematics   2003   15%           N/A          51%
              2004   24%           N/A          60%
              2005   25%           60%          64%
              2006   33%           70%          70%
Writing       2003   26%           N/A          76%
              2004   52%           N/A          89%
              2005   52%           89%          88%
              2006   56%           92%          90%

Grade Eight (percent passing, by subject)

Subject          Year   LEP English   Non-LEP +2   All Students
Reading          2003   25%           N/A          77%
                 2004   35%           N/A          83%
                 2005   30%           73%          83%
                 2006   32%           78%          83%
Mathematics      2003   15%           N/A          51%
                 2004   20%           N/A          57%
                 2005   22%           48%          61%
                 2006   29%           58%          67%
Social Studies   2003   34%           N/A          77%
                 2004   42%           N/A          81%
                 2005   50%           80%          85%
                 2006   46%           78%          83%
Science          2003   N/A           N/A          N/A
                 2004   N/A           N/A          N/A
                 2005   N/A           N/A          N/A
                 2006   9%            33%          52%

Grade Nine (percent passing, by subject)

Subject       Year   LEP English   Non-LEP +2   All Students
Reading       2003   14%           N/A          66%
              2004   24%           N/A          76%
              2005   30%           74%          82%
              2006   41%           82%          87%
Mathematics   2003   11%           N/A          44%
              2004   14%           N/A          50%
              2005   18%           41%          56%
              2006   19%           43%          56%

Grade Ten (percent passing, by subject)

Subject                 Year   LEP English   Non-LEP +2   All Students
English Language Arts   2003   14%           N/A          66%
                        2004   19%           N/A          72%
                        2005   20%           52%          67%
                        2006   32%           70%          85%
Mathematics             2003   17%           N/A          48%
                        2004   18%           N/A          52%
                        2005   18%           40%          58%
                        2006   23%           44%          60%
Social Studies          2003   29%           N/A          71%
                        2004   36%           N/A          80%
                        2005   43%           73%          84%
                        2006   41%           70%          83%
Science                 2003   7%            N/A          42%
                        2004   10%           N/A          51%
                        2005   11%           31%          54%
                        2006   13%           34%          60%

Grade Eleven (percent passing, by subject)

Subject                 Year   LEP English   Non-LEP +2   All Students
English Language Arts   2003   20%           N/A          61%
                        2004   32%           N/A          83%
                        2005   34%           74%          87%
                        2006   35%           75%          88%
Mathematics             2003   15%           N/A          44%
                        2004   34%           N/A          67%
                        2005   35%           56%          72%
                        2006   43%           66%          77%
Social Studies          2003   34%           N/A          78%
                        2004   57%           N/A          91%
                        2005   53%           80%          91%
                        2006   64%           87%          94%
Science                 2003   12%           N/A          47%
                        2004   20%           N/A          63%
                        2005   29%           50%          71%
                        2006   30%           55%          75%