Converting accountability models into modern performance tools
(S.C.) The state’s three existing but separate school rating tools can be combined into a single accountability index that reliably measures school performance, according to a new analysis released this month.
In addition, said researchers from the Regional Educational Laboratory Southeast at Florida State University, data from the new index can be used to identify schools that are high- and low-performing after demographic characteristics are controlled for – in essence, telling the state which sites are ‘beating the odds.’
“At the time of this report’s publication, South Carolina rated school performance using three indices...[which] result in very different performance rankings of schools,” authors Sharon Koon, Yaacov Petscher and John Hughes wrote.
“Although the conceptual distinctions can be explained, communicating results to educators is challenging when ranking of schools is inconsistent across the indices,” they said in explaining the state’s motivation for partnering with REL Southeast for the study.
The comprehensive and complex analysis comes as nearly every state in the nation is at some stage of reconfiguring how it measures and rates how well schools are preparing students for success in college and the workforce.
The move to new school evaluation models – driven by the failure of federal education law and most states’ adoption of new national curriculum standards – has largely followed a more holistic track, measuring student achievement by more than standardized test scores alone.
Using data from the South Carolina Department of Education on public elementary schools (grades 3–5), middle schools (grades 6–8), and high schools (grades 9–12) for 2012-13, REL Southeast researchers concluded that the measures that make up each of the state’s three indices of school performance can be used to create “an overall, reliable alternative index of school performance.”
Researchers ran the data through four confirmatory factor analysis models. The data included student populations disaggregated by race, gender and socioeconomic status, as well as results from the state’s three existing performance indices – Absolute, ESEA and Growth – each of which uses a variety of indicators to assess student achievement and growth.
Confirmatory factor analysis statistical models, the authors explained, “estimate the relationship between observed measures (for example, scores on several different reading tests) and an unobserved construct (for example, school quality) that is believed to underlie the observed measures. The unobserved construct is called a factor or latent variable when it is estimated using two or more observed measures.”
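The latent-variable idea can be sketched with simulated data. The snippet below is an illustration only: it uses scikit-learn’s exploratory factor analysis as a rough stand-in for the report’s confirmatory models, and all numbers are invented rather than drawn from South Carolina’s data.

```python
# Illustrative sketch only: the report used confirmatory factor analysis via
# specialized software; scikit-learn's exploratory FactorAnalysis serves here
# as a rough stand-in. All data are simulated, not South Carolina's.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_schools = 500

# One unobserved "school quality" factor...
quality = rng.normal(size=n_schools)

# ...drives three observed indices (loadings 0.9, 0.8, 0.7) plus index-specific noise.
absolute = 0.9 * quality + rng.normal(scale=0.4, size=n_schools)
esea     = 0.8 * quality + rng.normal(scale=0.5, size=n_schools)
growth   = 0.7 * quality + rng.normal(scale=0.6, size=n_schools)
X = np.column_stack([absolute, esea, growth])

# Fit a one-factor model and inspect the estimated loadings.
fa = FactorAnalysis(n_components=1, random_state=0).fit(X)
print(fa.components_)  # one row of three loadings, near 0.9, 0.8, 0.7 up to sign
```

In this toy setup the three indices correlate only through the shared factor, which is the sense in which a single latent "school performance" construct can underlie several separate indices.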
The analysts then determined that, of the four models, the best match for South Carolina’s new index is the bi-factor model, which “theorizes that absolute performance, ESEA subgroup performance, and growth performance each exist as specific constructs but that an additional, general construct of school performance exists that captures something in common across all of the measures.”
Building on that bi-factor model, the researchers then used latent profile analysis to identify the demographic profiles of South Carolina schools and to determine which profiles include high-performing schools.
The alternative school performance index identified approximately 3 percent of elementary schools, 2 percent of middle schools, and 3 percent of high schools as statistically exceeding their expected performance after the schools’ demographic characteristics were accounted for.
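A rough sense of how such “beating the odds” schools might be flagged can be sketched in Python. This is a hypothetical illustration with simulated data: it approximates latent profile analysis with a Gaussian mixture model, a common stand-in, and flags schools scoring well above their demographic profile’s average; the report’s exact procedure may differ.

```python
# Hypothetical illustration with simulated data; not the report's actual model.
# Latent profile analysis is approximated here by a Gaussian mixture over
# continuous demographic indicators.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n = 600

# Simulated demographic indicators (e.g., share low-income, share minority).
demo = np.vstack([
    rng.normal([0.3, 0.2], 0.05, size=(n // 2, 2)),  # one demographic profile
    rng.normal([0.7, 0.6], 0.05, size=(n // 2, 2)),  # a second profile
])

# Group schools into latent demographic profiles.
gmm = GaussianMixture(n_components=2, random_state=0).fit(demo)
profile = gmm.predict(demo)

# Simulated performance index: lower expected performance for the second profile.
perf = rng.normal(70, 5, size=n) - 10 * (demo[:, 0] > 0.5)

# "Beating the odds": schools scoring at least two standard deviations above
# the mean performance of their own demographic profile.
beating = np.zeros(n, dtype=bool)
for p in np.unique(profile):
    mask = profile == p
    mu, sd = perf[mask].mean(), perf[mask].std()
    beating[mask] = perf[mask] > mu + 2 * sd

print(f"{beating.mean():.1%} of simulated schools flagged")
```

With a two-standard-deviation cutoff, only a small single-digit percentage of schools clears the bar, which is consistent in spirit with the 2 to 3 percent figures the researchers reported.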
While South Carolina is developing its new school performance index based on requirements in a federal waiver from provisions of the Elementary and Secondary Education Act, other states have moved ahead in the process without the U.S. Department of Education’s blessing.
One of those, California, nonetheless appears well positioned to serve as a national model, as a rewrite of the ESEA (or No Child Left Behind Act) pending in Congress would give states broad flexibility in creating their own school accountability systems.
The state’s board of education is set next month to consider a new accountability framework that integrates a number of existing academic and fiscal reports with recently adopted state educational priorities to help explain how well schools are meeting new educational benchmarks.
The School Accountability Report Card, for instance, provides a wide range of performance data, including academic scores, teacher certification and facility status. Annual fiscal audits and charter petitions are also being considered as part of the new system.
Maryland’s School Progress Index measures schools’ progress and holds them accountable for how much or how little they improve. Over a six-year span, schools are expected to cut in half the number of students not achieving proficiency in every subgroup, with incremental improvement targets set annually.
The index evaluates schools based on student assessment scores, student improvement and gap reduction between subgroups for elementary and middle schools; and test scores, gap reduction, and college and career readiness for high schools.
Schools are grouped into categories labeling them as in need of support, intervention or recognition.
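Maryland’s gap-halving expectation reduces to simple arithmetic. The figures below are hypothetical, chosen only to show how equal annual increments reach the six-year goal.

```python
# Hypothetical worked example of Maryland-style targets (numbers are made up):
# a subgroup starts with 40% of students below proficiency; the six-year goal
# is to cut that in half, with equal annual increments.
start_pct = 40.0
goal_pct = start_pct / 2              # 20.0 percent after six years
annual_cut = (start_pct - goal_pct) / 6

targets = [round(start_pct - annual_cut * year, 2) for year in range(1, 7)]
print(targets)  # [36.67, 33.33, 30.0, 26.67, 23.33, 20.0]
```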
In Ohio, schools are assessed on student achievement, the progress students make, graduation rates, narrowed achievement gaps between subgroups, literacy proficiency from kindergarten through third grade and, as of this year, college and career readiness.
Schools will begin receiving overall letter grades in 2016, but accountability will not be tied to those scores until 2018.
Montana, among the states without a federal waiver from No Child Left Behind accountability requirements, continues to use Adequate Yearly Progress scores. Scores are not yet available because students took a field test of a new state assessment in 2014; the test was officially administered for the first time this past spring.