Charter schools have featured prominently in this blog, with both supporters and detractors weighing in vigorously. Charter school critics frequently argue that, overall, charter schools do not outperform, and in some cases underperform, traditional public schools. Charter supporters counter that some charter schools have achieved outstanding results with some of the kids least well-served by our public schools.
Wherever you stand in this debate, you should take a look at a major study of charter management organizations (CMOs) conducted by Mathematica Policy Research and the University of Washington's Center on Reinventing Public Education, which Education Week has also covered.
It’s well worth pulling up the full report, which is filled with interesting data. First, to anticipate some possible objections to the findings, here’s a description of the methodology:
“To estimate the effects of CMOs on student achievement, we examined the gains in test scores of individual students from before they entered CMO schools until up to three years after entry, as compared to gains of a matched comparison group of students who resembled the CMO students in terms of baseline test scores and other key characteristics. Students who transferred out of CMO schools after the first year enrolled were kept in the CMO “treatment” group for the analysis. This ensures that impact estimates are not artificially inflated by the departure of low-scoring students. It also means that our impact estimates are conservative, in the sense that students who remain enrolled in CMOs for more than a year are likely to experience larger impacts than what we report here.”
And here are the basic findings:
“Test score impact estimates for the average CMO after two to three years in middle school are positive in all four subjects, but they are not statistically significant. We estimated impacts of the average CMO in reading, math, science and social studies, one to three years after a student’s initial enrollment in the CMO school (though science and social studies test scores were available only for a subset of CMOs and years). Average CMO impacts are positive in all cases but one (one-year reading impact), but they are not statistically significant (at the .05 level), despite reaching a non-trivial magnitude in math by the third year after enrollment.”
These findings seem to contradict the widespread claim by charter opponents that students fare worse in schools run by the growing number of charter management organizations. But the overall results are not what’s most interesting in the report.
“The overall average impacts mask a great deal of variation among CMOs. Two years after students enroll in the CMOs covered by the impact analysis, they experience significantly positive math impacts in half of these CMOs (11 of 22), while students in about one-third of the CMOs (7 of 22) do significantly worse in math. Similarly, students in nearly half of the CMOs (10 of 22) experience significantly positive impacts in reading, while students in about a quarter of CMOs (6 of 22) experience reading impacts that are significantly negative. Table 3 shows that half of the CMOs (11 of 22) have significantly positive impacts in math or reading and nine have significantly negative impacts in one or both subjects; 10 of the 22 CMOs have significantly positive impacts in both subjects while only four have significantly negative impacts in both subjects.”
What this suggests to me is that neither charter supporters nor detractors should be so quick to generalize. Instead, why don’t we pay more attention to why SOME charters achieve such significant results with similar groups of students? (One other finding: CMO schools serve a HIGHER proportion of minority and poor kids than their district’s average.)
The report has some very interesting findings that shed light on this issue. I’ll post on this tomorrow. Meanwhile, here’s a link to the full report.