Earlier this month, the California Department of Education released its new rating system for schools, called the California School Dashboard. While there is a lot to be said about the positives and negatives of this system, I set out to correct the most glaring omission – that Charter School Networks were not evaluated as whole districts.
So how well did each Charter Network do?
I crunched the numbers and here are the overall ratings for each of the major charter networks, as well as LAUSD’s scores for comparison.
English Language Arts
Based on this metric, which is the official one the state of California will be using to evaluate schools, most charter networks score similarly to LAUSD. In fact, several charter networks performed below LAUSD.
Like I said last time, this is not entirely a fair comparison. Green Dot, for example, operates almost exclusively in low-income, high-need areas, and has taken over control of some of the district’s most struggling schools. And while LAUSD definitely has a majority of its schools in high-need areas, it also has the benefit of several high-performing areas that bring up the overall LAUSD score.
But that buoying effect also helps several of the charter networks. For example, Celerity Troika performs significantly higher than the other schools in the network. Without Troika, Celerity’s score in English would be a Yellow, not a Green. Without the PUC schools in the Valley, the PUC schools in Northeast LA get a score of Orange in both math and English.
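To make the buoying effect concrete, here is a minimal sketch. It assumes a network's score is an enrollment-weighted average of its schools' "distance from standard" points, with invented school names, enrollments, scores, and a hypothetical Green/Yellow cutoff – the actual Dashboard methodology combines status and change differently, so treat this purely as an illustration of how one high-performing school can pull a network across a color line.

```python
# Hypothetical illustration of the "buoying" effect: one strong school
# can pull a network's weighted average across a color cutoff.
# All numbers and the cutoff below are invented for illustration.

def network_score(schools):
    """Enrollment-weighted average of school scores."""
    total = sum(enrollment for _, enrollment, _ in schools)
    return sum(score * enrollment for _, enrollment, score in schools) / total

# (name, enrollment, ELA score as points above/below standard)
celerity_like = [
    ("School A", 400, -25.0),
    ("School B", 350, -20.0),
    ("Troika-like school", 500, 15.0),  # one much higher-performing school
]

GREEN_CUTOFF = -10.0  # hypothetical: at or above this counts as Green

with_high = network_score(celerity_like)
without_high = network_score(celerity_like[:2])

print(f"with:    {with_high:.1f} -> {'Green' if with_high >= GREEN_CUTOFF else 'Yellow'}")
print(f"without: {without_high:.1f} -> {'Green' if without_high >= GREEN_CUTOFF else 'Yellow'}")
```

With the strong school included, the weighted average lands above the invented cutoff (Green); drop that one school and the same network falls to Yellow, which is the pattern described above for Celerity and PUC.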
On the other hand, to achieve a rating of “significant” improvement or decline, a charter network really had to move as a whole, not just be buoyed by one group. Aspire Public Schools improved dramatically year over year. With the exception of Antonia Maria Lugo Academy, which was already higher-performing than the rest of the schools, all of the Aspire schools made tremendous gains, which gives them an “improved significantly” rating in both English and Math.
I recognize that this new data has a lot of problems. It doesn’t actually measure student growth – it really just measures a change in proficiency, and that is an extremely weak way to evaluate a school. But what I do like about this data, and all data for that matter, is that there are clearly some very interesting stories that can be seen through the haze of the numbers. Maybe we can learn from the stories that emerge.