The CDE recently released the first batch of growth scores for schools and districts across California. Growth scores measure the performance of schools and districts by calculating how much their students’ SBAC scores differ from what was expected given their prior SBAC scores. By design, the average growth score across all students should be zero (or very close to it). Growth scores are given in the same units as SBAC scores. A growth score of -5 doesn’t mean that students scored five points worse than they did the year before; it means that their SBAC score grew by five points less than expected. Perhaps they scored 2500 last year, were expected to score 2523 this year, but scored only 2518.
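The definition boils down to a single subtraction. Here’s a minimal sketch using the hypothetical student from the paragraph above (these numbers are illustrative, not real CDE data):

```python
def growth_score(actual: float, expected: float) -> float:
    """Growth = this year's actual SBAC score minus the model's expected score."""
    return actual - expected

prior_year = 2500          # last year's SBAC score
expected_this_year = 2523  # what the model predicts given the prior score
actual_this_year = 2518    # what the student actually scored

print(growth_score(actual_this_year, expected_this_year))  # -5
```

Note that the prior-year score only enters indirectly, through the expected score; the growth score itself is just actual minus expected.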
A previous post went into more detail about how growth scores are calculated and what the limitations are of the CDE’s chosen algorithm. The key points to remember come from these two charts, which I’ve lifted from that previous post:
Observe that students who stay in the 70th percentile gain about 150 SBAC points between 3rd grade and 8th grade while students who stay in the 20th percentile gain only about 130. In general, students in higher percentiles tend to increase their SBAC scores by more than students in lower percentiles. The gap between them gets wider over time.
Observe here that the pattern is even more extreme for Math: students in higher percentiles gain a lot more points than students in lower percentiles.
The growth model is based on a linear regression whose only independent variables are a student’s ELA and Math scores in the previous year. In particular, the student’s grade is not a variable in the model. A student who scores 2500 in 3rd grade will be predicted to grow as much as a student who scores 2500 in 7th grade even though the 3rd grade student will be in a much higher percentile. The charts showed that students in higher percentiles gained more SBAC points than students in lower percentiles. Since they both receive the same prediction, the higher percentile student will tend to exceed that prediction and thus get a growth score greater than zero while the lower percentile student will tend to get a growth score less than zero.
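To make the point concrete, here is a sketch of a prediction function with the shape the CDE’s model is described as having: a linear function of prior-year ELA and Math scores only. The coefficients are made up for illustration; the important thing is that grade level never appears, so two students with identical prior scores get identical predictions regardless of grade.

```python
# Hypothetical fitted coefficients -- illustrative only, not the CDE's actual values.
B0, B_ELA, B_MATH = 10.0, 0.6, 0.4

def predicted_score(prior_ela: float, prior_math: float) -> float:
    """Predict this year's score from last year's ELA and Math scores.
    Grade level is deliberately absent, matching the model described above."""
    return B0 + B_ELA * prior_ela + B_MATH * prior_math

# A 3rd grader and a 7th grader with the same prior scores:
third_grader = predicted_score(2500, 2480)
seventh_grader = predicted_score(2500, 2480)
assert third_grader == seventh_grader  # grade never enters the model
```

Since the 3rd grader at 2500 sits in a much higher percentile than the 7th grader at 2500, and higher-percentile students tend to gain more points, the 3rd grader will tend to beat this shared prediction while the 7th grader falls short of it.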
A district whose students start and finish the year in the 25th percentile has done just as good a job, no better and no worse, than a district whose students start and finish the year in the 75th percentile. But, due to the way the growth scores are calculated, a district whose students stay in the 25th percentile will tend to have a lower growth score than a district whose students stay in the 75th percentile.
For this reason, when we look at growth scores, we are always going to look at them in the context of the prior year’s SBAC scores, specifically the average Distance from Standard¹ (DFS) of the students. This will enable us to see if a district’s performance is truly outstanding. Note that the Distance from Standard and the growth score are in the same units.
ELA Growth Scores
The chart below shows the ELA growth scores for the 114 districts that had at least 4,000 students with growth scores.
The diagonal line is the best-fit line from a linear regression against each district’s average distance from standard in 2024. The R-squared is 0.36, indicating that, while there’s clearly a relationship, there’s plenty of scope for districts with similar prior achievement to achieve very different growth scores. Hayward and West Contra Costa both had weak SBAC scores in 2024 (around 60 points below standard), but Hayward’s growth score of 0 was a lot better than West Contra Costa’s -11. Los Angeles and Compton both had 2024 SBAC scores 25-28 points below standard, but Compton’s growth score of +12 was much better than Los Angeles’s still-creditable +1. In fact, Compton’s growth score was better than that of any other district in the sample. San Francisco, meanwhile, had a growth score of -1, which is meh: a bit below what would be expected, but not egregiously so.
Math Growth Scores
The analogous chart for Math growth scores differs in two ways:

- the relationship between the prior-year SBAC scores and the current-year growth score is much stronger (R-squared = 0.81)
- the range of growth score values is significantly wider, with values of +20 or higher appearing
Both are a consequence of the phenomenon we saw in the first charts, namely that the gap between the average score gain in higher and lower percentiles is much greater for Math than ELA.
Nevertheless, Compton still excels. Its growth score of +13 is lower than that of districts like Cupertino, Irvine, and San Ramon Valley (all at +20 or higher) but, given that its students started the year 39 points below standard, it surpassed expectations by more than any of them.
So, which is the best performing district?
There are nearly 700 districts with growth scores and only the 114 largest are shown on the charts above. Those 114 districts represent about 63% of all students, but there are districts too small to show on the charts which had even higher growth scores than Compton in both ELA and Math. The largest of these was Orinda, in Contra Costa County, which had growth scores of +13 (ELA) and +16 (Math). But Orinda had only 1,350 students, far fewer than Compton’s 5,900, and its prior achievement scores were 88 points above standard (ELA) and 72 points above standard (Math), so its high growth scores are less impressive. The highest growth scores of all belong to Scotia Union Elementary in Humboldt County (+24 in ELA; +42 in Math), but Scotia Union had only 104 students with growth scores, fewer than many individual schools. Similarly, the absolute worst growth scores belong to Geyserville Unified in Sonoma (-17 in ELA; -44 in Math), but Geyserville had only 73 students. The worst-performing district of any size was Barstow Unified in San Bernardino, whose 1,900 students had growth scores of -32 in ELA and -24 in Math.
Impressive as its scores are, it is far too early to blithely declare that Compton is the best school district in California. During the development of the growth scores model, the CDE published what the growth scores would have been using the last pre-pandemic SBAC data. At that time, Compton’s growth scores were the equivalent of -1 in ELA and +3 in Math. Has Compton improved significantly in the intervening five years, or are its high scores just a statistical artifact?
SFUSD’s preferred benchmark has long been Long Beach Unified. Years ago, when I first started analyzing student achievement data, I identified Clovis Unified in Fresno and ABC Unified in Los Angeles as districts that seemed to do particularly well after adjusting for their demographics. How are these three rated by the growth scores method? Long Beach scored +2 in ELA and -3 in Math; ABC scored +5 in ELA and +3 in Math; Clovis scored +9 in ELA and +4 in Math. Good scores, but not as good as Compton’s. In the pre-pandemic data, though, those districts were all stronger than Compton: Long Beach was +5 and +2, ABC was +6 and +6, and Clovis was +9 and +3. Even San Francisco was +5 and +2.
It will take multiple years of data to know whether Compton’s high scores are an indicator of true excellence or just a blip.
¹ Example: the lowest score required to meet the standard for 5th grade ELA is 2502. If the average 5th grader in the district has a score of 2510, that’s 8 points above the standard. Calculate the distance from standard for each of the grades from 3-7 and average them to get the school or district’s DFS. Grades 3-7 are used because growth scores are calculated only for students in grades 4-8. Instead of DFS, I could have used the percentage who met or exceeded the standard, because the two numbers have a 99% correlation, but it seemed better to use DFS because it’s in the same units as the growth score.
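The footnote’s calculation can be sketched in a few lines. Only the 5th-grade numbers (threshold 2502, district average 2510) come from the example above; the other grades’ thresholds and averages are hypothetical stand-ins:

```python
def distance_from_standard(avg_score: float, threshold: float) -> float:
    """A district's DFS for one grade: average score minus the 'met standard' cutoff."""
    return avg_score - threshold

# (grade, district average score, lowest score that meets the standard)
grades = [
    (3, 2445, 2432),  # hypothetical
    (4, 2478, 2473),  # hypothetical
    (5, 2510, 2502),  # the 5th-grade ELA example from the text: DFS = +8
    (6, 2520, 2531),  # hypothetical (below standard)
    (7, 2540, 2552),  # hypothetical (below standard)
]

per_grade = [distance_from_standard(avg, thr) for _, avg, thr in grades]
district_dfs = sum(per_grade) / len(per_grade)
print(per_grade)     # [13, 5, 8, -11, -12]
print(district_dfs)  # 0.6
```

A real calculation would presumably weight grades by enrollment rather than taking a simple average; this sketch just shows the shape of the arithmetic.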


