A recent Daily News editorial regurgitated its usual “retain Mayoral control” mantra (under the unintentionally ironic moniker, Save Our Schools) by citing the “perfectly visible” progress shown by this year’s DOE school report cards.
Leave aside for the moment the patent absurdity of buying into a measurement system that allows the agency in question to measure itself while incentivizing its employees in every conceivable way to cheat and otherwise bend rules to make their numbers. Leave aside as well the dubious validity of a rating system in which 80% of the schools being measured are graded B or better, a remarkable year-to-year increase of some 18 percentage points. Rather, let's just look at the numbers behind how those grades were established.
In 2006-07, the DOE established conversion scales at each level (elementary, middle, and high school) under which each school's raw score on a roughly 100-point scale would be converted to a letter grade. This year, the DOE has apparently reset these scales significantly downward for elementary and high schools, enough to account for the great majority of the gains at these two levels. In other words, when you control all the data and define your own success, better results are easy to come by: simply adjust your scales downward and, voilà, you've got them!
Here are the elementary school scale conversion cutoff scores for each letter grade for the last two years (the cutoffs for F scores are irrelevant, since they simply reflect the lowest-scoring school in each category).
A – 64.0 (2006/07), 59.6 (2007/08)
B – 49.9 (2006/07), 45.8 (2007/08)
C – 38.8 (2006/07), 32.6 (2007/08)
D – 30.9 (2006/07), 28.4 (2007/08)
Note that it takes 4–6 fewer raw-score points this year for a school to achieve an A, B, or C. Last year, the DOE rated 70.5% of elementary schools as A or B level. This year, it claimed improvement to 79.8% of schools achieving A or B scores. However, using the same conversion scale as last year, only 71.5% of elementary schools would merit an A or B, almost no change from last year despite the enormous pressure placed on school principals and teachers.
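The conversion itself is nothing more than a threshold lookup. A minimal sketch in Python, using the elementary school cutoffs listed above (the sample raw score of 47 is hypothetical, chosen only to show how the same school flips from a C to a B when the scale drops):

```python
# Elementary school letter-grade cutoffs, from the DOE figures above
old_scale = {"A": 64.0, "B": 49.9, "C": 38.8, "D": 30.9}  # 2006/07
new_scale = {"A": 59.6, "B": 45.8, "C": 32.6, "D": 28.4}  # 2007/08

def grade(raw_score, scale):
    """Return the letter grade for a raw score under a given cutoff scale."""
    for letter in "ABCD":
        if raw_score >= scale[letter]:
            return letter
    return "F"

# A hypothetical school scoring 47 raw points (illustrative, not real data):
print(grade(47.0, old_scale))  # "C" under last year's scale
print(grade(47.0, new_scale))  # "B" under this year's scale
```

Nothing about the school changed; only the yardstick did.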
Almost the same results can be found at the high school level, beginning with the slightly less dramatic reductions in scale conversion cutoff scores:
A – 67.6 (2006/07), 64.2 (2007/08)
B – 48.8 (2006/07), 43.5 (2007/08)
C – 35.1 (2006/07), 34.3 (2007/08)
D – 28.9 (2006/07), 29.7 (2007/08)
Last year, the DOE rated 65.5% of high schools as A or B level. This year, that percentage increased to a mind-bending 83.1%. However, if the DOE had applied the same letter grade conversion scale as last year’s, only 70.8% of high schools would have received A or B grades, and 6% would have received D or F grades. Roughly two-thirds of the percentage point improvement in high schools achieving A or B grades came from lowering the cutoff scores!
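That "roughly two-thirds" figure follows directly from the percentages above. A quick check in Python (the raw score of 45 is hypothetical, included only to show a grade flipping under the new high school scale):

```python
# High school letter-grade cutoffs, from the DOE figures above
old_scale = {"A": 67.6, "B": 48.8, "C": 35.1, "D": 28.9}  # 2006/07
new_scale = {"A": 64.2, "B": 43.5, "C": 34.3, "D": 29.7}  # 2007/08

def grade(raw_score, scale):
    """Return the letter grade for a raw score under a given cutoff scale."""
    for letter in "ABCD":
        if raw_score >= scale[letter]:
            return letter
    return "F"

# A hypothetical school scoring 45 raw points (illustrative, not real data):
print(grade(45.0, old_scale))  # "C" under last year's scale
print(grade(45.0, new_scale))  # "B" under this year's scale

# Decomposing the reported gain in A/B-rated high schools:
reported_gain = 83.1 - 65.5    # 17.6 points: this year's claim vs. last year
gain_old_scale = 70.8 - 65.5   # 5.3 points: the gain under last year's cutoffs
from_rescaling = 83.1 - 70.8   # 12.3 points: attributable to lowered cutoffs
print(from_rescaling / reported_gain)  # ~0.70, i.e. roughly two-thirds
```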
The DOE had announced plans earlier this year to modify its school rating system, so year-to-year comparisons are not necessarily straightforward. Nevertheless, for reasons that have not yet been made clear, the score cutoff scales for elementary and high schools have been adjusted significantly downward while those for middle schools remained nearly the same. Perhaps the rationale is mathematically legitimate, but the action is so suspect and the effect so pronounced that further explanation is warranted. Otherwise, lowering the grading bar has the appearance of being little more than an effort to influence the publicly reportable "success" of Mayoral control and the Chancellor's accountability regime.
Given the DOE's tight control over data, its lack of accountability to any higher review body, and its general opacity to outside view, how is the public to know whether the reported increase in A- and B-rated schools is real or simply illusory? Assuming, that is, that you accept the legitimacy of the grades themselves, an already dubious proposition.