There is now abundant evidence that the grades the DOE awards schools – its so-called “progress reports” – are meaningless and driven by chance, like a throw of the dice, because they depend predominantly on whether a school’s test scores last year were above or below those of the year before.
Sixty percent of the grade depends on so-called “progress” – annual gains or losses in test scores – 25% on the test scores themselves, and only 15% on survey results. As a result, this year many schools jumped from failing grades to grades of “A,” and, just as occurred last year, some highly regarded schools received “F”s.
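A rough sketch makes clear how lopsided this weighting is. The component names and 0–100 scales below are illustrative assumptions, not the DOE’s actual formula; only the 60/25/15 weights come from the description above:

```python
# Hypothetical sketch of the weighting scheme described above:
# 60% "progress" (year-over-year score change), 25% performance
# (the scores themselves), 15% survey results.
# Component scales here are assumed for illustration only.

def composite_score(progress, performance, survey):
    """Combine three 0-100 component scores using the stated weights."""
    return 0.60 * progress + 0.25 * performance + 0.15 * survey

# Because "progress" carries 60% of the weight, a swing in the
# year-over-year change dominates everything else:
steady_school = composite_score(progress=30, performance=90, survey=90)  # 54.0
lucky_school  = composite_score(progress=90, performance=60, survey=60)  # 78.0
print(steady_school, lucky_school)
```

In this toy example, a school with strong scores and survey results but a weak one-year “progress” number ends up well below a weaker school that happened to post a big one-year gain.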
Last year, in a Daily News op-ed entitled “Why parents & teachers should reject the new school grades,” I pointed out that research shows that 30-80% of the annual gains or losses in test scores are random – and sure enough, that is largely what the grades have turned out to reflect. Last week, Daniel Koretz, professor at Harvard and a national expert on testing, wrote:
“My advice to New Yorkers is to … ignore the letter grades and to push for improvements to the evaluation system… It does not make sense for parents to choose schools, or for policymakers to praise or berate schools, for a rating that is so strongly influenced by error.”
And yet, according to the DOE, eighteen schools that received a “D” or an “F” last year have new principals this year – showing how consequential decisions are being made on the basis of these inherently unreliable measures.
As Ellen Foote, the principal at IS 89 – a school that received a “D” last year despite being the only NYC middle school selected to receive an award for its achievements from the federal government – points out:
“Last year’s grades often reversed the state’s opinion of schools… She knew of one District 2 middle school that got a B last year but was on the state’s list of schools that need improvement. Under No Child Left Behind, many students in that school transferred to I.S. 89, a Blue Ribbon school. That meant the students transferred from a B school to a D school. This year, both schools received an A.”
Indeed, as Eduwonkette has written, a larger proportion of the NYC schools receiving an “F” than of those receiving an “A” are in good standing according to federal and state accountability standards.
“How do you reconcile those discrepancies?” Foote said. “How do parents make sense of that? It just is all over the place — it’s such a disservice to schools and to parents…. It’s so simplistic at best and confusing and probably invalid at the worst, at a very high cost in terms of money, resources and morale.”
The smaller the school – and thus the number of students tested – the more unreliable its annual gains or losses; and as this post at Gotham Schools reveals, the smaller the school, the more likely it was to receive an extremely high or low grade this year.
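This is a basic statistical phenomenon, and a simple simulation illustrates it. The numbers below are invented for illustration, not drawn from DOE data: even if every school were identical and every student’s year-over-year score change were pure noise, the smallest schools would still dominate the extremes, because an average taken over fewer students retains more randomness:

```python
# Illustrative simulation (invented numbers, not DOE data): when each
# student's year-over-year score change is pure noise around zero,
# small schools still post the most extreme average gains or losses.
import random

random.seed(1)

def mean_gain(n_students):
    """Average score change for a school of n students, all noise."""
    return sum(random.gauss(0, 10) for _ in range(n_students)) / n_students

def extreme_rate(n_students, trials=2000, cutoff=2.0):
    """Fraction of simulated schools whose average change exceeds the cutoff."""
    return sum(abs(mean_gain(n_students)) > cutoff for _ in range(trials)) / trials

small = extreme_rate(n_students=30)    # a small school's testing cohort
large = extreme_rate(n_students=500)   # a large school's testing cohort
print(small, large)  # the small schools land in the extremes far more often
```

Under these assumptions the small schools exceed the cutoff many times more often than the large ones, even though, by construction, none of the “gains” or “losses” reflect anything real.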
In response to this sort of criticism, Jim Liebman, the head of the DOE accountability office, said last spring that this year’s progress score would be based on two years of data instead of one – which would have improved its reliability. Yet for some reason, he went back on his word.
Why? To my knowledge, the DOE has never explained.
Rather than making things better, Liebman made the grades even more unreliable by increasing the weight of the “progress” score from 55% of a school’s grade last year to 60% this year. Why? Again, this has not been explained.
Was this to take advantage of last year’s anomalously large gains in scores? Were Klein and Liebman themselves gaming the system to ensure that 80% of schools would receive “A”s or “B”s – so that they could claim credit for these illusory improvements? Who’s to know?
The idea that 80% of NYC schools could merit an “A” or “B” is absurd on its face, given that our school system has among the highest class sizes in the nation and among the lowest graduation rates. This is grade inflation of the highest order, apparently crafted so the administration can pat itself on the back.
I wish the reporters who uncritically covered the $20 million teacher bonus program on Friday – a program that also relies on improvement in a school’s unreliable progress score – would now dig a little deeper, and examine whether these cash rewards were, in effect, handed out at random as well.