Wednesday, September 28, 2011

Why the school progress reports and NYC education reporters deserve a big fat “F”


Gary Rubinstein, a math teacher at Stuyvesant, totally outclasses the numerous education reporters in this city in his analysis of the recent school grades.  

He shows in the graph at right, and discusses on his blog, that there is little or no rank-order correlation between schools' scores on last year's progress reports and this year's.
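
To make "little or no correlation" concrete, here is a minimal Python sketch of the rank-order comparison. The scores below are invented placeholders; Rubinstein ran this kind of comparison on the actual DOE data.

```python
# Minimal sketch: Spearman (rank-order) correlation between two years of
# school progress-report scores. All numbers are invented placeholders.
from scipy.stats import spearmanr

scores_2010 = [72.4, 55.1, 88.0, 41.3, 63.7, 95.2, 30.8, 67.5]  # hypothetical
scores_2011 = [48.9, 70.2, 52.6, 77.1, 39.4, 61.0, 84.3, 45.8]  # hypothetical

rho, p = spearmanr(scores_2010, scores_2011)
print(f"Spearman rank correlation: {rho:.2f} (p = {p:.2f})")
# A rho near 0 means a school's rank this year tells you almost nothing
# about its rank next year -- the pattern in Rubinstein's scatter plot.
```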

(Incidentally, in a subsequent post, Rubinstein also points out that NYC charters are twice as likely as non-charters to receive "Fs" on their progress reports.)

Unfortunately the mainstream media continue to repeat without dispute Suransky’s claim that the progress reports were much more “stable” this year, even though 60% of schools changed grades.    

Not one reporter, to my knowledge anyway, has bothered to point out that experts have shown that 32-80% of the annual gains or losses in scores at the school level are essentially random – and yet 60% of a school's grade is based upon these annual gains or losses.

See the Daily News op-ed I wrote in 2007, “Why parents and teachers should reject the new grades,” in which I offered even more criticisms of the progress reports, including their inherent instability:

 Researchers have found that 32 to 80% of the annual fluctuations in a typical school’s scores are random or due to one-time factors alone, unrelated to the amount of learning taking place. Thus, given the formula used by the Department of Education, a school’s grade may be based more on chance than anything else.
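
Here is a toy simulation of the consequence (assumed noise share and grade cutoffs, not the DOE's actual formula): when most of the year-to-year movement in scores is random, letter grades churn even if no school changes at all.

```python
# Toy simulation, not the DOE's actual formula: each school's annual
# "progress" score is a small stable component plus a large random one,
# in the spirit of the 32-80% noise estimates cited above.
import random

random.seed(1)
n_schools = 1000

true_quality = [random.gauss(0, 1) for _ in range(n_schools)]

def year_scores():
    # Assumed split: 40% stable signal, 60% year-to-year noise.
    return [0.4 * q + 0.6 * random.gauss(0, 1) for q in true_quality]

def letter_grades(scores):
    # Hypothetical curve: top 25% A, next 35% B, next 30% C, bottom 10% D/F.
    ranked = sorted(scores)
    cutoffs = [("A", ranked[int(0.75 * n_schools)]),
               ("B", ranked[int(0.40 * n_schools)]),
               ("C", ranked[int(0.10 * n_schools)])]
    def grade(s):
        for letter, cut in cutoffs:
            if s >= cut:
                return letter
        return "D/F"
    return [grade(s) for s in scores]

g1, g2 = letter_grades(year_scores()), letter_grades(year_scores())
changed = sum(a != b for a, b in zip(g1, g2))
print(f"{100 * changed / n_schools:.0f}% of schools changed grades")
# Even though no school's underlying quality changed at all, a large
# share of schools switch letter grades purely by chance -- in the same
# ballpark as the 60% the DOE reported.
```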

Yet here is one typical headline from last week: “School report cards stabilize after years of unpredictability.” And here is the NY Times account:

“We have a really high level of stability this year, which is a good thing,” said Shael Polakow-Suransky, chief academic officer for the city’s Department of Education…. “There is movement and that’s good because we are measuring one year of data and we expect schools will go up and down, but we don’t want to see movement caused by something that’s external to the kids,” Mr. Polakow-Suransky said, referring to changes in the state exams that caused incredible increases and then a drop-off in schools’ grades.

Of course, if one year’s movement up and down is primarily random, that by definition is “external.” 

Nor have any education reporters bothered to report that Jim Liebman, who designed the system, testified to the City Council when the grades were introduced that the DOE would improve the reliability of the system by incorporating three years’ worth of test score data instead, which both he and Suransky have since refused to do.

Indeed, as recounted on p. 121 of Beth Fertig’s book, Why can’t U teach me 2 read, Liebman is quoted responding this way to Michael Markowitz’s observation that the grading system was designed to produce essentially random results:

“‘There’s a lot I actually agree with,’ he said in a concession to his opponent… He then proceeded to explain how the system would eventually include three years’ worth of data on every school so the risk of big fluctuations from one year to the next wouldn’t be such a problem.”
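
The fix Liebman described is basic statistics: averaging n years of scores shrinks the random component's standard deviation by a factor of √n, so a three-year average is about 1.7 times less noisy than a single year. A minimal Python sketch of the effect (assumed noise level, not the DOE's formula):

```python
# Back-of-the-envelope check on Liebman's point (assumed numbers, not the
# DOE formula): averaging n noisy annual scores shrinks the random
# component's standard deviation by a factor of sqrt(n).
import math
import random

random.seed(2)
noise_sd = 1.0      # assumed SD of the random part of one year's score
trials = 50_000

def noise_sd_of_average(n_years):
    # Empirical SD of an n-year average of pure noise.
    draws = [sum(random.gauss(0, noise_sd) for _ in range(n_years)) / n_years
             for _ in range(trials)]
    mean = sum(draws) / trials
    return math.sqrt(sum((d - mean) ** 2 for d in draws) / trials)

print(f"1-year noise SD: {noise_sd_of_average(1):.2f}")
print(f"3-year noise SD: {noise_sd_of_average(3):.2f} "
      f"(theory: {noise_sd / math.sqrt(3):.2f})")
# Three years of data cuts the random swing by about 1.7x -- the
# reliability improvement Liebman promised the City Council.
```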

And yet no one, including Fertig, has mentioned this discrepancy, or the DOE’s lack of any rationale for intentionally allowing a single, essentially unreliable grade to determine a school’s future, with 10% of schools now guaranteed a failing grade and thus liable to be closed, based largely on chance.


3 comments:

Anonymous said...

Leonie,

Thanks for the trip down memory lane, aka the DOE rabbit hole.

For another piece of edutainment nostalgia on the topic, my favorite, from Sept 2008:
"Could a Monkey Do a Better Job of Predicting Which Schools Show Student Progress in English Skills than the New York City Department of Education?"

http://blogs.edweek.org/edweek/eduwonkette/2008/09/could_a_monkey_do_a_better_job.html

Note that DOE has jiggered the "progress" metric over the years, most recently using a "percentile growth" model. Same result: random results.

At this point, we need to recognize it's on purpose, as there have been so many technical dismantlings of School Progress Reports by wonkier wonks than moi.

-- Michael Markowitz

Anonymous said...

Perhaps by "more stable," DOE simply meant: "Whereas in an election year (Fall 2009), we gave out 98% A's and B's... and last year we throttled back and gave out 60% A's abd B's.... this year we again are giving out 60% A's and B's. Therefore, the world is not only flat, it is stable."

That, for the bottom 40%, C's got throttled back from 36% to 30%, with D's and F's rising from 4% to 10%, simply means the world may be flat, but the edges are sharper.

And the odds of being on the precipice two years running are the product of two still-random variables. Two randoms don't make a right, just wrong luck.

-- Michael Markowitz

Anonymous said...

Bloomberg detractors are invariably close relatives of schoolteachers, except the fringe who still thinks he's Uncle Martian.
