To add insult to injury, the NYC Department of Education is expected to release the teacher data reports to the media tomorrow -- with the names of individual teachers attached. These reports are based SOLELY on changes in the test scores of each teacher's students, filtered through a complicated formula that is supposed to adjust for factors outside the teacher's control, which is essentially impossible to do. Moreover, the margins of error are so huge that a teacher rated highly one year is often rated extremely low the next. Sign our petition now, if you haven't yet, urging the papers not to publish these reports; and read the outraged comments of parents, teachers, principals and researchers pointing out how unreliable these reports are as an indication of teacher quality.
Though most of the critiques so far focus on the inherently volatile nature and large margins of error of any such calculation, here in NY State we have a special problem: the state tests themselves have been fatally flawed for many years. There has been rampant test score inflation over the past decade; many of the test questions themselves are amazingly dumb and ambiguous; and there are other severe problems with the scaling and design of these exams that only testing experts fully understand. Though the State Education Department now claims to have solved these problems, few actually believe this to be the case.
As further evidence, see Fred Smith's analysis below. Fred is a retired assessment expert for the NYC Board of Education who has written widely on the fundamental flaws in the state tests. Here, he shows how deep problems remain in their design and execution -- making their results, and the new teacher evaluation system and teacher data reports based upon them, essentially worthless. He goes on to urge parents to boycott the state exams this spring. Please leave a comment about whether you would consider keeping your child out of school for this purpose!
New York State’s Testing Program (NYSTP) has relied on a series of deeply flawed exams given to 1.2 million students a year. This conclusion is supported by comparing English Language Arts (ELA) and Math data from 2006 to 2011 with National Assessment of Educational Progress (NAEP) data, but not in the usual way.