Wednesday, November 25, 2009
Yet Another Nail in the NYS Regents Exam Coffin
On November 19, the Office of the NY State Comptroller released a report of its findings from an audit of local district scoring of high school Regents exams. The results, while not surprising to those closest to high school education in NY State, were nevertheless stunning in their confirmation of just how badly skewed the entire Regents examination system has become. Equally startling was the mainstream press’s utter failure to note the findings, virtually all of which were agreed with by the Regents themselves. (Note: only Maura Walz at the Gotham Schools website seems to have reported on this so far.)
The audit team randomly selected 200 NY State high schools and, using a team of experienced high school teachers, rescored nine non-multiple-choice questions on one 2005 subject area exam (identified only as Exam A) and thirteen non-multiple-choice questions on another 2005 subject area exam (Exam B). In total, the Review Team rescored almost 2,400 Exam A papers and over 3,200 Exam B papers, looking only at questions where local school exam graders had discretion over how many points to award their students’ answers. Their findings in summary:
“…a significant tendency for local school districts to award full credit on questions requiring scorer judgment even when the exam answers were vague, incomplete, inaccurate, or insufficiently detailed.”
That sentence euphemistically recaps the much more disturbing details of their findings:
1. For Exam B, the locally reported total scores on the thirteen questions were higher than the Review Team’s re-scored total on 80% of the examination papers reviewed (totals were the same on 15% of the papers).
2. For Exam A, the locally reported total scores on the nine questions were higher than the Review Team’s re-scored total on 58% of the examination papers reviewed (totals were the same on 32% of the papers).
3. For Exam B, the locally reported total scores were at least three raw score points higher (or lower) on 34% of the exam papers re-scored by the Review Team. Three raw score points can easily scale to ten or more points on the student’s final, converted score. While not detailed in the report, one can well imagine that “bubble students’” tests were most prone to this higher level of score inflation to ensure they passed the raw score hurdle to receive a converted score of 65 or more.
4. For Exam A, the locally reported total scores were at least three raw score points higher (or lower) on 17% of the exam papers re-scored by the Review Team.
5. Exam B contained two five-point essay questions. The locally reported scores on these two questions were higher (or lower) than the Review Team’s re-scoring in 47% and 43%, respectively, of the exam papers reviewed.
6. Eighteen of the 192 selected schools failed altogether to submit their requested Exam A papers, and 20 of 205 did not submit their Exam B papers. Even the Comptroller’s audit report suggests that these compliance failures might be attempts, as it puts it, “to avoid scrutiny.”
7. Review of SED’s procedures for follow-up on privately-lodged complaints of scoring fraud or irregularity found no evidence that twelve of them had ever been investigated. Thus, even an honest teacher who whistle-blows on scoring fraud has virtually no guarantee that SED will conduct any investigation whatsoever. The door for cheating, fraud, or just looking the other way on exam grading is wide open and seemingly encouraged by SED’s actions and lack thereof.
Combine this pattern of fraudulently inflated grading with the persistent dumbing down of Regents exams and the concomitant lowering of the raw score needed for a passing scaled score grade, and the end result is an examination system that is utterly meaningless as a measure of knowledge or understanding. Even worse, as the Comptroller’s report makes clear, SED has failed completely to follow up on any of these issues, despite having in hand another report from 2003/2004 detailing almost exactly the same problems.
What has become clear in the past five or six years (noticeably since the advent of NCLB) is that the NY State Regents examination system, once a moderately respectable measure of academic achievement, is now broken almost beyond repair. As long as the numbers are up, everyone rests easy; nobody seems to care that they are meaningless, as witnessed by the high levels of remediation required of first-year college students produced by our state education system. As usual, the losers in this breakdown are the students and their sadly unaware parents.
It seems clear as well that the time has come for a major investigation and overhaul of SED and the Regents system. Governor Paterson and others in Albany, when will you wake up and start doing what’s right for the children in your state?