
Sunday, November 16, 2008

Don’t Like the Results? Change the Scale!

A recent Daily News editorial regurgitated its usual “retain Mayoral control” mantra (under the unintentionally ironic moniker, Save Our Schools) by citing the “perfectly visible” progress shown by this year’s DOE school report cards.

Leave aside for the moment the patent absurdity of buying into a measurement system that allows the agency in question to measure itself while incentivizing its employees in every conceivable way to cheat and otherwise bend rules to make their numbers. Leave aside as well the debatability of a rating system in which 80% of the schools being measured are graded as B or better, a remarkable year-to-year increase of some 18 percentage points. Rather, let’s just look at the numbers behind how those grades were established.

In 2006-07, the DOE established conversion scales at each level (elementary, middle, and high school) under which each school’s raw score on a roughly 100-point scale would be converted to a letter grade. This year, the DOE has apparently reset these scales significantly downward for elementary and high schools, enough to account for the great majority of the gains at these two levels. In other words, when you control all the data and define your own success, and you want better results, simply adjust your scales downward and, voilà, you’ve got them!

Here are the elementary school scale conversion cutoff scores for each letter grade for the last two years (the cutoffs for F scores are irrelevant, since they simply reflect the lowest scoring school in their category).

A – 64.0 (2006/07), 59.6 (2007/08)
B – 49.9 (2006/07), 45.8 (2007/08)
C – 38.8 (2006/07), 32.6 (2007/08)
D – 30.9 (2006/07), 28.4 (2007/08)

Note that it takes 4 to 6 fewer raw-score points this year for a school to achieve an A, B, or C. Last year, the DOE rated 70.5% of elementary schools as A or B level. This year, it claimed improvement to 79.8% of schools achieving A or B scores. However, using the same conversion scale as last year, only 71.5% of elementary schools would merit an A or B, almost no change from last year despite the enormous pressure placed on school principals and teachers.
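To make concrete how a lowered scale turns the same raw score into a better letter grade, here is a minimal sketch using the elementary cutoffs from the table above (the function itself is an illustration, not the DOE's actual conversion code):

```python
# Elementary school letter-grade cutoffs, taken from the table above.
CUTOFFS = {
    "2006/07": [("A", 64.0), ("B", 49.9), ("C", 38.8), ("D", 30.9)],
    "2007/08": [("A", 59.6), ("B", 45.8), ("C", 32.6), ("D", 28.4)],
}

def letter_grade(raw_score, year):
    """Return the highest letter grade whose cutoff the raw score meets."""
    for grade, cutoff in CUTOFFS[year]:
        if raw_score >= cutoff:
            return grade
    return "F"

# The same 47-point school is a C under last year's scale
# but a B under this year's lowered scale.
print(letter_grade(47.0, "2006/07"))  # C
print(letter_grade(47.0, "2007/08"))  # B
```

No school improved here; only the scale moved.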

Almost the same results can be found at the high school level, beginning with the slightly less dramatic reductions in scale conversion cutoff scores:

A – 67.6 (2006/07), 64.2 (2007/08)
B – 48.8 (2006/07), 43.5 (2007/08)
C – 35.1 (2006/07), 34.3 (2007/08)
D – 28.9 (2006/07), 29.7 (2007/08)

Last year, the DOE rated 65.5% of high schools as A or B level. This year, that percentage increased to a mind-bending 83.1%. However, if the DOE had applied the same letter grade conversion scale as last year’s, only 70.8% of high schools would have received A or B grades, and 6% would have received D or F grades. Roughly two-thirds of the percentage point improvement in high schools achieving A or B grades came from lowering the cutoff scores!

The DOE had announced plans earlier this year to modify its school rating system, so year-to-year comparisons are not necessarily straightforward. Nevertheless, for reasons that have not yet been made clear, the score cutoff scales for elementary and high schools have been adjusted significantly downward while those for middle schools remained nearly the same. Perhaps the rationale is mathematically legitimate, but the action is so suspect and the effect so pronounced that further explanation is warranted. Otherwise, lowering the grading bar has the appearance of being little more than an effort to influence the publicly reportable “success” of Mayoral control and the Chancellor’s accountability regime.

Given the DOE's tight control over data, its lack of accountability to any higher review body, and its general opacity to outside view, how is the public to know whether the reported increase in A- and B-rated schools is real or simply illusory? Assuming, that is, that you accept the legitimacy of the grades themselves, an already dubious proposition.

1 comment:

Anonymous said...

Here are the changes DOE made. The conversion is listed in the fourth bullet from the bottom, apparently to adjust for other changes DOE made?

Final High School Progress Report Changes
High schools

The following is a list of changes that will be made for the 2007-08 high school Progress Reports:

• Peer index – Modify the peer index to incorporate two school-level demographic factors: the percentage of special education students and the percentage of students who start high school over-age by 2 or more years. This approach changes each school’s existing peer index (the average student proficiency based on 8th grade ELA and math test scores) by subtracting from that number the school's percentage of special education students (weighted twice) and its percentage of over-age (2+ years) students:

Average student proficiency – (2 x percentage of Special Education students) – (percentage of over-age students)

Example for a school with an average student proficiency of 3.38, 12% Special Education students, and 5% over-age students (2+ years): 3.38 – 2 (0.12) – 0.05 = 3.09
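The peer-index arithmetic can be checked in a couple of lines (the function and variable names are mine; the formula is the one stated in the bullet above):

```python
def peer_index(avg_proficiency, pct_special_ed, pct_over_age):
    """Adjusted peer index: average 8th-grade proficiency minus twice the
    special education share minus the over-age (2+ years) share."""
    return avg_proficiency - 2 * pct_special_ed - pct_over_age

# The worked example above: 3.38 - 2(0.12) - 0.05 = 3.09
print(round(peer_index(3.38, 0.12, 0.05), 2))  # 3.09
```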

• Graduation

o Calculate a school's graduation rate using a methodology similar to the one the state uses, minus the state's 5-month rule

o Provide new graduation weights in the “Weighted Diploma Rates” (new graduation codes have been added to ATS to enter this information; see the announcement in last week’s Principals' Weekly):

§ A 1.0 weight (equivalent to a Local Diploma) for IEP diplomas for special education students who qualify for the New York State Alternate Assessment (NYSAA)

§ An additional 0.5 weight for the following graduation distinctions: Career and Technology Education-endorsed diplomas, Regents Diplomas with Advanced Designation through the Arts, and Associates Degrees

o Double the weight of any graduation distinction (Local Diploma or higher) for any special education student or any student who starts high school over-age by 2 or more years
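A rough sketch of how these weighting rules might combine for a single graduate. Only the 1.0 Local/IEP weight, the 0.5 distinction bonus, and the doubling rule come from the bullets above; the base weight for a Regents diploma, and the order in which the bonus and the doubling apply, are my assumptions:

```python
# Base diploma weights. The 1.0 Local/IEP weight comes from the bullets
# above; the 2.0 Regents weight is a hypothetical placeholder.
BASE_WEIGHTS = {"IEP (NYSAA)": 1.0, "Local": 1.0, "Regents": 2.0}
DISTINCTION_BONUS = 0.5  # CTE-endorsed, Advanced Designation through the Arts, Associate's

def diploma_weight(diploma, has_distinction=False, special_ed_or_over_age=False):
    """Weight one graduate's diploma for the Weighted Diploma Rate (sketch)."""
    weight = BASE_WEIGHTS[diploma]
    if has_distinction:
        weight += DISTINCTION_BONUS
    if special_ed_or_over_age:
        weight *= 2  # doubling rule for special ed / over-age (2+ years) students
    return weight
```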

• Additional credit

o Lower the minimum number of students required for additional credit eligibility from 20 to 15 so that more schools qualify

o Introduce three new additional credit measures:

§ The percentage of students in the lowest third citywide who score 75 or higher on the ELA Regents (75 is the CUNY cut-off to exempt students from taking remedial college classes)

§ The percentage of students in the lowest third citywide who score 75 or higher on the math Regents

§ The percentage of students in the lowest third citywide who graduate with a Regents Diploma or higher

Schools can earn up to two points for each additional credit measure.

• Weight of Peer and City Horizons – Each school’s results are compared to other high schools in its peer group and citywide. Currently, a school’s results compared to its peer group are weighted twice as much as its results compared to the city. The weights are changing so that a school’s results compared to its peer group are weighted three times as much as its results compared to the city.
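The change amounts to shifting the peer-group side of a weighted average from 2:1 to 3:1; a sketch (function and variable names are mine):

```python
def blended_score(peer_score, city_score, peer_weight=3, city_weight=1):
    """Weighted average of a school's peer-group and citywide comparisons.
    Old rule: peer_weight=2; new rule: peer_weight=3."""
    return (peer_weight * peer_score + city_weight * city_score) / (peer_weight + city_weight)

# A school that looks good against its peers (80) but weak citywide (40)
# gains from the heavier peer weighting:
print(blended_score(80, 40, peer_weight=2))  # old 2:1 rule -> ~66.7
print(blended_score(80, 40, peer_weight=3))  # new 3:1 rule -> 70.0
```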

• Category Weights – use the following weights for each category:
o School Environment (15 points) (currently 15 points)
o Student Performance (25 points) (currently 30 points)
o Student Progress (60 points) (currently 55 points)

• Progress Measures – Associate the lowest third progress measures with credit accumulation, not the “Weighted Regents Pass Rates.”

• Time Period of Measurement – Evaluate graduation, credit, and Regents measures on a September to August school year calendar. Summer 2008 results would count on this year’s Progress Report.

• Letter Grades by Progress Report Category – Add letter grades (A-F) for each section of the Progress Report (School Environment, Student Performance, and Student Progress)

These changes will have the following effects on next year's Progress Report:

• Grade Cutoff Scores: Grade cutoff scores will be adjusted to coincide with what the grade distribution would have been for 2007-08 using the 2006-07 Progress Report rules.

• Peer Groups: Schools will be assigned new peer groups for next year based on the new peer indexing methodology.

• Peer and City Horizons: Peer and city horizons will be updated to take into account the new peer groups, the revised metric definitions, and an additional year of data.

• Targets: The 2006-07 rules will be used to determine whether a school met its pre-existing target; going forward, new targets will be based on the 2007-08 rules.