Wednesday, July 23, 2008

Comparative NAEP Results Help Cut through the DOE's Smoke and Mirrors

Two more news articles in the last two days, one from a middle school principal in Cleveland and the other from a Social Studies teacher at Jamaica High School in Queens, decried the simultaneous overemphasis on and dumbing down of standardized tests. These and other recent testimonials from “the front lines” arrive while Mayor Bloomberg and Chancellor Klein, with the television advertising help of their unofficial public relations organ, the Fund for Public Schools, continue basking in their self-proclaimed remarkable progress in the City’s public school system.

Leaving aside for the moment the nontrivial question of whether the NYC and NYS standardized test results measure any type of academic progress other than an increased ability to take standardized tests, can we compare New York City’s progress since 2003 against that of other large American cities? Happily, the answer is an unqualified “Yes.” Happily, because the comparison vehicle is none other than the widely acknowledged “gold standard” of American educational achievement, the National Assessment of Educational Progress, or NAEP, sponsored by the U.S. Department of Education’s National Center for Education Statistics. In 2003, 2005, and 2007, eleven major cities – Atlanta, Austin (since 2005), Boston, Charlotte, Chicago, Cleveland, Houston, Los Angeles, New York City, San Diego, and Washington, D.C. – participated in assessments of Math and Reading at Grades 4 and 8 under its Trial Urban District Assessment (TUDA) program.

How, according to NAEP TUDA, has New York City fared during these years of amazing academic progress? Poorly, to be charitable. Using the NAEP TUDA’s reported results for the percentage of students at or above a Basic level of achievement, one can rank New York City’s gains (or losses) from 2003 to 2007 against those of the other ten cities. Here’s a summary of how we ranked out of the 11 cities, with a sketch of the calculation following the lists below (in categories where one or more cities did not have data reported, the total number of cities is noted in parentheses):

Progress in 4th Grade Reading Among
-- White students – 11th
-- Black students – 2nd
-- Hispanic students – 8th (out of 10)
-- Free lunch eligible students – 6th
-- Free lunch not eligible students – 9th (out of 10)

Progress in 4th Grade Mathematics Among
-- White students – 5th
-- Black students – 2nd
-- Hispanic students – 3rd (out of 10)
-- Free lunch eligible students – 3rd
-- Free lunch not eligible students – 8th (out of 10)

Progress in 8th Grade Reading Among
-- White students – 7th (out of 9)
-- Black students – 11th
-- Hispanic students – 10th (out of 10)
-- Free lunch eligible students – 10th
-- Free lunch not eligible students – 10th (out of 10)

Progress in 8th Grade Mathematics Among
-- White students – 9th (out of 9)
-- Black students – 9th
-- Hispanic students – 9th (out of 10)
-- Free lunch eligible students – 11th
-- Free lunch not eligible students – 9th (out of 10)
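
For readers who want to see the mechanics, here is a minimal sketch in Python of the gain-and-rank calculation described above. The percentages are invented placeholders, not the actual NAEP TUDA figures; only the arithmetic is meant to be illustrative.

```python
# Sketch of the ranking method described in this post, using hypothetical
# numbers rather than the actual NAEP TUDA results. For each city, the "gain"
# is the change from 2003 to 2007 in the percentage of students scoring at or
# above Basic in one grade/subject/subgroup category.

# Hypothetical (2003, 2007) percentages at or above Basic for one category
pct_at_or_above_basic = {
    "Atlanta": (38, 48),
    "Boston": (40, 49),
    "New York City": (45, 53),
    "Chicago": (42, 46),
    # ... the remaining TUDA cities would be listed here
}

# Gain = 2007 percentage minus 2003 percentage
gains = {city: p2007 - p2003
         for city, (p2003, p2007) in pct_at_or_above_basic.items()}

# Rank the cities by gain, largest gain first (rank 1 = most progress)
ranked = sorted(gains.items(), key=lambda item: item[1], reverse=True)
for rank, (city, gain) in enumerate(ranked, start=1):
    print(f"{rank}. {city}: {gain:+d} points")
```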

Combining and averaging all twenty of these rankings for every city, New York’s average ranking is 7.60, or 10th out of the 11 cities. Only Austin’s average ranking (8.20) is worse. Take away 4th Grade Mathematics, the City’s only bright spot, and the combined average of the remaining rankings places New York City a distant last among the eleven participating cities.
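
That 7.60 figure is simply the arithmetic mean of the twenty category ranks listed above, as this quick check (using only the ranks from this post) confirms:

```python
# Reproduces the 7.60 average cited above from NYC's twenty category ranks,
# exactly as they appear in the lists in this post.
from statistics import mean

nyc_ranks = [
    11, 2, 8, 6, 9,      # 4th Grade Reading
    5, 2, 3, 3, 8,       # 4th Grade Mathematics
    7, 11, 10, 10, 10,   # 8th Grade Reading
    9, 9, 9, 11, 9,      # 8th Grade Mathematics
]

print(f"NYC average rank: {mean(nyc_ranks):.2f}")              # -> 7.60

# Dropping the 4th Grade Mathematics ranks (the one bright spot) makes the
# remaining average even worse:
without_4th_math = nyc_ranks[:5] + nyc_ranks[10:]
print(f"Without 4th Grade Math: {mean(without_4th_math):.2f}")  # -> 8.73
```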

Averaging each city’s rankings for just 4th Grade Reading and Mathematics, NYC schools achieve the 6th best average ranking out of the eleven cities. NYC rates dead last (11th) in combined 8th Grade Reading and Mathematics average ranking, dead last in combined 4th and 8th Grade Reading-only average ranking, 8th in combined 4th and 8th Grade Mathematics-only ranking, and tied for last and dead last, respectively, in Free Lunch Eligible and Free Lunch Not Eligible average ranking. The 8th Grade results are particularly instructive: the students tested in 8th Grade in 2007 were the 4th Graders of 2003, so their journey from the 2003 NAEP exam to that of 2007 corresponds precisely to the last four years of constantly proclaimed glorious progress under Chancellor Klein and Mayoral control of the public schools, and one might reasonably expect better than a dead last ranking for that cohort. (For those who are curious, Atlanta and Boston are the clear winners in NAEP progress from 2003 to 2007, followed by Houston, Los Angeles, and San Diego.)

By the only available cross-city comparison of urban school district progress during the Mayoral control years, New York City’s public schools badly lag every other district measured but one. NAEP TUDA provides the sole objective means of assessing Chancellor Klein’s incessant and self-serving claims of progress and refuting those soft, fuzzy, and golden-hued television commercials paid for by his friends at the Fund for Public Schools. Based on the one and only standardized test measure we can trust, the only one that has not been dumbed down or politicized into propagandistic irrelevance, we can only give the Mayor and Chancellor an ungentlemanly F for their four years of academic progress.

Hardly the picture one gets from television, the major local press, the UFT, or the national mainstream media, all of whom have long since drunk the Kool-Aid of Mayoral control and academic progress measures that are nothing more than smoke and mirrors. Thankfully, a few honest voices from the front lines, such as those noted earlier and the fourth grade Bronx teacher quoted in my recent posting, are starting to shine through that smoke and shatter those mirrors.

6 comments:

Chaz said...

Don't forget that the national SAT scores have not improved since the Kleinberg administration took over.

Pissedoffteacher said...

An elementary school teacher told me one of the kids in her school answered only 11 out of 30 questions on one of these standardized tests and passed. She did not think it was possible for him to get all 11 correct.

Talk about low standards!

Anonymous said...

I've read that the characteristics of NAEP data -- they come from student samples -- make them unusable for rank ordering the states. I suspect that the data would also be unusable to rank order the "TUDAs". See

http://pareonline.net/pdf/v10n9.pdf

Leonie Haimson said...

NYC insists on using gain scores on its school progress reports; to be consistent, the Bloomberg/Klein administration deserves an "F".

Steve Koss said...

To anonymous: Actually, Dr. Stoneberg's paper does not argue against ranking per se. Rather, he states that because the NAEP and TUDA exam results are all based on sampling, each reported score constitutes only a "point estimate" of the actual State or City score, and likewise for each subgroup. These point estimates are surrounded by a standard error, which one might think of as a cloud of probability within which the "true" value lies (much as voter surveys are announced with "margins of error").

As a result, Dr. Stoneberg correctly argues that one cannot properly rank one state as 9th and another 10th if their standard errors cause their high/low ranges to overlap. He does, however, assert that (in his example) states can acceptably be grouped according to their high/low estimates.

I chose to combine as many as 20 separate group and subgroup rankings with the thought that (a) the "true" values, some higher and some lower than the point estimates, would tend to wash each other out, and (b) regardless of NYC's ordinal rankings, it was clear, as I pointed out in the posting, that Boston, Atlanta, Houston, Los Angeles, and San Diego fell into a grouping well above NYC, Charlotte, Chicago, Cleveland, and D.C. This conclusion is fully consistent with Dr. Stoneberg's article. It is also completely at odds with the impression that the Mayor and Chancellor have been giving the public, an impression that the media have bought into by repeatedly lionizing them both for their "incredible" turnaround of NYC's public school system.
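
For anyone who wants to see what that overlap idea looks like in practice, here is a rough sketch with invented numbers. It only illustrates point estimates bracketed by roughly two standard errors; it is not a reproduction of NAEP's actual procedures.

```python
# Illustration (with invented numbers) of why overlapping error bands make a
# strict 9th-vs-10th ordering shaky. Each point estimate is bracketed by an
# approximate 95% interval of about two standard errors.

def interval(point_estimate, standard_error, z=1.96):
    """Approximate 95% interval around a sampled point estimate."""
    return (point_estimate - z * standard_error,
            point_estimate + z * standard_error)

city_a = interval(point_estimate=62.0, standard_error=1.4)   # hypothetical
city_b = interval(point_estimate=60.5, standard_error=1.6)   # hypothetical

overlap = city_a[0] <= city_b[1] and city_b[0] <= city_a[1]
print(f"City A interval: {city_a}")
print(f"City B interval: {city_b}")
print("Intervals overlap -> ranking A strictly above B is not defensible"
      if overlap else
      "No overlap -> the ordering is defensible")
```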

Anonymous said...

Actually, Stoneberg discourages, in every instance, using NAEP scores to rank order jurisdictions (i.e., states or urban districts) or subgroups thereof. Rank order analyses are unnecessary, if for no other reason than that NAEP has given us an online tool, the NAEP Data Explorer, that performs sound statistical comparisons of the jurisdictions and their subgroups and fully accounts for standard error. Here are some examples of defensible statements supported by the NAEP Data Explorer comparing NYC to other urban districts, using the percentage of grade 4 students scoring at or above Basic in mathematics in 2003 and 2007.

In 2003, the percentage of NYC grade 4 students scoring at or above Basic in mathematics was higher than their peers in six districts (Atlanta, Boston, Chicago, Cleveland, the DC district, and Los Angeles), similar to their peers in two districts (Houston and San Diego), and lower than their peers in one district (Charlotte). Austin did not participate in the 2003 assessment.

In 2007, the percentage of NYC grade 4 students scoring at or above Basic in mathematics was higher than their peers in five districts (Atlanta, Chicago, Cleveland, the DC district, and Los Angeles), similar to their peers in four districts (Austin, Boston, Houston, and San Diego), and lower than their peers in one district (Charlotte).

From 2003 to 2007, the percentage of grade 4 students scoring at or above Basic in mathematics increased in eight urban districts (Atlanta, Boston, Chicago, the DC district, Houston, Los Angeles, New York, and San Diego) and did not change significantly in two districts (Charlotte and Cleveland). None of the urban districts saw this percentage decrease from 2003 to 2007.
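
For the curious, the logic behind those "higher / similar / lower" statements is essentially a test of whether the difference between two percentages exceeds its combined standard error. The sketch below uses invented numbers and a simplified test; the Data Explorer itself works from NAEP's published standard errors and more careful procedures.

```python
# Rough sketch (with invented numbers) of the kind of comparison behind
# "higher / similar / lower" statements: test whether the difference between
# two independent percentage estimates is large relative to its combined error.
import math

def compare(pct_nyc, se_nyc, pct_other, se_other, z_critical=1.96):
    diff = pct_nyc - pct_other
    se_diff = math.sqrt(se_nyc**2 + se_other**2)   # independent samples
    z = diff / se_diff
    if z > z_critical:
        return "higher"
    if z < -z_critical:
        return "lower"
    return "similar (difference not statistically significant)"

# Hypothetical percentages at or above Basic, with their standard errors
print(compare(pct_nyc=72.0, se_nyc=1.3, pct_other=66.0, se_other=1.5))  # higher
print(compare(pct_nyc=72.0, se_nyc=1.3, pct_other=70.5, se_other=1.6))  # similar
```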