“Public elementary and middle schools in New York City led by ‘Aspiring Principals Program’-trained principals have achieved comparable or higher rates of student improvement than schools led by other new principals ... These results were obtained even though APP-trained principals were more likely to be placed in chronically low-performing schools.”
The New York Times chimed in: “Graduates of a program designed to inculcate school principals with unconventional thinking have gone on to help drive up English test scores even though the graduates were often placed at schools with histories of academic failure.” The article went on to explain that the APP graduates helped increase English Language Arts scores at elementary and middle schools “at a faster pace than new principals with more traditional résumés”; while in math the APP principals made progress, but “at a pace no better than their peers.”
What did this report actually say? Written by Sean P. Corcoran, Amy Ellen Schwartz, and Meryle Weinstein of NYU’s Institute for Education and Social Policy, it compared the performance of schools under the leadership of graduates from the Aspiring Principals Program (APP) with that of schools under other new principals.
Both groups had to have been placed as new principals and to have remained in their positions for three years. Of the 147 graduates in the 2004 and 2005 APP cohorts, 88 (60 percent) met the inclusion criteria. 371 non-APP principals met the criteria; of these, 334 were in schools with comparable grade configurations. So there were 88 APP principals and 334 comparison principals in the study.
The schools in the two categories were significantly different. Compared to other new principals, APP principals tended to be placed in lower-performing schools and schools trending downward in ELA and math. There were also demographic and geographic differences.
The study used two types of comparison: (a) a straightforward comparison of average achievement in both types of schools and (b) a regression analysis (controlling for various school and student characteristics). It was the regression analysis that suggested an APP edge in ELA (but not for math) for elementary and middle schools.
The study found that test scores at schools in both groups improved over the period of the study – but not as much as at schools in the rest of the city. More specifically, the regression analysis indicated that the ELA standardized scores of APP elementary and middle schools were relatively stable, compared to schools headed by new principals who were not APP graduates. In math, APP elementary and middle schools fared slightly worse than comparison schools relative to the city, but the differences were not statistically significant.
At the high school level (not mentioned in the NYU press release or NYT article), the differences between APP and comparison schools were “minor and inconclusive.”
There are many questions that the study did not address. Only 88 out of 147 graduates in the 2004-2005 and 2005-2006 cohorts met the inclusion criteria. More than 18 percent of APP graduates were never placed as principals at all; the remainder of those excluded were placed but stayed in their positions for fewer than three years.
Is this a high or low number? The authors wrote that they did not have comparative mobility information for the non-APP principals, but they presumably could have reported the average attrition rate for New York City principals overall.
Also, the study only analyzed test score data – which alone are insufficient to fully evaluate a school’s performance. Wasn’t there other data that could have been examined? What about the parent and teacher surveys at APP-headed schools compared to schools run by other new principals?
Though the study compared the size of the schools for both cohorts (APP graduates on average headed smaller schools), it did not compare class sizes – or other school-level conditions that could have contributed to the relative performance of both groups.
Most intriguing is the finding that the relative test scores at both sets of schools continued to decline compared to the rising achievement of schools citywide, but schools headed by APP principals declined less – at least in terms of their ELA results:
It was only after performing a regression analysis controlling for various factors (including student background) that the authors found that the performance of APP schools was roughly stable while that of the comparison schools continued to decline. See this graph:
Unmentioned in any of the news articles was the fact that the research organization Mathematica had originally been commissioned by the DOE to conduct an in-depth, multi-year study of the Leadership Academy. Yet after several years of analysis, the study was cancelled by the DOE, just months before the results were due to be released. What Mathematica might have discovered about the program and its graduates will probably never be known.
For another close look at this study, see Aaron Pallas’ critique at Gotham Schools.