Comments on NYC Public School Parents: More Editorial Nonsense in the Major NYC Newspapers

[Three comments removed by the author. AOD, 2008-02-27]

Leonie Haimson (2008-02-19):
Another major flaw in the school grading system: not only is it based primarily on one variable, as Steve points out, but on one year's worth of that variable -- the difference between last year's test scores and the year before's.

To keep the baseball analogy: a hitter could hit 43 home runs with a .306 average, but if he hit 45 home runs with a .325 average the year before, he would receive a lower grade than a player who improved on his previous year, even if that player hit no home runs and batted .206. I don't think many people would consider this a good way to measure athletic prowess.

This is why certain NYC schools with 80-90% of kids at grade level got Fs, while others with up to 70% of kids below grade level -- schools on the state failing list -- received As.

Leonie Haimson (2008-02-12):
The author of the op-ed that Steve so adeptly skewers is Kevin Carey, formerly of Ed Trust and now of Education Sector.

Kevin Carey's response is here:
http://www.quickanded.com/2008/02/dont-be-questioning-my-bill-james-itude.html

It's amazing how statistically illiterate these Beltway education pundits are.

Steve Koss (2008-02-08, 23:39):
My apologies to OTAC, I think. I read his comment as a response to my posting, while he/she was apparently quoting from Carey's blog. I didn't see any quotation marks, so I took the comments to be OTAC's own response.

My disagreement with Carey's blogged response is that I never argued about the use or non-use of multivariate modeling per se. Suggesting THAT as the point of contention only misdirects the discussion into irrelevant technical considerations and away from the real fallacy of the Billy Beane baseball comparison. My point was that measuring teachers solely by a single variable -- whether absolute achievement or relative progress against mathematically adjusted expectations -- is only weakly comparable, at best, to the multifaceted analysis baseball statisticians like Billy Beane employ to evaluate players' "value added" performance. Efforts like Mr. Carey's to legitimize the DOE's teacher evaluation plans in the public's mind with a baseball analogy are facile and appealing for their recognizable content, and maliciously inappropriate and misleading for the very same reasons.

[Comment removed by a blog administrator. Patrick Sullivan, 2008-02-08]

Steve Koss (2008-02-08, 18:10):
I beg to differ with the comment from Mr. OTAC. All the variables he mentions, while commendable in themselves, serve only to establish a rationalized basis for comparison -- the sort of analysis one usually sees described in the media as "adjusted for other factors." The truth is that after all these adjustments are made, we are STILL looking at just one variable: the student's measured gain on a standardized test score. This is no more sensible in Mr. Carey's baseball context than evaluating every player's yearly home run production against what could be expected, adjusted for height, weight, age, years of experience, and so on. No matter how you adjust it, evaluating a major leaguer (or his manager) on a single dependent variable -- home runs -- only serves, as I wrote, to pervert the multifaceted nature of what you are trying to measure. I resolutely stand by my critique of Mr. Carey's analogy as totally inappropriate to education and horribly misleading to parents not fortunate enough to understand the mathematical intricacies of multivariate regression modeling.

OTAC (2008-02-08, 14:30):
Interestingly, he comments on your post over on his blog, but you can't leave a comment there. Again, he's dead wrong. He claims that NYC's measures are multivariate, but then goes on to show that you are right -- the ONLY *dependent* variable is the test score:

"Moreover, Koss doesn't know what he's talking about. The NYC value-added measures are not 'derived from a single variable,' they're exactly the kind of complicated multivariate measure he describes. As the NY Times reported:

'The city's pilot program uses a statistical analysis to measure students' previous-year test scores, their numbers of absences and whether they receive special education services or free lunch, as well as class size, among other factors. Based on all those factors, that analysis then sets a "predicted gain" for a teacher's class, which is measured against students' actual gains to determine how much a teacher has contributed to students' growth.'"

David (2008-02-08, 10:12):
GREAT column, Steve. Seriously, is there a Pulitzer for blog posts? Apart from the particular position, the column is a joy to read: informative and inventive. This deserves the widest readership. Thanks, Steve!

David M. Quintana (2008-02-07):
Great piece, Steve. How ridiculous, comparing our children to millionaire baseball players.
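[Editor's note: the value-added scheme the commenters are debating can be sketched in a few lines, which also makes Steve Koss's point concrete: however many covariates go into the prediction, the quantity being modeled is still a single number, the test-score gain. The function, coefficients, and data below are invented for illustration; this is not the DOE's actual model.]

```python
# Sketch of a "value added" estimate as described in the NY Times quote:
# a fitted regression predicts each class's test-score gain from covariates
# (prior score, absences, free-lunch share, class size), and the teacher's
# "value added" is the gap between the actual and the predicted gain.
# All numbers here are hypothetical.

def predicted_gain(prior_score, absences, pct_free_lunch, class_size):
    """Hypothetical fitted model: expected score gain for a class."""
    return (12.0
            - 0.01 * prior_score      # higher prior scores leave less room to gain
            - 0.3 * absences          # more absences, lower expected gain
            + 2.0 * pct_free_lunch    # adjustment for poverty share
            - 0.1 * class_size)       # adjustment for class size

classes = [
    # (teacher, prior_score, absences, pct_free_lunch, class_size, actual_gain)
    ("A", 650, 4, 0.8, 28, 9.5),
    ("B", 710, 2, 0.2, 22, 3.0),
]

for teacher, prior, absent, lunch, size, actual in classes:
    expected = predicted_gain(prior, absent, lunch, size)
    value_added = actual - expected
    print(f"Teacher {teacher}: expected {expected:+.1f}, "
          f"actual {actual:+.1f}, value added {value_added:+.1f}")
```

Note that despite the many explanatory variables on the right-hand side, the outcome on the left-hand side is one dependent variable, the score gain -- which is the distinction the thread turns on.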