Showing posts with label SAT. Show all posts

Saturday, May 7, 2016

How you can help unmask the Common Core by asking your Commissioner about computer scoring of the PARCC/SBAC exams

Please take a look at my piece on computer scoring of the PARCC and SBAC exams in a special report of the Network for Public Education; also posted at the Washington Post AnswerSheet.

To summarize: neither Pearson, in charge of administering and scoring the PARCC exams, nor AIR, in charge of the SBAC exams, has publicly told parents that their standard contracts call for half or more of their children's writing responses to be graded exclusively by robo-scorers and never seen by a human being.  Yet these states also have the option of having the exams scored by humans at an extra charge.

Why is this important?  No computer program has so far been shown to have the ability to distinguish total nonsense from coherent prose.  Instead, robo-scorers grade prose according to its length and how much abstruse vocabulary it uses.  As Les Perelman, an expert on computer scoring, has warned, the inability of machines to distinguish gibberish from substance may lead teachers and test prep companies to train students to game the system by writing verbose and nonsensical prose -- the opposite of what the Common Core standards are supposed to encourage.

This reality unmasks the claims of proponents that the Common Core and its allied assessments are really aimed at encouraging critical thinking and more advanced skills.  Instead, as I conclude, "the primary objective of Bill Gates and many of those promoting the Common Core and allied exams is to standardize both instruction and assessment and to outsource them to reductionist algorithms and machines, in the effort to make them more efficient. Essentially, the point of this grandiose project imposed upon our nation’s schools is to eliminate the human element in education as much as possible."

One correction to the piece: contrary to my implication that ETS uses computers exclusively to score writing on the GRE exams, the company informed us that a trained human reader and a computer score every writing sample. If the human and computer scores closely agree, the average of the two is used. If they disagree, a second human reader is asked to assess the sample, and the final score is the average of the two human scores.

We wrote every Commissioner from the PARCC and SBAC states to ask whether they were using automated scoring, and only those from the following states responded: Oregon, Massachusetts, Nevada, and West Virginia. These officials informed us that their states were not using automated scoring at this time.

Wyoming Superintendent Jillian Balow also replied to our letter that she shared our concerns about automated scoring and that her state was no longer using the SBAC exam. Colorado Commissioner Richard Crandall responded to a parent activist in his state that Colorado was using the standard PARCC method of having the writing samples scored by a computer two thirds of the time, with only ten percent of those re-checked by a human being, and argued this was necessary to maximize "efficiency."  (If you'd like to read them, their responses in full are linked from my report.)

Though Rhode Island Commissioner Ken Wagner, formerly NY Deputy Commissioner, did not respond to our letter, that same day he made the astonishing claim in a public forum that his state was also using computers to score two thirds of the writing samples on the PARCC exam, because research has shown that "computers did a better job scoring than even expert trained teachers."  He also added that “SAT, GRE, GMA, those kinds of programs have been doing this stuff for a very long time.”

Yet as mentioned above, every writing sample on the GRE is scored by both a computer and a human being, and the College Board uses trained human scorers exclusively on writing samples for the SAT and AP exams.

The following 18 states and districts have failed to respond to our letter as to whether they are using computers to score writing samples on their PARCC or SBAC exams: CA, CT, DE, DC, HI, ID, IL, LA, MD, MI, MT, NH, NJ, NM, ND, SD, VT, and WA.

If you are a parent, teacher or advocate in one of these states, please send your Commissioner of Education these questions:
  • What percentage of the ELA exams in our state are being scored by machines this year, and how many of these exams will then be re-scored by a human being?
  • What happens if the machine score varies significantly from the score given by the human being?
  • Will parents have the opportunity to learn whether their children’s ELA exam was scored by a human being or a machine?
Their email addresses are posted here.  And please let us know if you get a response by emailing us at info@studentprivacymatters.org and/or posting their reply below.  Thanks!  

Saturday, September 3, 2011

The top NYC public high schools in terms of college-readiness and SAT scores

Here is a file of NYC high schools ranked by the percentage of their students in 2010 who graduated “college-ready,” which the state education department estimates as scoring at least 75 on the English Regents exam and 80 on Math – called the "aspirational performance measure," or APM.  Here is a file ranked by their students' 2010 SAT scores.  UPDATE: here are files with 2011 SAT scores and AP scores.
Students scoring lower than this on the APM, according to the state, are likely to need remediation in college.  The spreadsheet also disaggregates this percentage by ELL and special education status, gender and ethnicity.
Lots of caveats before you interpret the APM or SAT list as a reliable ranking of the quality of NYC high schools:
  • To a large degree, these results are determined by the selectivity of the schools' admissions process – not the quality of the school itself.  In other words, schools with the highest percentages of college-ready graduates tend to be those that admit the highest-achieving 8th graders in the first place.  (See this recent paper, for example, that suggests that attending a highly selective high school like Stuyvesant or Bronx Science does not appear to increase SAT scores, college enrollment or college graduation rates.)
  • Schools with fewer than 20 students in any cohort are listed with an “s” for suppressed.

  • Some NYC high schools are alternative/portfolio schools and do not take the math Regents; so they are omitted from the APM list.

  • Regents scores are not in themselves wholly reliable indicators, since schools grade these exams themselves; now that the city has announced that it will use the college-ready percentage in its accountability system, this measurement will become even less reliable in the future.

  • These figures do not take into account the high dropout and/or discharge rate at many high schools; thus, one way a high school might be able to elevate its score is by pushing many low-achieving students out.

  • In any case, test scores in isolation are never a reliable gauge of achievement or actual learning.   
Still, I think it’s interesting and worthwhile for parents to have access to this information.  The statistics overall are lamentably low.  Statewide, only about 37% of students graduate from high school college-ready; in the city, the figure is even lower, at 21.4%.
Here is an article about the low college-readiness percentages of some NYC high schools with high graduation rates; here is a link to the NYSED explanation of these scores.  See this NY Post article that ranks the top 50 high schools in NYC using several academic measures.  Here is the DOE webpage with AP results as well.