The results were expressed in grade equivalents, which allowed one to see that rather than progressing, both groups had fallen further behind in terms of grade level.
The study found a slight but statistically significant advantage in average reading-comprehension growth among second graders at a sample of roughly one third of the first- and second-cohort schools with access to literacy coaches, compared with second graders at similar schools without them. No significant gains were found overall, or in the other two areas measured, word decoding and word knowledge.
What this may suggest is that although students at schools with access to coaches showed significantly greater growth in comprehension, no significant difference emerged when students of teachers who actually received coaching were compared with students of teachers who did not.
This passage is similarly hard to understand:
Moreover, the more coaching a teacher received, the more growth the students had, on average. The difference between students whose teachers had more coaching and those who did not was statistically significant.
Again, no data are provided to back up either claim, and no definition of "more coaching" is offered that would allow one to evaluate the second statement. How much more coaching was needed before a teacher's students exhibited statistically significant test-score gains?
Nor is any information provided as to whether the coaches believed their training was useful, even though a third of school leaders reported that "the coach was out of the building too often for professional learning."
All in all, a more comprehensive study, with far more data provided, is needed before one can be confident that the program delivers benefits to kids worth its cost. And such an analysis would be far more credible coming from an experienced, independent research outfit like RAND.