Reading Recovery, parents and experts

Debate has been joined (after a fashion) over Reading Recovery on the Learning & Reading Disabilities blog and the Wrightslaw site. In two recent posts ("Should I Stop My Child's School from Using Reading Recovery to Teach Him to Read" and "What Mistake Do Parents of Children At-Risk for Reading Disabilities Make About Reading Recovery"), L&RD defends Reading Recovery against criticisms in the Wrightslaw article "Reading Recovery: What Do School Districts Get for Their Money?" by Melissa Farrall, Ph.D. Farrall argues that the scientific evidence doesn't support Reading Recovery's effectiveness; that it doesn't help enough kids (because a meaningful percentage of kids don't complete the program successfully); that it doesn't help kids enough (because the threshold for "success" in the program can be set too low); and that it's too expensive. L&RD counters that Reading Recovery has been validated by scientific evidence (including a 2004 meta-analysis of multiple studies, not cited by Farrall) and by the What Works Clearinghouse, and that while Reading Recovery may not be "perfect" (no program is), parents should see whether Reading Recovery works for their child before rejecting it untried.

Since Reading Recovery and the "wars" over its effectiveness have been around a while, the fact that these battles continue to rage indicates that neither side can dispositively claim victory. If Reading Recovery didn't work at all, that would have become clear enough by now, anecdotally and scientifically; if Reading Recovery were a silver-bullet solution to reading remediation, it's hard to imagine that educators, researchers and parents wouldn't all be united in disseminating it as widely as possible. There's no reason to doubt the sincerity and veracity of parents who report that Reading Recovery worked wonders for their child, or to dismiss the legitimacy of peer-reviewed studies showing that some schools have implemented Reading Recovery with good results.

But how much should parents defer to "the experts"? Parents are challenging the experts in Seattle, where they have prevailed in litigation to prevent the adoption of "discovery" math, and in California and Connecticut, where they want the right to pull the "parent trigger" to reform failing schools. I believe that parents who step in and take the reins from the professionals entrusted with their children's education do so only as a last resort; parents would much rather trust that the interests, goals and judgments of experts are aligned with their own. To earn and keep that trust, experts need to show their work and intelligibly demonstrate the reasoning behind their decisions.

I'm inclined to think that well-designed studies are useful for ruling out programs that have been proven ineffective, but it's not clear to me that the studies demonstrating Reading Recovery's effectiveness speak to whether its success is readily replicable in the vastly different environments (school districts, schools, classrooms) in which it's implemented. It's not enough to learn that Reading Recovery works well for those for whom it works well. In evaluating a program's efficacy, we should know more about the children who were supposed to be, but weren't, helped by it; whether other interventions could have worked for them during their year in the program; and how many other children could have been helped by resources funded with the money spent on the ineffective intervention.

Our school district has used Reading Recovery since the 1989-90 school year, with declining success over time:

Last school year, 349 students in the district qualified for Reading Recovery; 262 students participated in Reading Recovery; of those students, 110 were successfully "discontinued" from the program. (This success rate of 42.0% is significantly lower than the 58.5% national success rate last reported by Reading Recovery.) Last year's Reading Recovery program cost the district approximately $1,271,200, or $4,852 per student enrolled in the program, or $11,556 per student successfully "discontinued" from the program.

Reading Recovery resources in the district are directed toward schools with higher poverty levels, and 83% of the students enrolled in Reading Recovery last year (approximately 217 of the 262 students) were low-income students. Despite this focus, reading proficiency levels for low-income students in our district, measured at the first state assessment opportunity following early-grade intervention, have been on the decline (the following information is from the DPI web site):

Because the district's Reading Recovery resources are allocated among higher-poverty schools, there are students who meet Reading Recovery criteria (i.e., in the lowest 20% of readers) who don't get enrolled if their school is not a "Reading Recovery school", and students who are "overqualified" for Reading Recovery who do get enrolled to "fill out" the program group at their school. (The latter phenomenon may not be a good thing: it's reported that "overqualified" students in the program may not progress in text reading level on pace with their non-Reading Recovery peers.) It also appears that some students in the lowest 20% of readers have skill levels too low to allow them to complete Reading Recovery successfully. And although Reading Recovery is implemented in a series of three "rounds" throughout the school year, students in the first round have a much better chance of completing the program successfully than those who enroll in the second or third rounds.

It's been suggested that Reading Recovery resources be redistributed to reach more of the lowest 20% of readers, on the theory that success rates may improve if services are better targeted toward the types of students for whom the program was designed. However, program success doesn't seem to be correlated with participation levels:

The district produced two written reports evaluating Reading Recovery, in 2004 and 2009 (much of the information in this post, including the information in the first and third charts, is drawn from those reports):
From the 2004 report: "Reading Recovery clearly serves a population of needy students based on income and other demographic factors. Reading Recovery serves a population that has deficits in reading ability at the beginning of Grade 1. However, it is not clear that they constitute the lowest 20-25% of all first graders, a stated goal of the program developers."
From the 2009 report: 
"Reading Recovery clearly serves a population of needy students based on income and other demographic factors."
"It is difficult to accurately define the students who are in the lowest 20% but this table [Table 1 of the 2009 report] indicates that there are probably significant numbers of students who need help with literacy but either do not have the opportunity to receive Reading Recovery (not available at their school) or Reading Recovery did not identify them or was unable to place them."
From the 2004 report: "When combining both successful RR students and the unsuccessful RR students over an entire school year of service delivery the overall program impact does not yield statistically significant achievement gains when comparing performance of participants to similar but non-participating students after controlling for intervening affects [sic] (e.g., poverty, special education status, parent education, etc.)."
From the 2009 report: "When combining all Reading Recovery students over an entire school year of service, the overall program impact does not yield statistically significant achievement gains when comparing the performance of participants to similar but non-participating students after controlling for intervening affects [sic] (e.g., poverty, special education status, parent education, etc.)." 
From the 2004 report: "There appears to be a need to develop a better method for identifying students who could be successful in this program. In 2003-04, more than half of all Reading Recovery students were not discontinued. Targeting services more efficiently to students who benefit most is strongly encouraged. The longitudinal data provided by the district’s Primary Language Arts Assessments (PAA) may be a useful resource in developing such predictive tools while simultaneously affording meaningful instructional and diagnostic information."
From the 2009 report: "It is recommended that Reading Recovery teachers utilize the access they have to the student information system to record enrollment as well as observation summary data on all students eligible for Reading Recovery and those who receive the intervention. This should also be tracked in the Student Intervention Monitoring System for reference by other staff involved with an individual student's literacy programming. This would allow the development of better predictive models that accurately identify students with a high likelihood of success in the Reading Recovery program." 
The 2009 report recommended that the Reading Recovery program continue, with efforts to achieve fuller implementation among the targeted population, and then be re-evaluated within two years. In light of the history and results of this program, I found this recommendation astonishing, but was thankful when the school board directed district staff to revisit and report back on the district's reading programs as a whole before proceeding further. I hope for the best in the discussions and decisions that ensue.
