Is the "Buzz" About Knowledge Justified by the Evidence (Part 1)?
In short: yes. But it's complicated.
Some argue there’s no good evidence that building students’ knowledge improves their reading comprehension. That claim overlooks the near impossibility of measuring comprehension in the abstract—and it risks condemning many kids to academic failure.
In a recent column for the Hechinger Report titled “The Buzz Around Teaching Facts to Boost Reading Is Bigger Than the Evidence for It,” education journalist Jill Barshay reviews what she sees as the evidence for knowledge-building curricula. She finds it lacking, or even non-existent.
Education commentator Robert Pondiscio has already offered one response to the column, which I highly recommend. I'm offering a deeper dive into the research base, in two parts; today's post is the first.
Barshay makes a few good points. For example—despite the misleading headline—she notes that a knowledge-building approach requires more than just teaching a bunch of random facts. Rather, she observes, it involves having students acquire knowledge in a logical sequence across grade levels, gaining useful general academic vocabulary as they go.
But fundamentally, I would argue—as does Pondiscio—that the column presents a misleading picture of the evidence for building knowledge. Barshay is generally a careful reporter who approaches education research with a healthy skepticism, and I’ve often relied on her work for my own thinking and writing. In this instance, though, I’m left scratching my head.
I’m not going to dwell on what seems an unjustifiably hostile characterization of what a knowledge-building curriculum looks like in practice. Suffice it to say that it’s not about “forcing children to learn a specific set of facts” or “stuffing” them with knowledge, which conjures up images of kids being held down like geese being readied to produce foie gras. As I’ve seen in classrooms and heard over and over again from teachers, kids actually enjoy acquiring knowledge, if it’s presented in an engaging way.
I’ll stick to the claims about the evidence, or supposed lack thereof. “Content skeptics,” Barshay writes, “point out that there’s never been a study to show that increasing knowledge of the world boosts reading scores.” Barshay doesn’t just say that’s what these skeptics claim. She presents the lack of any such study as fact.
The Grissmer Core Knowledge Study
She then brings up a study led by University of Virginia researcher David Grissmer, the results of which were reported last year. That study, which I’ve previously written about here and here, followed students who applied, via a kindergarten lottery, to nine charter schools in Colorado that based instruction on a knowledge-building framework called the Core Knowledge Sequence. One group—the “treatment” group—got into the schools; the other group didn’t get in and went elsewhere, serving as the “control” group.
Following these students from kindergarten through sixth grade, the study found that the Core Knowledge students outperformed the control group on state reading tests by 16 percentile points—a gain large enough that, according to the researchers, it’s equivalent to moving the performance of U.S. 13-year-olds from mediocre to near the top on an international test. That certainly seems like evidence that building kids’ knowledge of the world can boost reading scores.
But Barshay dismisses the study primarily because it was done in charter schools. Maybe, she suggests, it wasn’t the curriculum itself that made a difference. “Perhaps [the schools] had hired great teachers and trained them well,” for example. But even if that’s the case, does the study really tell us nothing about the potential of a knowledge-building curriculum? Doesn’t it indicate that it can significantly boost reading scores at least in the context of charter schools?
Actually, the findings may be even more limited than that. These schools weren't using a commercially available knowledge-building literacy curriculum, like Core Knowledge Language Arts. The Core Knowledge Sequence is less detailed than an actual curriculum, and it also goes beyond language arts or reading to cover a range of subjects.
In addition, according to a co-investigator who worked on the Grissmer study, the schools in the study weren’t just any old charter schools. They were fiercely committed to a knowledge-building approach.
Thomas G. White, who visited the schools many times in connection with the study, said it wasn’t necessarily the case—as Barshay speculated—that the schools had “hired great teachers.” The teachers were less experienced and lower-paid than their counterparts in regular public schools. But they had been hired because they were committed to the Core Knowledge approach.
“The teachers were on board, and there was lots of cross-grade interaction,” White told me. “You need coherence, you need time, you need low teacher turnover, you need sustained leadership. You need, above all, a common commitment to Core Knowledge.”
So the study’s findings might not generalize to commercially available knowledge-building literacy curricula or to all charter schools. But that’s not a reason to dismiss the evidence out of hand. The study shows us what a knowledge-building approach can accomplish in a certain environment—and the approach itself was what that environment was built around.
Knowledge-Building and Low-Income Students
Barshay’s other problem with the Grissmer study is that eight of the nine schools were in middle- or upper-middle-income communities. In theory, lower-income students will benefit from a knowledge-building curriculum the most, because more affluent students are better able to pick up academic knowledge and vocabulary outside school. So the real question, Barshay says, is whether building knowledge boosts scores for children who are reading below grade level.
In the Grissmer study, students at the low-income school did see significantly larger positive results than their counterparts at more affluent schools. But it was only one school, and the sample size there was very small. Nor is it clear that the individual students in the study were low-income. The larger positive results might also reflect the relatively poor quality of regular public schools in that district, which the lottery losers in that part of the sample were likely to attend.
But even if the study can’t tell us anything definitive about low-income students, it does suggest that both affluent and less affluent students benefit from a knowledge-building approach. And there’s at least one long-term study indicating that struggling readers do benefit the most from a knowledge-building curriculum—one that Barshay doesn’t mention.
In that study, which followed almost 9,000 elementary students in a low-income district for three years, children who got the Bookworms K-5 Reading and Writing curriculum showed marked improvement on standardized reading tests, with an effect size of 0.26 by fifth grade—a large effect, in the context of education interventions. As the theory would predict, students who started out weakest experienced the most growth.
A note on research terminology: An "effect size" is a standardized way of comparing results across studies; it expresses the difference between a treatment group and a control group in standard-deviation units. (Statistical significance is a separate question: whether a result provides a reliable basis for concluding that an intervention caused an effect, as opposed to something that could have happened merely by chance.) In the education context, it has proven quite difficult to produce interventions with even modest, statistically significant positive effect sizes, which is why 0.26 counts as large.
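For readers who want to see the arithmetic behind a number like 0.26, here is a minimal sketch of how a standardized effect size (Cohen's d, the most common variant) is computed. The test scores below are invented purely for illustration; they are not from any of the studies discussed.

```python
# Illustrative sketch: Cohen's d, a standard "effect size" measure.
# It is the difference between the treatment and control group means,
# divided by the pooled standard deviation of the two groups.
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference between two groups."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    # Pooled standard deviation weights each group's variance by its size
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical reading scores: the treatment group averages a few points higher
treatment_scores = [78, 82, 85, 80, 88, 84]
control_scores = [75, 79, 81, 77, 83, 80]
d = cohens_d(treatment_scores, control_scores)
```

Because the difference is expressed in standard-deviation units rather than raw test points, effect sizes from studies that used different tests can be compared on a common scale.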
I have more to say on the general topic of evidence for knowledge-building—as did Barshay—including an analysis of a recently reported study of CKLA. But I’ll save that for my next post.
researchED in New York City
In the meantime: if you’re interested in how cognitive science applies to education, consider attending the next researchED conference, which will be held in New York City on March 29th. I’ll be speaking there about my new book, alongside a stellar line-up of researchers and educators providing relatively brief (40-minute) but meaty presentations.
This year’s presenters include Patrice Bain, Zach Groshell, Kate Jones, Holly Korbey, Kristen McQuillan and Barbara Davidson (of the Knowledge Matters Campaign), Molly Ness, Tom Sherrington, Alexandra Chalonec and Christine Teahan (of The Writing Revolution), and Glenn Whitman. Keynote speakers are Jim Heal, Meg Lee, and Tom Bennett—founder of the international researchED movement. You can find the full list, and register for a relatively modest fee of about $70, here.
These conferences are, in my experience, both illuminating and a lot of fun. Hope to see you there!
I don’t see a huge problem with results arising from teachers’ commitment to an approach. Isn’t that how programs are meant to work? So we can deduce that if it’s implemented with enthusiasm and fidelity, it works. If it were implemented in a half-hearted way, we would get no result. We have to assume that if we put energy into something rigorous, it will work.
This is a useful article regarding the research on a knowledge-based curriculum. My view is that a knowledge-based curriculum has to be centered on helping students better understand the world around them, through key understandings in each subject area. These form the basis for learning and remembering knowledge and facts. For example, framing learning about genetics as the "conflict between nature and nurture" helps give meaning to learning biology and makes it interesting and important. The more we can organize content learning around key understandings, the more the learning becomes meaningful, interesting, and worth remembering. I hope that Natalie Wexler will write more about this idea in the future.