Boston Study: What Higher Standardized Test Scores Don’t Mean
Students at Roxbury Prep Charter School, which is known for its high achievement test scores, in 2011. (Jesse Costa/WBUR)
The MIT researchers avoid loaded terms like intelligence, so let me be the blunt one and sum up a provocative new Boston-based study coming out soon in the leading psychology journal Psychological Science:
If you’re a kid who’s lucky enough to go to a school that boosts your performance on standardized tests like the MCAS, you’re scoring higher because you know more, but probably not because you’ve gotten smarter. And by smarter, I mean better at certain measurable cognitive skills that psychologists call “fluid intelligence” or “fluid reasoning” — like working memory and problem-solving in a novel situation.
MIT sums up the findings:
In a study of nearly 1,400 eighth-graders in the Boston public school system, the researchers found that some schools have successfully raised their students’ scores on the Massachusetts Comprehensive Assessment System (MCAS). However, those schools had almost no effect on students’ performance on tests of fluid intelligence skills, such as working memory capacity, speed of information processing, and ability to solve abstract problems.
The researchers calculated how much of the variation in MCAS scores was due to the school that students attended. For MCAS scores in English, schools accounted for 24 percent of the variation, and they accounted for 34 percent of the math MCAS variation. However, the schools accounted for very little of the variation in fluid cognitive skills — less than 3 percent for all three skills combined.
Even stronger evidence came from a comparison of about 200 students who had entered a lottery for admittance to a handful of Boston’s oversubscribed charter schools, many of which achieve strong improvement in MCAS scores. The researchers found that students who were randomly selected to attend high-performing charter schools did significantly better on the math MCAS than those who were not chosen, but there was no corresponding increase in fluid intelligence scores.
It will be interesting to see how this study resonates in the eternally contentious discussion about standardized tests and the fraught practice of “teaching to the test.” To get a clearer sense of what the study says about testing — and what it doesn’t — I spoke with the paper’s senior author, MIT neuroscience professor John Gabrieli, of the McGovern Institute for Brain Research. Our conversation, lightly edited:
Let’s begin with the ending: How would you sum up what this study found?
Our core findings were that which school a student attended did influence his or her scores on statewide tests, but it did not appear to influence fluid cognitive abilities: abilities such as how quickly you process novel information, how much information you can juggle in your mind (what people call ‘working memory’), and how well you can apply fluid reasoning to novel problems.
And what were the skills that it did affect?
Schools affected what psychologists call ‘crystallized knowledge’: knowledge of vocabulary and language, knowledge of arithmetic and calculation, the kinds of things that we teach in schools and want students to know.
So in lay language, what school you attend could affect how much you know, but not how smart you are?
Well, I think there are two kinds of “smarts” that psychologists have identified. ‘Crystallized smarts’ are what you know in terms of vocabulary and arithmetic, and that matters a lot. Your vocabulary is how you speak to people, how you read, so I think there’s a lot of smarts in crystallized knowledge. There’s another kind of smarts, rapid fluid thinking in novel situations, that’s never been taught in schools, and we thought it was possible that just by going to a school that’s very good at teaching you crystallized knowledge, you would pick up, through a lot of study and teacher support, fluid cognitive skills. And what this study taught us is that even schools that do an impressive job of enhancing their students’ scores on standardized tests, on crystallized abilities, don’t seem to move those fluid cognitive skills.
So there will surely be a response to this study along the lines of, ‘See, you’re making us teach to these tests, and they’re not what the kids need,’ implying that, even more so, we’re not giving kids what they need most. What’s your response to that?
Our response to that is that these tests have been shown to relate to performance, not only on the tests themselves, but they transfer to other things, like SATs or advanced placement tests; they predict the probability of going to college and completing college. And in Europe, they’ve been shown to correlate with adults’ occupational status and income. So there’s lots of evidence that scoring well on these tests is an indicator of knowledge that’s valuable for one’s future.
Though what we hear a lot now from education gurus is that, more and more in this knowledge economy, fluid intelligence matters even more. It’s what you can do, not what you know, because knowledge is becoming a commodity…
Well, we think the more you can get of both, the better. We think it’s fantastic that schools can sometimes produce dramatic improvement in this crystallized knowledge. If you don’t know the content of what you’re working on, it’s hard to be abstractly creative. But this other kind of skill is something we haven’t really thought much about in terms of curriculum, for students in any kind of school; only in the last decade has there been some evidence that it’s malleable and educable. I think this study spurs us to seek further understanding of how we can apply these kinds of programs in schools, so that children can succeed on every dimension.
You said there are good indications from various studies that crystallized knowledge matters. Do we have strong indications that fluid intelligence matters in how you do in life?
Yes, we do now. In almost all cases, those two things tend to travel together. Individuals who score well on one tend to score well on the other, because they come from supportive environments and good schools. So we know that both are correlated with all kinds of success in academics and life outcomes, but they tend to travel together.
The sole exception is aging. As you go from 20 to 80, you have a decade-by-decade decrement in fluid skills, even as you sustain your crystallized skills. So what we think is that when we ask schools to dramatically improve their students’ academic performance, some schools are able to succeed at that: they’re producing a student who has grown in crystallized knowledge and abilities he or she did not have before, but without a commensurate gain in these fluid skills. And we don’t even know if that matters. We don’t know whether those who gained the crystallized skills will simply go on to do well; in fact, there’s evidence that on average they will. But what the consequence might be of not having the commensurate fluid skills, we just don’t know yet.
So does any clear policy recommendation stem from these findings?
If there were a well-known curriculum for schools to enhance fluid skills, I think this study would encourage schools to adopt it. At the moment, that’s still a research topic rather than a certainty. There’s much more encouraging news than we had a decade ago that some programs appear to raise fluid skills in children and adolescents, but there’s not yet a consensus about a program that is consistently effective for children. And so I think this makes it a higher priority to look among those programs, figure out which ones are effective and scalable for children, and which could be added to the curriculum to make children strong in every dimension they could be strong in.
Why did you undertake this study?
We undertook the study because we were very interested in what sorts of things were not measured by standardized tests. Our original focus was on other aspects of students’ character, such as motivation or persistence, and we’re very interested in that as well, but we thought we would also measure some of these cognitive abilities that are rarely measured in relation to MCAS or other standardized scores. And that’s where we discovered this dissociation: some schools can raise state test scores without producing a commensurate rise in fluid cognitive skills.
I gather you used well-established neuropsychological tools for measuring these things?
Yes, these measures of fluid cognitive skills have been used for decades by cognitive psychologists who try to understand individual differences in cognitive abilities. They have been widely studied and widely related to positive outcomes in academics and other life measures. But again, we don’t know what it means when a school can successfully increase one set of abilities without, by current measures, increasing the other. On the one hand, it’s a notable achievement: helping students do better on crystallized knowledge, or state tests, than they would have been expected to do without the intervention of that school.
Was there a difference among schools in terms of whatever gains or losses on fluid skills there were?
We didn’t see any, with the exception of the exam schools, which select students on the basis of test scores, so those students enter with high scores. But aside from that, whereas a school accounted for approximately one-third of the variation in state test scores, it accounted for very nearly zero of the variation on these fluid cognitive skill measures.
So ultimately, given the very contentious atmosphere around standardized testing, is there anything you’d like to emphatically say this study does not show?
Two things we think the study does not show. First, it does not show that there’s any problem with standardized testing; we think there are lots of issues concerning the strengths and limitations of such testing, but our results don’t speak to that. And second, it does not show anything about, for example, charter versus district schools; although Boston charter schools have a particularly impressive record of raising test scores, we found the same kinds of results across all kinds of schools.
And one last personal question: If I want to improve my own fluid cognitive skills, is there anything I can do?
There are a number of programs that claim they can help individuals do that, but the research findings are still very much back and forth at the moment. For older adults, it turns out that the best evidence is still for regular exercise, even for such cognitive skills. As for training programs, on the computer or in classrooms, there’s at the moment simply uncertainty and conflicting results about which programs work consistently.