About the 4th Grade ELA Test Scores
THE SHARP DECLINE IN TOP SCORES
1 in 17 New York City fourth graders scored a 4 on this year’s ELA test – down from 1 in 7 in 2005. The drop was especially sharp among some of the city’s highest-performing districts and schools, including the following districts:
| District | % of 4’s in 2005 | % of 4’s in 2006 | % change (+/-) |
| --- | --- | --- | --- |
See a chart with the declines in selected NYC schools here (pdf).
Here’s another way of looking at the decline. In 2005, on the city’s ELA test for 3rd graders, 13,047 students scored a 4. In 2006, on the state’s ELA test for 4th graders, 3,793 students scored a 4. That’s a 71 percent decline in top scores among the same cohort of students from one year to the next. Of the city 3rd graders who were told they were exceeding state standards in 2005, over 9,000 of them, or 71 percent, were told that they were only “meeting” state standards as 4th graders in 2006.
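The cohort arithmetic above can be checked directly from the two figures cited (13,047 fours in 2005; 3,793 fours in 2006). A quick sketch:

```python
# Same cohort of NYC students, scoring a 4 on the ELA:
fours_2005 = 13_047   # 3rd graders scoring a 4 on the city's 2005 test
fours_2006 = 3_793    # same students as 4th graders on the state's 2006 test

lost_fours = fours_2005 - fours_2006
pct_decline = lost_fours / fours_2005 * 100

print(lost_fours)          # 9,254 students no longer "exceeding" standards
print(round(pct_decline))  # 71 (percent decline in top scores)
```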
THE SCORES OVERALL
Link to the 4th grade ELA scores and data for 2005 and 2006 here.
Here you can see how NYC Schools Chancellor Joel Klein and NY State Education Commissioner Richard Mills crunch and spin the numbers:
Chancellor Klein’s press release on the scores
Chancellor Klein’s PowerPoint presentation
Commissioner Mills’ press release on the scores
Commissioner Mills’ PowerPoint presentation
As you can see, neither Klein nor Mills mentioned the sharp decline in 4’s. Perhaps that was because focusing on a decline in top scores would have embarrassed both officials – Klein, because the decline would seem to signal a sharp decline in school performance, and Mills, because the decline might bring scrutiny to a deeply flawed test.
Just as likely, Klein and Mills avoided analysis of the 4’s because both know the limitations of the state ELA test – specifically that it’s not designed to draw fine distinctions between students, especially students performing at or above grade level. As one state Education official told us, “It’s really about the ones and twos.”
Both officials focused on 3’s and 4’s combined – the students who “passed” the test. Here, Chancellor Klein found good news for the NYC schools: although the percentage of NYC students scoring a 3 or 4 declined from 2005, the city actually gained ground versus the state’s other cities. This was a “harder” test than last year’s, Klein explained, and the important thing was that NYC’s decline was smaller than the declines elsewhere.
“HARD” TEST – OR BAD TEST
There are two ways of making a “harder” test: make the content harder, or change the grading scale.
On the writing test, at least, it would be a stretch to describe the content of this year’s 4th grade ELA test as “hard.” Deeply flawed, nonsensical, and bewildering – yes. Hard – no.
Commissioner Mills and the Education Department did change the grading scale this year, in deciding the “cut scores” – where the lines are drawn between ones, twos, threes and fours. Every year’s cuts are different, so that a “scaled score” of 600 could be a 1 one year and a 2 the next.
This year’s range for a 4 is much narrower than last year’s. A 4 in 2005 was a “scaled score” between 692 and 800. In 2006, a 4 is between 716 and 775. So a large number of scores that would have been 4’s last year are 3’s this year.
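The effect of the narrower band can be seen by checking a sample scaled score against the two years’ level-4 ranges cited above. (Only the level-4 cut scores appear in this piece; the bands for levels 1 through 3 are not given here, so this sketch checks level 4 only.)

```python
# Level-4 scaled-score bands cited above, by year.
LEVEL_4_BAND = {
    2005: (692, 800),
    2006: (716, 775),
}

def is_level_4(scaled_score: int, year: int) -> bool:
    """Return True if the scaled score falls in that year's level-4 band."""
    low, high = LEVEL_4_BAND[year]
    return low <= scaled_score <= high

# A scaled score of 700 was a 4 under the 2005 cuts,
# but falls below the level-4 cutoff under the 2006 cuts.
print(is_level_4(700, 2005))  # True
print(is_level_4(700, 2006))  # False
```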
There is a fair amount of, shall we say, mystery in this process of drawing the “cut scores.” We imagine that there is room for other considerations here besides calibrating the scores with the state’s curricular standards.
Yes, the grading on this year’s test was “hard” – thanks to the state’s decision about the grading scale. That may explain, in large part, the decline in 4’s.
But “cut scores” (ones through fours) aside, we suspect this year’s scaled scores on the ELA still give a distorted picture of student performance and school performance.
COMMISSIONER MILLS’ NUMBERS NOT WORTH CRUNCHING
The fatal flaws in the two main essay questions taint the results of the 4th grade ELA test. Neither question could possibly have elicited the best writing from students. Nor could the responses to either question have been scored consistently and objectively by thousands of widely varied graders across New York State.
Tens of millions of public dollars are spent each year in the development, administration and scoring of these tests, and millions more in crunching the numbers and spinning them into a story education officials tell us about school performance. More importantly, the schools’ curriculum is re-shaped to prepare students for these tests, and countless hours of classroom and homework time are devoted to that preparation. Even respectable curriculum developers examine these tests and design curricular materials accordingly; Teachers’ College now offers a unit called “Test-Taking as a Genre.”
With tests this flawed, all of these resources are wasted, and the data they produce paint a distorted picture of student and school performance. Are we really to believe that 60% fewer fourth graders in NYC are exceeding state ELA standards than last year? Are we to believe that 71% of the cohort deemed to be “exceeding” state standards in 2005 are now no longer exceeding standards but just “meeting” them? Should we search for the cause of this pedagogical collapse, and insist on the dismissal of those responsible? Or is there something wrong with this picture?
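The “60% fewer” figure follows from the rates cited at the top of this piece – roughly 1 in 7 fourth graders scoring a 4 in 2005 versus 1 in 17 in 2006:

```python
# Relative drop in the rate of 4's, from the "1 in 7" and "1 in 17" figures.
rate_2005 = 1 / 7    # share of NYC 4th graders scoring a 4 in 2005
rate_2006 = 1 / 17   # share scoring a 4 in 2006

relative_drop = (rate_2005 - rate_2006) / rate_2005 * 100
print(round(relative_drop))  # 59 - i.e., roughly 60% fewer students at level 4
```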
Commissioner Mills may continue to defend this year’s “hard” test. He’ll maintain that the drop in top scores is really a return to reality after a couple of years of increasingly “easy” tests (which, for all we know, also offer a distorted picture of student and school performance). He may even believe it.
(There’s one ironic note here: Mills and Klein say it was a hard test. Other officials of the state Education Department, when confronted with the nonsensical “Brownie the Cow” question, insisted that the question was “easy.”)
But we’ve seen the essay questions. And common sense tells us that at least part of the explanation for the drop in top scores lies in the old adage: ask a dumb question, and you almost never get an excellent answer (1 in 17 sounds about right). And, just as importantly: ask a dumb question, and the graders will have a hard time telling the difference between meeting standards (3) and exceeding standards (4).