Standardized testing in America apparently has gotten so big that it's now the subject of a major motion picture.
The SAT is the granddaddy of standardized tests -- it was first given in 1926 -- but countless such tests have followed. And their significance is growing, thanks to the federal No Child Left Behind Act, which some critics have nicknamed No Child Left Untested.
It's because of the No Child Left Behind Act that Pennsylvania students will have to take state tests in math and reading every year from third through eighth grade -- and once again in high school -- and three tests in science (in elementary, middle and high school).
And some also will take the National Assessment of Educational Progress in math and reading.
But while the tests are called standardized, what they say about how students are doing is anything but.
"None of this is science. All of this is judgment,'' said Lawrence Feinberg, assistant director of the National Assessment Governing Board, which sets NAEP policy.
Take the results of the Pennsylvania System of School Assessment -- known as the PSSA -- in math and reading in grades 5, 8 and 11.
After a review of the fifth-grade results, Murry Gerber, chairman of the education goal committee of the Allegheny Conference on Community Development, announced the region was "failing" to provide "essential skills" to students.
"Citizens may be surprised to see that their schools are not doing as well as they thought," Gerber told a news conference last fall.
But a Pennsylvania School Reform Network report put a very different spin on the same results:
"By 2003 standards, Pennsylvania students and the teachers and schools that educate them did quite well -- better than most people think."
If those two comments confuse you, consider the plight of schools that find themselves on lists that praise them and ones that condemn them -- at the same time.
About a year ago, Roosevelt Elementary School in Carrick was singled out by Standard & Poor's School Evaluation Services as one of the schools statewide making the biggest gains on the PSSA.
Then, this fall, the state put Roosevelt on a warning list. That's because students met the state's math and reading goals but missed its attendance standard.
After that, the Allegheny Conference on Community Development gave Roosevelt a grade of E-minus in both math and reading.
Praise, a warning and condemnation. What's the public to think?
It's a classic case of the half-full, half-empty glass. It depends on who's measuring, how and why.
Consider the results of NAEP, which many experts believe is the best national benchmark of student achievement.
Thanks to No Child Left Behind, NAEP results for states are being compared to the results on state standardized tests, making some states look as if their tests are too easy or their students are doing poorly.
The tests have four achievement levels: advanced, proficient, basic and below basic.
Part of the problem is the varied definitions of "proficient." NAEP says that 30 percent of Pennsylvania's eighth-graders were proficient or better in math. But the state's own PSSA rated 51 percent of eighth-graders proficient.
In reading, NAEP said 32 percent were proficient or advanced; state tests showed 63 percent scored proficient or better.
So, how many students really are proficient?
"I get a headache when I think about it,'' said Mike Cohen, president of Achieve Inc., a Washington, D.C.-based nonprofit group that advises states on tests.
Yes, he said, it's easier to score proficient on many state tests -- including Pennsylvania's -- than on the NAEP. Test quality varies, and the tests are aligned to different standards, Cohen said.
Nowhere is the gap more obvious than in Texas, where 89 percent of eighth-graders are proficient or advanced on the state reading test -- but only 26 percent on NAEP.
"Unfortunately, those labels are so variable from state to state that it's hard to make much sense of them,'' said Robert Linn, a University of Colorado professor and co-director of the Center for Research on Evaluation, Standards and Student Testing.
Even within the same test, the standard of proficiency varies widely from grade to grade, according to H.D. Hoover, a University of Iowa professor and director of the Iowa Basic Skills Testing Program, who studied NAEP and three major standardized tests -- the Iowa, Stanford and California achievement tests.
He gave this example from one of the tests:
A second-grader scores in the 60th percentile in reading -- better than 60 percent of the kids who took the test -- and is deemed proficient. In third grade, the same student ranks in the 80th percentile -- even higher -- but is labeled not proficient.
What happened? Different committees helped set the cutoff scores for "proficient" in those two grades, he said.
Hoover thinks that's wrong. "The kid had a great year. If you read better than 80 percent of the kids, you're a good reader," he said.
"In virtually every case, [proficiency] has been set in a way that is very misleading about kids' learning levels," said Hoover. "If our kids can read like the average third- or fourth-grader when they're in third or fourth grade, to say they aren't proficient is ridiculous."
Psychometrician Shula Nedley, former testing director for Pittsburgh Public Schools, doesn't agree that average students are by definition "proficient."
When teachers are interviewed about which of their students are proficient, Nedley said, "What you find out is to be an average reader in this country as measured by XYZ test, you don't have to be a proficient reader."
Test makers also face political pressures. Nedley said publishers of commercial standardized tests purchased by districts know that "if the tests seem very, very hard, they're not going to sell them."
Those who set achievement levels for state tests face different pressures than those at NAEP because the state tests have repercussions for individual students and NAEP does not.
Gerald Bracey, associate education professor at George Mason University, considers NAEP achievement levels "outrageously difficult."
"They don't accord with any other data, most notably international comparisons," Bracey said.
Cohen said his organization soon will release a report on how the tests compare to the skills high school graduates need to do well in college or on the job. He said many of the questions on state tests don't measure that knowledge.
Experts said that no single test will provide a full picture of how a student is doing and what the student needs.
"The most important thing is the parents and teachers have to be informed what the assessments can and cannot do,'' said Suzanne Lane, a University of Pittsburgh education professor. "You have to think of not just one assessment. You have to think about an assessment system.''