We've been given a first look at the new AzMERIT results
from the tests students took in spring. They haven't been broken down in granular detail, but we know how students scored at each grade level in math and language arts. The numbers look reasonably good. Basically, they're a little better than they were the year before. No question, up is better than down, but does that mean Arizona students have improved in math and language arts? It's not an easy question to answer. Let me throw out a few ideas without trying to arrive at any solid conclusions.
This is the third year the state has given students the AzMERIT test as a replacement for AIMS, and that means it's the second year teachers have been able to teach to the new test. The first year, teachers didn't have much of an idea what the test was like, so when it came to test prep, they were like generals fighting the last war. They had been teaching to the AIMS test for years, and they didn't know how to change their strategies to help their students with AzMERIT. The second year, they knew more about how the new test was structured and what kinds of questions the students would be asked, so they made an effort to tailor their test prep to the task. The third year, with the previous year's experience under their belts, they refined their test prep techniques a bit more. Which raises the question: do this year's higher scores reflect an improvement in students' achievement or in their teachers' test prep proficiency?
Whenever students are taught how to take a specific test, the results are thrown into doubt. Are students learning the concepts behind the test questions, or have they simply become more adept at answering the questions? Our obsession with yearly results on high stakes tests means the results people value so highly don't mean much. Worse, the tests distort students' educational experience by making teachers focus on narrow sections of the curriculum at the expense of equally important areas that aren't on the test. You can't blame teachers for spending an inordinate amount of time on what will be tested, even when they know their overemphasis on the tested material does their students a disservice. Their individual evaluations and the state grades their schools receive hang in the balance. The scores are too damn important to let giving their students a comprehensive education get in the way.
If we want to monitor students to get a sense of how they're doing on their basic math and language skills, a better way is to test student achievement every few years in selected grades — and separate the scores from funding and school grades.
Actually, we have a test like that: the NAEP (National Assessment of Educational Progress), which has been given to a sampling of 4th and 8th grade students across the country every few years since the 1970s. If there's one test educators tend to agree has a reasonable amount of validity, the NAEP is it. Interestingly, in the last round of NAEP tests in 2015, Arizona students improved everywhere except in 4th grade math, where the score dipped slightly, while the overall national scores went down. If Arizona wants bragging rights for improved student achievement, the NAEP results are the place to go.
But that brings up another question about our improved numbers on both the NAEP and AzMERIT tests. Do the students taking the recent tests represent the same socioeconomic groupings as those who took previous tests? If Arizona's demographics have shifted over the past few years, the results could shift without a significant change in the achievement level of individual students.
Look, for instance, at Arizona's Mexican American population. The influx of immigrants has slowed in recent years. More Mexican Americans have left the state than have moved here. As a result, more of Arizona's Mexican American students were born in the U.S., and more of those who came from elsewhere have been in the country for a number of years. That means the current students are likely to be more proficient in English and to have spent more time in U.S. schools than the population a few years ago. You would expect them to score higher on the state tests than a group with overall lower English skills and fewer years of U.S. education.

A shift in the student population in other socioeconomic groups over the past few years could also result in changed scores without a significant change in the level of individual student achievement. It would take an extensive, detailed analysis of the changes in Arizona's student population, then a similarly detailed study of how various student populations performed on the tests, to separate changing demographics from increasing test scores. And after all that work, the conclusions would be tentative at best.
Evaluating the AzMERIT results, or any high stakes standardized test results, is tricky business. The most important thing we can learn from high stakes testing is that we don't learn much from high stakes testing. Oh, and we learn that higher family income results in higher test scores, but we already knew that. Now, if Arizona test scores keep climbing on future NAEP tests, it may be time to say we're doing a better job educating students in the areas of math and language arts. If the state's demographics don't change, that is.