I sometimes cite education studies and statistics in my posts, but I try to be careful to write about the "conclusions" drawn from the material rather than saying the study "shows" or "proves" something. Any study is only as good as the quality of its data and the way that data is sliced and diced. When it comes to studies concerning education, that's a big problem. Skepticism is always advisable.
Short side trip: When I was taking a graduate school statistics course, our assignment was to go to the library, find one good and one bad statistical study, and analyze them for their strengths and weaknesses. I asked the prof where I would be most likely to find bad statistical analysis, and he said, "Go to the education journals. Most of those studies are pretty bad." He wasn't criticizing the researchers so much as pointing out that it's almost impossible to create strong control groups or comparisons, because the variation among students and teachers is so large. No two students, groups of students, or teachers are identical, so any conclusions researchers draw from the data are open to question.
Case in point: the rise in state test scores, especially among Hispanic students, starting in 2007. Does that mean Arizona began doing a better job educating its Hispanic population?
Two researchers at Arizona State University's Educational Policy Analysis and Evaluation program have taken a look at that rise in scores, which began a few years before SB 1070 passed in 2010. They ask the question: is the rise an indication that student achievement went up, or does it reflect fewer undocumented students in our schools, which would mean fewer Hispanic students whose English language skills are low? Their conclusion: SB 1070 and the 2007 law requiring businesses to use E-Verify to check the legal status of their employees resulted in a drop in undocumented students, and that was the main driver behind the increase in state test scores among Hispanic students.
The data is fairly convincing. When the researchers looked at Arizona schools where the student population was more than 75 percent Hispanic, they found a far more dramatic rise in student scores starting in 2007 than they found in schools with smaller Hispanic student populations. They also found that the 75 percent-plus schools saw a larger percentage drop in Hispanic enrollment than the other schools did.
Like all studies, this one should be viewed with a healthy dose of skepticism, but its conclusions make strong intuitive sense. If the students who are likely to score the lowest on a test are eliminated from the population, test scores will rise even if the remaining students don't improve their scores. Because E-Verify and SB 1070, along with contributing economic factors, led to a net loss of undocumented Hispanics in Arizona, you would expect student scores, especially those of Hispanic students, to rise.
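The arithmetic behind that composition effect is simple enough to sketch. Here's a minimal illustration in Python, using made-up numbers (the scores below are hypothetical, not from the study):

```python
# Hypothetical test scores for a school's student population.
scores = [40, 50, 60, 70, 80, 90]
print(sum(scores) / len(scores))  # 65.0

# Remove the lowest scorers -- note that no remaining student improves.
remaining = [s for s in scores if s >= 55]
print(sum(remaining) / len(remaining))  # 75.0
```

The average jumps ten points even though every individual score is unchanged; that's the pattern the researchers argue is behind the post-2007 gains.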
Here's why this is important. The Department of Education touted the rise in scores as evidence that Arizona's students were improving, a feather in the cap for the Department and for Tom Horne and John Huppenthal, the Superintendents of Public Instruction when the increases began. But if it was just a case of a changing student population, not an improvement in individual student scores, the superintendents didn't earn any bragging rights, and Arizona has little reason to conclude that it's doing a better job of educating its students in spite of funding cuts.