Value-Added Testing and “Weapons of Math Destruction”

I’ve written before about the technical problems with using value-added testing to assess teacher performance. Note that technical is not a synonym for trivial: these problems are fundamental. The recently settled Chicago teachers strike has made this issue important once again. What I’ve found is that once you explain how these methods work to people who are statistically savvy and know a thing or two about linear regression, they can’t believe anyone would actually use them (But anything is possible in America! Or something).

I came across this post by mathbabe that gets at why ‘reformers’ like these methods, despite the widespread opinion that they’re pretty awful (boldface mine):

I agree that the value-added model (VAM) is deeply flawed; I’ve blogged about it multiple times, for example here.

The way I see it, VAM is a prime example of the way that mathematics is used as a weapon against normal people – in this case, teachers, principals, and schools. If you don’t see my logic, ask yourself this:

Why would an overly complex, unproven, and very crappy model be so protected by politicians?

There’s really only one reason: it serves a political function, not a mathematical one. And that political function is to maintain control over the union via a magical box that nobody completely understands (including the politicians, but it serves their purposes in spite of this) and therefore nobody can argue against.

This might sound ridiculous until you see examples like this one from the Washington Post (h/t Chris Wiggins), in which a devoted and beloved math teacher named Ashley received a ludicrously low VAM score…

He [Ashley’s principal] wasn’t supposed to assume he was smart enough to understand the math behind the model. He wasn’t supposed to realize that these so-called “guides to explain the scores” actually represent the smoke being blown into the eyes of educators for the purposes of dismembering what’s left of the power of teachers’ unions in this country.

If he were better behaved, he would have bowed to the authority of the inscrutable, i.e. mathematics, and assumed that his prize math teacher must have had flaws he, as her principal, just hadn’t seen before.

And then mathbabe uses a great term (that I plan on stealing), “Weapons of Math Destruction”:

Politicians have created a WMD (Weapon of Math Destruction) in VAM; it’s the equivalent of owning an Uzi factory when you’re fighting a war against people with pointy sticks….

If you don’t know what I mean by WMD, let me help out: one way to spot a WMD is to look at the name versus the underlying model and take note of discrepancies. VAM is a great example of this:

• The name “Value-Added Model” makes us think we might learn how much a teacher brings to the class above and beyond, say, rote memorization.
• In fact, if you look carefully you will see that the model is measuring exactly that: teaching to the test, but with error bars so enormous that the noise almost completely obliterates any “teaching to the test” signal.

Nobody wants crappy teachers in the system, but vilifying well-meaning and hard-working professionals and subjecting them to random but high-stakes testing is not the solution; it’s pure, old-fashioned scapegoating.

Again, it’s worth repeating: anyone who has had to use these methods professionally (for the record, I have) usually ranges from extreme caution to outright opposition to using them to assess teacher performance.

Of course, education ‘reformers’ and their pundit class allies are blissfully ignorant of such difficult details….

This entry was posted in Education, Mathematics, Statistics.

1 Response to Value-Added Testing and “Weapons of Math Destruction”

  1. joemac53 says:

The last bit of crazy bullshit that I stayed awake for in a faculty meeting (right before I retired) was the VAM that was going to be tacked onto MCAS scores. It was up to the dept heads to explain this to their faculty. I don’t know what the English head had to say, but my math boss was at a loss. The state even warned us that two students who had identical scores on 4th and 8th grade MCAS tests might well be on “different trajectories” and therefore have different VAM values. My boss (and I) instantly thought of parabolas when we heard “trajectories”. Of course, with only two data points, there would be an infinity of trajectories available.
    When I went to the Dept of Ed website for an explanation I found that the model depended on multiple factors over which teachers had no control, or factors that they would have no way of knowing unless they invaded every student’s privacy.
I don’t know if they are still selling this crap, but I’m glad I retired.
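joemac53’s two-data-point objection checks out algebraically: fixing a parabola’s values at two grades still leaves one free parameter, so infinitely many parabolas (let alone arbitrary “trajectories”) pass through both scores. A quick sketch, with made-up scores rather than real MCAS data:

```python
# Two observed (grade, score) points -- hypothetical numbers, not real MCAS scores.
p1 = (4, 230.0)   # 4th-grade score
p2 = (8, 240.0)   # 8th-grade score

def parabola_through(p1, p2, a):
    """Return (a, b, c) of y = a*x**2 + b*x + c passing through p1 and p2
    for ANY chosen curvature a -- the curvature is a free parameter."""
    (x1, y1), (x2, y2) = p1, p2
    # Subtracting the two point equations eliminates c, giving b; then back-solve c.
    b = (y2 - y1 - a * (x2**2 - x1**2)) / (x2 - x1)
    c = y1 - a * x1**2 - b * x1
    return a, b, c

# Every choice of curvature yields a different parabola that fits both scores exactly.
for a in (-2.0, 0.0, 1.0, 5.0):
    a_, b, c = parabola_through(p1, p2, a)
    y4 = a_ * 4**2 + b * 4 + c
    y8 = a_ * 8**2 + b * 8 + c
    print(f"a={a:5.1f}: fits 4th grade? {abs(y4 - 230.0) < 1e-9}, "
          f"8th grade? {abs(y8 - 240.0) < 1e-9}")
```

Every curvature fits both points exactly, so the two scores alone determine nothing about the “trajectory” between or beyond them.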

Comments are closed.