Teacher Evaluation As Practiced

While I’m very skeptical of teacher evaluation methods, since they are incredibly imprecise, it still blows my mind how stupidly implemented they are. Consider D.C.’s IMPACT system (boldface mine):

Under DCPS’s teacher evaluation system, called IMPACT, teachers in affluent Ward 3 get ratings that are significantly above those in lower-income Wards 7 and 8, according to a study based on data from 2010 to 2013. Another study shows that 41% of teachers in Ward 3 received IMPACT’s top rating of “highly effective” in 2011-12, as compared to only 9% in Ward 8.

DCPS bases IMPACT scores on a number of factors, including classroom observations and growth in students’ test scores for teachers of tested grades and subjects. Charter schools have their own methods of evaluating teachers….

It [IMPACT] also may explain why there are so many fewer highly rated teachers at high-poverty schools in the first place.

For one thing, part of the IMPACT score for some teachers depends on how much the teacher has increased her students’ test scores in a given year. But the tests are geared to a student’s grade level, and many students at high-needs schools are several grade levels behind.

If a 10th-grader comes into a teacher’s class at a 5th-grade level and the teacher succeeds in bringing the student’s skills up to a 6th- or 7th-grade level, the test isn’t geared to capture that improvement. Neither the teacher nor the school gets credit. And there’s virtually no way to bring a student up five grade levels in a single year.
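The floor effect described above can be sketched numerically. This is a toy model, not IMPACT's actual scoring formula; the scoring function and all numbers here are illustrative assumptions:

```python
# Hypothetical sketch of the floor effect: a test geared to 10th-grade
# material cannot register growth that happens entirely below grade level.
# This is illustrative only, not IMPACT's real scoring method.

def grade_level_test_score(student_skill, test_grade=10):
    """Score on a test geared to `test_grade`: the student gets credit
    only for skill at or above the tested grade level (floored at 0,
    capped at one grade above)."""
    return max(0.0, min(student_skill - test_grade, 1.0))

# A student enters at a 5th-grade level; the teacher raises them
# to a 7th-grade level -- two full grade levels of real growth.
before = grade_level_test_score(5)   # still below the floor
after = grade_level_test_score(7)    # still below the floor

measured_growth = after - before     # 0.0 -- the test sees nothing
actual_growth = 7 - 5                # 2 grade levels

print(f"Actual growth: {actual_growth} grade levels")
print(f"Growth the test can see: {measured_growth}")
```

Under this toy scoring rule, two grade levels of genuine improvement register as zero measured growth, so neither the teacher nor the school gets any credit for it.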

“No teacher wants to go into a school where you can only be told you’ve failed,” says David Tansey, a math teacher at Dunbar High School.

Teachers at high-needs schools, where behavior problems are more common, are also more likely to get low ratings on the classroom observation component of their IMPACT scores. Tansey recalls getting a low rating from one observer because a student cursed in class.

Tansey pointed out that the student had corrected himself, something that reflected Tansey’s efforts and was a vast improvement over the student’s behavior at the beginning of the year. But, he says, that made no difference to the observer.

Essentially, there’s no incentive whatsoever to teach poorly performing students (and poor performance is strongly correlated with being poor).

It boggles the mind that a bunch of people at DCPS presumably strapped on their thinkin’ caps, had lots of meetings, wrote reports, and then devised a system… that gives powerful incentives to avoid the neediest students. Leaving aside the technical issues (which aren’t trivial), this is the reality of education ‘reform’, and it’s not helping at all.
