Everything You Need to Know About Education ‘Reform’: StudentsFirst Issues State Ratings Unrelated to Actual Learning

So I’m working on some other education-related posts (recently read some stuff that pissed me off and want to get it right), but it’s worth noting that Michelle Rhee’s education faith-tank StudentsFirst has issued ratings of various states–ratings that have nothing to do with actual educational outcomes (boldface mine):

In a report issued Monday, StudentsFirst ranks states based on how closely they follow the group’s platform, looking at policies related not only to tenure and evaluations but also to pensions and the governance of school districts. The group uses the classic academic grading system, awarding states A to F ratings.

With no states receiving an A, two states receiving B-minuses and 11 states branded with an F, StudentsFirst would seem to be building a reputation as a harsh grader.

Ms. Rhee said that the relatively weak showing reflected how recently statehouses had begun to address issues like tenure and performance evaluations.

So who are the winners? You’ll never guess:

The two highest-ranking states, Florida and Louisiana, received B-minus ratings. The states that were given F’s included Alabama, California, Iowa and New Hampshire. New Jersey and New York received D grades, and Connecticut a D-plus. The ratings, which focused purely on state laws and policies, did not take into account student test scores.

You don’t say? We’ll get to test scores in a moment, but first, let’s look at StudentsFirst’s grades:

[Image: table of StudentsFirst's state-by-state grades]

So Massachusetts, which has one of the best education systems in the world, ranks 14th. One of Massachusetts' supposed deficiencies is that "[t]he state should create more high-quality public charter options for parents by strengthening accountability rules for charter schools and improving their access to facilities." Never mind that a reevaluation of Boston charter schools showed no difference between regular public schools and charter schools (pdf), though the charter schools did have lower retention rates. Yes, retention rates so low they had to be rectified by state law. Minnesota, which is also very competitive, ranks 26th with a D. New Hampshire, which also does very well according to the NAEP, ranks…41st with an F. North Dakota, another high-performing state according to the NAEP, finishes dead last.

So we have systems that work very well (or are even world-class), but according to StudentsFirst, they are mediocre since they are ideologically incorrect. This is fucking moronic.

What makes this exercise in neo-liberal corporate agitprop all the more despicable is that any of the top-performing nations, such as Korea, Taiwan, Japan, and Finland (don't forget Poland!), would fail miserably according to StudentsFirst's political criteria.

That, and their U.S. gradings, tell you all you need to know about education ‘reform’ and its so-called leadership.

This entry was posted in Conservatives, Education, Fucking Morons.

3 Responses to Everything You Need to Know About Education ‘Reform’: StudentsFirst Issues State Ratings Unrelated to Actual Learning

  1. JohnR says:

    This is a rather bizarre effort by Rhee’s group. The starting point is the assumption that education quality is negatively correlated with things like teacher tenure. Rhee ranks the 50 states plus DC by her assessment of their political effort to impose what she sees as positive changes in the education system. Accordingly, we can use her rankings as a rough estimate of what she sees as good and bad state education systems. The question then becomes, how do we measure the outcomes of these systems? By sheer luck, we happen to have national test scores, which should allow us to compare these same 51 regions on a standard scale. If Rhee is correct in her assumption(s), there should be some correlation between her rankings and the rankings of the same regions based on the observed test scores. I just did a quick check – I took the 2011 NAEP 8th grade reading scores, ranked the states and DC by the results, and checked the correlation. There was indeed a slight negative correlation (Rhee’s ranks tended slightly to reflect the opposite of the actual test scores), but at something like -0.2 it was nowhere near significant. If you plot Rhee’s rankings against the actual values, you get a nicely scattered scatter plot with that slight negative trend line. The bottom 10 NAEP-ranked regions include 3 of Rhee’s top 10 regions and 2 of her bottom 10 regions. The top 10 NAEP regions include 3 of Rhee’s bottom 10 regions and 1 of her top 10 (Colorado ranks about the same in both sets, oddly enough). She can certainly try to obscure the point with a hand-waving smokescreen about how states are only beginning to put her favored policies into place, but that raises the question of how some of her “worst” states have some of the best scores (and vice versa).
    All she has actually done with the exercise that spawned all this is to point out that (unless there is a lot of data that looks very different from the little set I’ve looked at here) her assumptions are highly questionable. Incidentally, looking at recent change in the results doesn’t do her any favors either: where the states rank and how much they’ve changed show the same sort of pattern (i.e., not much, but slightly opposite) when we compare her assessments and the NAEP 8th grade reading scores.
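    JohnR’s check above can be sketched in a few lines of code. This is a minimal illustration, not his actual analysis: the `policy_points` and `naep_scores` lists below are made-up toy numbers, not the real StudentsFirst grades or 2011 NAEP 8th-grade reading results, and the rank-correlation (Spearman) approach is one reasonable reading of “ranked the states and checked the correlation.”

    ```python
    def rank(values):
        """Rank values from 1 (highest) downward; ties get the average rank."""
        order = sorted(range(len(values)), key=lambda i: -values[i])
        ranks = [0.0] * len(values)
        i = 0
        while i < len(order):
            # Find the run of tied values starting at position i.
            j = i
            while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average rank for the tied run
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman(xs, ys):
        """Spearman rank correlation: Pearson correlation of the ranks."""
        rx, ry = rank(xs), rank(ys)
        n = len(xs)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        vx = sum((a - mx) ** 2 for a in rx)
        vy = sum((b - my) ** 2 for b in ry)
        return cov / (vx * vy) ** 0.5

    # Toy data: higher policy_points = better StudentsFirst grade.
    # A negative coefficient would mean the policy favorites score worse.
    policy_points = [90, 80, 70, 60, 50]
    naep_scores = [250, 266, 258, 270, 262]
    print(spearman(policy_points, naep_scores))  # -0.5 for this toy data
    ```

    With all 51 regions rather than five toy states, a coefficient of roughly -0.2 would, as JohnR notes, be nowhere near statistical significance.
    
    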

  2. Pingback: More StudentsFirst Insanity | Mike the Mad Biologist

  3. Pingback: Links 1/11/13 | Mike the Mad Biologist
