One of the flaws (a bad frame, if you must) in the communication debate flare-ups that happen every so often is that the debate, such as it is, is phrased as "why can't scientists communicate?" This is an imprecise and incorrect way of stating the problem. It should be stated as "how do scientists convincingly explain that misinformation is actually misinformation?" Once we phrase it that way, we realize that the problem isn't that we need to 'stop being such a scientist', because no one has a particularly good strategy for dealing with misinformation.
To use an example that (maybe) won't cause too much unnecessary flaming, here's some coverage of a recent review of the KIPP charter schools:
At about half the KIPP schools, the study found that the gains in math for students after three years in the schools were equivalent to 1.2 years of extra instruction, and 0.9 years of additional instruction in reading…
On average, the report says, KIPP middle schools have students who are more likely to be living in poverty and are more likely to be black or Hispanic than are students from the schools around them. Back when they were in 4th grade, the study also found, a majority of the KIPP middle school students had lower test scores on average than did students in their local school districts.
Sounds great! Is our children learning? Yes! But there's a problem, and it is very technical but nonetheless critical:
However, an initial analysis of the report by Professor Gary Miron of Western Michigan University concludes that this initial study report misrepresents the attrition data. According to Miron, “While it may be true that attrition rates for KIPP schools and surrounding districts are similar, there is a big difference: KIPP does not generally fill empty places with the weaker students who are moving from school to school. Traditional public schools must receive all students who wish to attend, so the lower-performing students leaving KIPP schools receive a place in those schools.”
In contrast, Miron explains, “The lower performing, transient students coming from traditional public schools are not given a place in KIPP, since those schools generally only take students in during the initial intake grade, whether this be 5th or 6th grade.”
The KIPP study's description of attrition considers only half the equation when comparing KIPP schools to matched traditional public schools. The researchers looked at the attrition rates, which they found to be similar in terms of the number of students departing from each school. But they never considered the receiving, or intake, rate. Even though the researchers agree that mobile students are lower performing, they do not take into account the reality that KIPP schools generally do not receive these students.
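Miron's point can be illustrated with a toy simulation. This is my own sketch, not anything from the report: the score distributions, attrition rate, and the assumption that transient students score lower on average are all made up for illustration. Two schools lose the same fraction of students each year (identical attrition rates), but only one of them backfills the empty seats with lower-performing transfer students:

```python
import random

random.seed(0)


def simulate(backfills, years=3, n=100, attrition=0.10):
    """Toy cohort model: each year, the lowest-scoring `attrition`
    fraction of students leaves. If `backfills` is True, departures
    are replaced from a lower-performing transient pool (an assumed
    distribution, per the claim that mobile students score lower)."""
    # Both schools start with cohorts drawn from the same distribution.
    cohort = [random.gauss(50, 10) for _ in range(n)]
    for _ in range(years):
        cohort.sort()
        n_leave = int(len(cohort) * attrition)
        cohort = cohort[n_leave:]  # lowest performers depart
        if backfills:
            # Replacements drawn from a lower-scoring pool.
            cohort += [random.gauss(40, 10) for _ in range(n_leave)]
    return sum(cohort) / len(cohort)


kipp_like = simulate(backfills=False)    # departures not replaced
public_like = simulate(backfills=True)   # departures replaced with transients
print(f"no backfill: {kipp_like:.1f}, backfill: {public_like:.1f}")
```

Both schools report the same attrition rate, yet the non-backfilling school's average score drifts upward (it sheds its lowest performers and replaces them with no one), while the backfilling school's average drifts downward. Measuring departures alone hides the asymmetry entirely.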
This is highly technical, but really critical*. It’s pretty boring, actually. While a few people will have the time and inclination to follow this, most won’t. So what’s the effective pushback? This is really outside the boundaries of framing, since this truly is a matter of analysis. Once we reach the ‘dueling experts’ stage, all that’s left are ad hominem attacks: my expert is better than yours.
*KIPP schools do perform better, but if they had to retain poorly performing students, it’s not clear how KIPP schools would hold up.
You chose a good example of the challenge of communicating a complex issue: how do we measure a good education, when we can barely define it?
I didn't read the Mathematica report on KIPP, just the Education Week coverage of it. I thought that the comments on the EdWeek piece were quite thoughtful, considering the contentious topic.
Another point is that some characteristics are harder to measure than family income or achievement data from the previous school. The short version is that enrolling in a KIPP school requires a parent or guardian to find out about the school (or at least take an interest in and respond to outreach), step up to work the application process, and go through an intake counseling/interview that includes signed commitments to this-n-that. In addition, incoming KIPP students must take a test (KIPP says acceptance is based on this test), which requires the student to be willing to sit for the test. In other words, incoming KIPP families self-select for being motivated to put effort into their child's education and high-functioning enough to get through the enrollment process, and the incoming student must be compliant enough to sign agreements and take the test. This process selects out families that are too dysfunctional or unmotivated, as well as oppositional/defiant students: the students and families who pose the greatest burden to public education.
Is there a way to quantify the degree of interest and concern a family has for its children’s education, and the amount of effort the family is willing to devote? Is there a way to quantify a student’s willingness to comply with requirements and commit to a set of expectations? How does science handle this?
Correcting an omitted word: KIPP says that acceptance is NOT based on the results of the test that it requires incoming students to take. The test is to assess the student’s academic level. At grades above 5, students are placed in the grade they test into. KIPP says that incoming 5th graders are placed in grade 5 regardless of the results of that test.
Some further points about this testing, however.
— I’ve been told repeatedly that in the community, the test is believed to be an admission test — that is, some families may believe that students who don’t test well won’t get in. This would tend to deter not only non-compliant students who would refuse to cooperate with the testing, but also students and families who believe they wouldn’t have a chance of “passing” and being admitted.
— Students who are told that based on the test results they will be required to repeat 5th grade if they enroll at KIPP, despite having completed 5th grade, are quite likely to choose not to enroll at KIPP — especially less-compliant students and families. So once again, this process is likely to deter the lower-functioning and less-compliant.
As an unpaid blogger and public-school advocate, I've done research on KIPP attrition rates, and I also went undercover to apply to KIPP San Francisco Bay Academy for my then-7th-grader to learn about the process. So my information is partly based on what I learned from that experience.