I made my request on 19 February, splitting it into three parts to make it simpler.
The first part asked for the terms of reference, timelines, dates, correspondence and briefings on it, and expected publication date for the final report.
The second asked for any preliminary drafts of the final report, in case there were holdups in getting the final version.
And the last one asked for the final report.
Simple enough, right? Delays in the final report should have at least gotten me the drafts.
A phone call from the Ministry suggested a down-scoping that might hurry things along: they wanted to restrict the dates to after 1 March 2017 and to exclude correspondence. Smelling something fishy, I said "Now wait a minute. Sometimes Ministries seek a clarification so that they can restart the OIA clock. You're not doing anything like that, are you?" The Ministry assured me they weren't. I made clear that I wouldn't agree to any changed wording if it meant they were going to restart the clock, and then sent them a note agreeing to the clarification on the condition that it wouldn't change the dates. The correspondence would have been nice, but it would have been even nicer just to get the report.
The OIA clock had it all due on 19 March.
I received an extension letter on 26 March telling me that they'd restarted the clock because of the clarification of the dates. Nice. After they'd told me they totally weren't going to do that. The Ministry replied that they could do that at their sole discretion, so nuts to me. More fool me for believing what the Ministry told me on the phone. I'm told it's odd for the Ministry to push an extension after having already secured an agreed down-scoping.
Anyway, it's all sitting with the Ombudsman.
And then, on Friday, the report showed up on the Education Counts website. I found out about it via Twitter, when Seymour's office noted that it was out. And when I went through the emails I hadn't looked at on Friday, there was one from one of the partnership schools lauding how well it did in the report. So I guess I would have seen it on Friday if I'd been a bit less busy that afternoon. Maybe it went up earlier than Friday - I couldn't tell.
So - the Ministry stonewalled the OIA request for the report, then never told me when it quietly went up on their website. I've still not heard back from them. I sent them an email this morning letting them know it's up, in case they hadn't noticed.
So what's the explosive stuff that the Ministry's been so reluctant to release? Nothing, really. Survey results from parents and students at some of the partnership schools, with uneven response rates and no comparison against results at traditional state schools. They didn't compare outcomes because there was no comparison group available - though I would have thought you could put together a matched group of students through the IDI (Stats NZ's Integrated Data Infrastructure).
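Roughly what I mean by a matched group - a minimal sketch only, with entirely hypothetical file and column names and nothing drawn from the IDI or the report:

```python
# Minimal sketch of nearest-neighbour matching on observable characteristics.
# Everything here is hypothetical: the file, the column names, the covariates.
import pandas as pd
from sklearn.neighbors import NearestNeighbors

students = pd.read_csv("students.csv")  # hypothetical de-identified student records

covariates = ["decile", "prior_score", "transience"]
treated = students[students["is_partnership"] == 1]   # partnership-school students
pool = students[students["is_partnership"] == 0]      # state-school students

# For each partnership-school student, find the most similar state-school student.
nn = NearestNeighbors(n_neighbors=1).fit(pool[covariates])
_, idx = nn.kneighbors(treated[covariates])
matched_controls = pool.iloc[idx.ravel()]

# A crude first pass: compare mean outcomes across the matched groups.
print(treated["outcome_score"].mean() - matched_controls["outcome_score"].mean())
```

Matching on observables is a far weaker design than the randomised entry I get to below, but it would at least have provided some benchmark.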
They use anonymised student record data from the Ministry of Ed to look at student characteristics like:
- ethnicity and iwi affiliation
- proportion of students who had attended lots of schools prior to going to the partnership school (transience)
- proportion of students who had only attended a partnership school, who had shifted to a partnership school from another school, or who had moved back and forth between other schools and partnership schools - but nothing that would let us know whether transitions out of partnership schools are high or low relative to other schools
- number of stand-downs and suspensions, with a comparison for those same students at their prior or subsequent non-partnership schools. Students at partnership schools look to have substantially lower numbers of stand-downs while at the partnership schools - but you'd really want to check against other students who flip around among state schools, comparing against their currently attended school.
The rest is all survey data from parents and kids about why they chose the partnership schools. The parents answering the surveys look pretty satisfied, but we have no comparison group of comparable students at state schools and little sense of whether the parents answering are representative. Parents generally find things at their current partnership school better than at their kids' prior school, but the same could easily be true of any other group of students who recently shifted from one school to another - people change schools when they don't like the last one (or when they move for other reasons).
And that's about it. There's nothing really in there that would let you know whether the partnership schools are doing well or doing poorly. And since all the parents would have been answering the survey questions under the cloud of "You know, if the answers look bad, the government will likely close our school - and you love our school, right?", it's hard to tell what to make of any of it. Whatever your priors were about these schools, there's no reason to change them. Folks like me who think parental choice matters won't have changed views about that, other than frustration that the prior government so completely screwed up setting an evaluation framework around the things that could have provided any kind of evidence on effectiveness. It wouldn't have been that hard: baseline testing of all kids applying to partnership schools, randomised entry into oversubscribed schools, then annual testing of all the kids who applied, regardless of whether they got into the partnership school - something like the sketch below.
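Here's roughly what that design buys you - a minimal sketch with simulated data, where the 2-point "effect" is pure illustration and not anything from the report:

```python
# Minimal sketch of the lottery-based evaluation described above.
# All data are simulated; in practice you'd use real baseline and follow-up
# test scores for every applicant, lottery winners and losers alike.
import numpy as np

rng = np.random.default_rng(0)
n_applicants = 500

baseline = rng.normal(50, 10, n_applicants)      # baseline scores at application
won_lottery = rng.random(n_applicants) < 0.5     # randomised entry to an oversubscribed school

# Follow-up scores: everyone improves a bit; assume (purely for illustration)
# a 2-point effect of attending the partnership school.
followup = baseline + rng.normal(3, 5, n_applicants) + 2 * won_lottery

# Because entry was randomised, a simple difference in mean gains among
# applicants is an unbiased estimate of the school's effect.
gain = followup - baseline
effect = gain[won_lottery].mean() - gain[~won_lottery].mean()
print(f"Estimated effect of partnership-school attendance: {effect:.2f} points")
```

The point isn't the particular numbers; it's that randomised entry gives you a clean comparison group for free, which no amount of after-the-fact surveying can reconstruct.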
It all leaves me a bit baffled. Why stonewall a report that really doesn't have much in it? Why not just release the thing saying "Well, we don't really find any compelling evidence in this report either way, so we see no need to change our prior policy"?
I wonder if part of the problem was just that the report looks like it was delivered way late. All the prior stuff had suggested the report was due end-2017. The report on the website is dated March 2018. Why couldn't the Ministry have just said "Hey, the report hasn't been delivered yet" when we talked on the phone back in February?
Maybe the stonewalling was less about avoiding embarrassing the Minister and more about having commissioned a report that really can't say much.
Update: Nothing in here should be read as damning any of those schools - as I understand some have read it. The evaluation framework is the problem.
From the response rates, it looks like one of the schools took the evaluation more seriously than the others did and worked harder to get responses back. But even with 100% response rates, without comparisons at state schools, it's hard to say what any of the parent satisfaction numbers would mean. Would a lot of parents saying they're happier with their current middle school than with their prior primary school mean there's something in particular that's better about that middle school, or just that parents are generally happier with middle schools? Impossible to tell without baseline data on how well middle schools (for example) generally perform.