Thursday 18 July 2019

Stats and IDI funding

Stats don't come for free. Who should pay? 

Newsroom covers budget problems over at Statistics NZ:
An appendix document to Shaw’s February briefing raises the prospect that, without extra money, some Government priorities – surveys on social statistics, child poverty, and the household labour force survey – might have to stop.

Stats NZ, which employs more than 900 people, seems keen to cut surveys rather than staff. It says budget constraints “would limit our ability to ensure remuneration kept pace with the market, and Stats NZ would lose capability”.

(For the year ended June 30, 2018, Stats NZ’s personnel costs were $115 million, up from $83 million the year before. The census accounted for 81 percent of the increase. This year’s estimate, contained in the last annual report, is for an $89 million personnel bill.)
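
Putting those numbers together, a quick back-of-the-envelope check (sketched in Python, using only the figures above):

    # Back-of-the-envelope check on the annual report figures ($NZ million).
    personnel_2018 = 115   # year ended 30 June 2018
    personnel_2017 = 83    # the year before
    increase = personnel_2018 - personnel_2017   # 32
    census_bump = 0.81 * increase                # roughly 26 was census-driven
    estimate_2019 = 89                           # forecast as census costs unwind
    print(increase, round(census_bump), estimate_2019)  # 32 26 89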

One way to raise more money, Stats NZ suggests, is to start charging a “value-add service” fee for use of its so-called integrated data infrastructure, or IDI. That’s data used by government agencies, researchers and academics to better understand societal issues. A quick search reveals research topics using IDI such as “causes and consequences of criminal activities”, “child obesity prevalence”, and “predicting suicide and self-harm risk”.

Substantial increases in the use of IDI would require “more aggressive charging”, Stats NZ tells Shaw. In fact, the appendix says forecasts of increased Stats NZ revenues rely on charging more for IDI.

After its clash with MBIE over the accommodation survey, you’ve got to wonder how much success Stats NZ will have in demanding other agencies and universities pay for the data.

In a pure model, you'd run all of these services on cost recovery: Ministries using a data series would pay the costs of running it. The value of the data is in the using; figuring out which data should be prioritised should be worked out by looking backward from usefulness. If the value of a data series to a Ministry in running its operations is less than the cost to Stats of running that survey, then the Ministry should be finding other ways of doing things. Now, all that requires some tight prior assumptions: that the funding to each of the Ministries is appropriate, and that Ministries are getting their own prioritisations right. But it isn't a crazy first cut.
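
As a minimal sketch of that pure cost-recovery rule, in Python (the dollar figures are invented for illustration, not anything from the briefing):

    # Toy version of the pure cost-recovery rule; all figures invented.
    def ministry_keeps_survey(value_to_ministry, cost_to_stats):
        """Under full cost recovery, a Ministry keeps funding a survey only if
        the value it gets from the series covers Stats NZ's cost of running it;
        otherwise it should be finding other ways of doing things."""
        return value_to_ministry >= cost_to_stats

    # A survey worth $1.2m a year against a $0.9m running cost survives;
    # one worth $0.5m against the same cost does not.
    print(ministry_keeps_survey(1_200_000, 900_000))  # True
    print(ministry_keeps_survey(500_000, 900_000))    # False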

Now let's tweak things.

Suppose we expect that Ministries have weak-as-heck incentives to run post-implementation reviews (PIRs) and are looking for excuses not to run the things. First best is Cabinet demanding the things, with budget bids for programme continuation depending on them. If something blocks that, then subsidising the inputs into PIRs and cost-benefit analyses (CBAs) might not be nuts. It might not be particularly cost-effective if the incentives to undertake them are sufficiently weak and if the cost of the stats is small relative to the staff time spent analysing them, but you at least knock out one line of excuse. It's harder to plead reasons for not running the PIR if the data for doing it is sitting right there.

When we start looking at external uses of IDI, things get more complicated. And, full disclosure, our organisation does work in the IDI.

Every project in the IDI has to pass a public interest test:
2. Safe projects – to gain access to integrated data, researchers must have a project they can demonstrate is in the public interest.

Research projects must focus on finding insights and solutions to issues that are likely to have a wide public benefit. The IDI and LBD cannot be used for individual case management, such as making decisions about a specific person or family.

These are not for-profit consultancy reports helping to drive business decisions. If they were, it would make complete sense for Stats to charge at least full cost recovery. They're instead public-interest projects judged likely to have wide public benefit.

If those projects are undertaken by the public sector, then you'd hope that the funding line for the project would incorporate the data costs, with the whole thing weighed on whether overall benefits exceed overall costs.

Where those projects are undertaken by NGOs on limited budgets, putting their own resources into the provision of those public goods, you might generally expect that public goods are underprovided relative to a first best. So things then depend on the actual measure of public benefit relative to the cost of providing the IDI infrastructure and staffing - and that's anybody's guess. I'd like to think that the kind of data lab work we're doing on getting better measures of school performance could lead to policy changes that improve school outcomes overall, but I could be wrong.

I don't know what we're currently being charged for IDI access, but I do know that there are prices we couldn't pay. The main project cost is researcher time; charges will be small relative to that.

Maybe Stats could spread the fixed costs of the IDI across more users by making it easier for foreign academics to access, so per-project charges could be lower than otherwise.
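
The fixed-cost arithmetic is simple enough (the $10 million figure below is invented, not Stats NZ's actual infrastructure cost):

    # Invented figure: spreading a fixed cost over more projects lowers
    # the per-project charge needed to recover it.
    fixed_cost = 10_000_000  # hypothetical annual IDI infrastructure cost

    for n_projects in (50, 100, 200):
        print(n_projects, fixed_cost / n_projects)
    # 50 projects -> $200,000 each; 100 -> $100,000; 200 -> $50,000.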

I generally see government investment in this kind of data as potentially highly leveraged. For an upfront investment in good and accessible data, you let a lot of not-NZ-government-funded people do a lot of work that can help the government in setting better policy. That's especially true where you let foreign researchers help out. In that kind of context, the revenue from increased access fees could easily be outweighed by the reduction in the value of the research produced. And I'm reminded of the story of the CURF (confidentialised unit record file) that took an analyst months to make but was hardly ever used by anybody because Stats just makes it too hard to access the things.
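
A stylised version of that worry, with every number invented:

    # Invented numbers: fee revenue gained versus research value lost when a
    # per-project charge prices marginal public-interest projects out.
    fee = 20_000                  # hypothetical access charge per project
    projects_without_fee = 100
    projects_with_fee = 70        # 30 marginal projects priced out
    value_per_project = 50_000    # hypothetical average public benefit

    revenue = fee * projects_with_fee                # $1.4m to Stats
    value_lost = (projects_without_fee - projects_with_fee) * value_per_project
    print(revenue, value_lost)    # $1.4m raised; $1.5m of research value lost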
