Monday, October 1, 2012

An example.

I don't believe for a minute that it will be taken as such, but the debacle around the release of the so-called national standards data (NSD) for our schools provides an illustrative example of how science works. Granted, science isn't usually carried out in such a public forum, but I think it's a fair example, with one major exception.

Science                                                          | National Standards
Experiment                                                       | Survey/data collection
Peer review to ensure coherence                                  |
Publish                                                          | Publish
Work is attacked by the scientific community                     | Work attacked by people who know basic statistics
Work becomes valued and retained, or is found lacking and is forgotten | Work is found lacking

Ideally your experimental design gets thrashed about before you start. The NSD equivalent would have been the government listening to the people who study education. Peer review would have been equivalent to getting some statisticians to curate the data and verify what conclusions can and cannot be drawn from it. I've argued before that peer review is a basic hurdle, not definitive of good science. Over the years a fair amount of work has got past the peer review process that really shouldn't have: Wakefield's paper linking the MMR vaccine and autism, for one. More recently, the appalling rat-feeding GMO study out of France. Still, without that basic coherence check we would have chaos.

Which is amply demonstrated by the current reporting of the NSD. The data is currently being taken apart by a number of people who know their statistics. The result: there's precious little information in there, and it does an okay-ish job of reporting something we already knew, namely that there is a correlation between poor performance and school decile.

The difference being that the chances of the NSD being quietly forgotten and not acted upon are small.
