More evidence that league tables are no help to parents

School league tables are of very little use to parents choosing a secondary school for their children, because they have almost no power to predict, on the basis of a single year’s data, how good a school will be in six years’ time, when the child takes GCSE.

This was demonstrated by George Leckie and Harvey Goldstein of the University of Bristol, who concluded that the confidence intervals in one performance measure used (“contextual value added”) were so wide that almost no schools could be reliably distinguished as being better than others.
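
The effect of wide confidence intervals on school rankings can be illustrated with a toy simulation (entirely hypothetical figures, not Leckie and Goldstein’s data or method). It generates value-added estimates for 100 schools, each based on a single cohort of pupils, and counts how many pairs of schools have overlapping 95% confidence intervals and so cannot be told apart by that rough criterion:

```python
import random

random.seed(42)

# Assumed, illustrative numbers: 100 schools, 100 pupils per cohort,
# pupil-level spread of progress scores much larger than the true
# between-school spread (a typical feature of value-added measures).
n_schools = 100
pupils_per_school = 100
pupil_sd = 10.0   # spread of individual pupil progress scores
school_sd = 1.0   # true between-school spread

# Standard error of a school's mean score, and 95% CI half-width.
se = pupil_sd / pupils_per_school ** 0.5
half_width = 1.96 * se

schools = []
for _ in range(n_schools):
    true_effect = random.gauss(0.0, school_sd)
    estimate = random.gauss(true_effect, se)
    schools.append((estimate - half_width, estimate + half_width))

# Treat two schools as indistinguishable if their CIs overlap
# (a crude but common reading of league-table uncertainty bands).
overlapping = sum(
    1
    for i in range(n_schools)
    for j in range(i + 1, n_schools)
    if schools[i][0] <= schools[j][1] and schools[j][0] <= schools[i][1]
)
total_pairs = n_schools * (n_schools - 1) // 2
print(f"{overlapping} of {total_pairs} school pairs cannot be separated")
```

Under these assumptions the great majority of pairwise comparisons are inconclusive, which is the flavour of the Bristol finding: when the uncertainty around each school’s score is large relative to the true differences between schools, a single year’s table cannot reliably rank them.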

But what if you looked at results not in a single year but in a run of years? It might be expected that performance measured over several years would have greater predictive power than that measured over a single year.

Leckie and Goldstein revisit the subject in the latest issue of the Journal of the Royal Statistical Society Series A, pages 833-836. They consider the cohort of children that used the 2009 tables to choose schools for entry in 2010 and will take GCSE in 2016. They use eight years of CVA tables, 2002-09, to predict whether schools can reliably be distinguished from one another, comparing this with using only the 2009 tables.

Their conclusion is that it makes very little difference, with the incorporation of earlier years of data improving precision only marginally. “School’s future contextual value-added performance cannot be predicted reliably,” they say. “School league tables have very little to offer as guides for school choice ... They continue, therefore, to provide parents with school choice information that is inherently misleading.”

In fact, the 2010 tables will be the last to contain CVA measures, the Coalition Government having announced that it is abandoning them. The 2011 tables will, however, include other value-added measures when they are published in December 2011 and January 2012. In the White Paper The Importance of Teaching, the Department for Education said: “We will put an end to the current ‘contextual value added’ (CVA) measure. This measure attempts to quantify how well a school does with its pupil population compared to pupils with similar characteristics nationally. However, the measure is difficult for the public to understand, and recent research shows it to be a less strong predictor of success than raw attainment measures.”

So it looks as if at least one government department takes account of research findings, at least when they fit with ministerial preconceptions. The CVA tables were little help to parents, although they may be useful, Leckie and Goldstein say, in monitoring and identifying schools that need further investigation.