Research Assessment: abandoning the denominator

 Leaving out the denominator is a cardinal error in statistics. It is unhelpful to be told that 65 people in Sheffield have swine flu, for example, if one doesn’t know the population of Sheffield. Does this bald figure mean one in a hundred people has the infection, one in a thousand, or one in ten thousand?

So it is astonishing that the results of the 2008 Research Assessment Exercise, which provides the most comprehensive review of the quality of research in British higher education, were published without denominators.
 Universities were allowed to submit details of as many or as few of their academics as they wished, without disclosing the total numbers eligible for submission.
 
This enabled universities with a few stars and a long tail of less productive researchers to submit only the stars, winning a high profile for research excellence and enhancing their reputations.
 
Some universities got very hot under the collar when it seemed possible they would have to disclose the total numbers eligible. There were rumours that 11 of them threatened to seek a legal injunction to prevent that happening. They certainly sought legal advice, though a spokesman for the Higher Education Funding Council for England (HEFCE) denied that a legal challenge had actually been launched.
 
Those universities opposed to the release of the data blame HEFCE for a muddle over the definition of staff eligible for inclusion. This means, they say, that different institutions might have interpreted the guidance in different ways, submitting data that would not be comparable across higher education as a whole.
 
The Higher Education Statistics Agency (HESA) had intended to publish the total number of eligible researchers at the same time as the RAE results appeared. Had they done so, the figures would have provided the denominator and made the results meaningful. Without these figures, anybody attempting to compare the overall quality of universities has to guess at the total pool from which the submitted researchers were selected.
 
And as a condition of publishing RAE data, newspapers were legally obliged to include “in a reasonably prominent position” a statement saying (among other things) “HESA holds no data specifying which or how many staff have been regarded by each institution as eligible for inclusion in RAE 2008 ... further, the data that HESA does hold is not an adequate alternative basis on which to estimate eligible staff numbers”. It goes on, but I can’t bear to read any more of it.
 
Whether cock-up or conspiracy, this outcome should embarrass HEFCE, HESA, and the universities themselves. What did it mean in real terms? Let’s look at economics, a discipline chosen at random. Selecting it does not imply that either the discipline of economics or any of the institutions mentioned misbehaved in any way. But economists, above most others, ought to understand the importance of having all the figures one needs to make a judgement.
 
The London School of Economics, a top player in the economics game, submitted 41.6 economists and econometricians. (The fractions result from some members of staff not being full time.) LSE has 63 economists in its economics faculty, 156 in the Centre for Economic Performance, and 115 in the Centres for Economics and Related Disciplines – a total of 334. This total overstates the pool because it includes PhD students, who are not eligible for inclusion, but it gives a rough idea.
 
Oxford University submitted 78.5 economists, out of the 117 listed on the university’s Economics Faculty website, plus those in other departments and in individual colleges, also listed there. There may be one or two more economists in the Saïd Business School. These totals also include PhD students.
 
Both got good profiles, but LSE’s was better: 60 per cent of its entry were in the top, four-star category (world-leading research), 35 per cent got three stars (internationally excellent), and 5 per cent two stars (quality recognised internationally). Oxford got 40 per cent in the top category, 55 per cent in the second, and 5 per cent in the third.
 
A profile identical to Oxford’s was achieved by Warwick, which submitted 49.63 economists. Its Economics Department lists 70 staff.
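For readers who want a single number, one common shorthand – used by the league tables built on these results, not by the RAE itself – is a grade point average that weights each star band by the share of the entry falling into it. The sketch below applies that convention to the profiles quoted above; it is an illustration, not the official methodology.

```python
# A common shorthand for comparing RAE quality profiles: a grade point
# average, weighting each star rating by the share of the submission that
# earned it. The RAE itself published profiles, not a single score.
profiles = {
    "LSE":            {4: 0.60, 3: 0.35, 2: 0.05},
    "Oxford/Warwick": {4: 0.40, 3: 0.55, 2: 0.05},
}

for name, profile in profiles.items():
    gpa = sum(stars * share for stars, share in profile.items())
    print(f"{name}: weighted average {gpa:.2f}")  # 3.55 vs 3.35
```

On this convention LSE’s profile scores 3.55 against 3.35 for Oxford and Warwick – but, as we are about to see, that tells us nothing about how selectively each entry was assembled.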
 
So it appears, on the basis of the admittedly crude staff-count comparison above, that both Oxford and Warwick submitted a higher proportion of their eligible academics than did the LSE. None of them did anything wrong, but the published figures make it impossible to assess which of these three institutions is best.
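To make that crude calculation explicit, here is a back-of-envelope sketch using only the figures quoted above. The ‘eligible’ counts are the rough website tallies, inflated by PhD students (and, for Oxford, missing the economists outside the faculty list), so the percentages are indicative at best.

```python
# Back-of-envelope submission rates from the figures quoted in this article.
# The "eligible" tallies come from departmental websites and include PhD
# students, so they are rough; treat the percentages as indicative only.
departments = {
    "LSE":     {"submitted": 41.6,  "eligible_approx": 334},
    "Oxford":  {"submitted": 78.5,  "eligible_approx": 117},  # faculty list only
    "Warwick": {"submitted": 49.63, "eligible_approx": 70},
}

for name, d in departments.items():
    rate = d["submitted"] / d["eligible_approx"]
    print(f"{name}: {d['submitted']} of ~{d['eligible_approx']} "
          f"eligible submitted (~{rate:.0%})")
```

On these rough numbers the LSE entered roughly one academic in eight of its apparent pool, against two-thirds or more at Oxford and Warwick – exactly the comparison the missing denominators prevent anyone from making properly.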
 
Make your own mind up, but don’t expect any help from the RAE.