More grief for ONS over its construction stats
A bad week for the Office for National Statistics just got worse, with a scathing review of an error it made in calculating the output of the construction industry back in August.
This mistake was picked up by a journalist who spotted an inconsistency between two tables at a briefing on the day the Statistical Bulletin on Output in Construction was released. To have to rely on a journalist to spot its errors rather sums up the hapless state of ONS at the moment. The UK Statistics Authority is not impressed.
To make matters worse, the mistake was made in statistics about the building industry, whose leaders have been critical of ONS output for some time. The bulletin said the industry had grown by 2.3 per cent in Quarter 2 of 2011, when the true figure was 0.5 per cent, unchanged from the figure in the initial estimate released on July 26.
Had it really been 2.3 per cent, it would have added 0.1 percentage points to GDP growth that quarter, manna from heaven for the Chancellor. As it was, it added nothing.
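That 0.1-point figure is easy to sanity-check. A rough back-of-envelope version, assuming construction's weight in UK GDP was around 6 per cent at the time (my assumption, not a figure from the review):

```python
# Back-of-envelope check of the GDP effect of the construction error.
# The ~6% GDP share for construction is an assumption for illustration,
# not a figure taken from the UKSA review.
reported_growth = 2.3      # per cent, as first published
true_growth = 0.5          # per cent, corrected
construction_share = 0.06  # approximate weight of construction in GDP

gdp_effect = (reported_growth - true_growth) * construction_share
print(f"{gdp_effect:.2f} percentage points")  # roughly 0.1
```

A 1.8-point overstatement of a sector worth roughly a sixteenth of the economy is, give or take, a tenth of a point on headline GDP.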
How did it happen? The UKSA’s review finds a near-comical series of errors, starting with somebody copying the wrong formula into a spreadsheet, which resulted in the months March, April and May being picked up rather than April, May and June. The new line in the spreadsheet should have been entered manually, not copied.
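The mechanism, as the review describes it, is a classic off-by-one: a formula copied down one row picks up a three-month window shifted back by a month. A minimal sketch of the failure mode (illustrative only, with made-up numbers, not ONS's actual spreadsheet):

```python
# Illustrative sketch of how a copied spreadsheet formula can silently
# shift a three-month window back by one month. All figures are invented.
monthly_output = {
    "Jan": 100.0, "Feb": 101.0, "Mar": 102.0,
    "Apr": 103.0, "May": 104.0, "Jun": 105.0,
}
months = list(monthly_output)

def quarter_sum(end_month: str, offset: int = 0) -> float:
    """Sum the three months ending at end_month, shifted by offset rows."""
    end = months.index(end_month) + offset
    window = months[end - 2 : end + 1]
    return sum(monthly_output[m] for m in window)

correct = quarter_sum("Jun")      # Apr + May + Jun
copied = quarter_sum("Jun", -1)   # Mar + Apr + May: the off-by-one error
print(correct, copied)            # 312.0 309.0
```

The two totals differ, and nothing in the spreadsheet itself flags which one is right; that is exactly the gap the routine checks were supposed to close.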
The inconsistency should have been spotted by routine checks, but wasn’t. The big change from the initial estimate should have rung alarm bells, but didn’t. The best practice guide for quality assurance circulated internally by ONS in June would have picked up the error, if it had been followed. But it wasn’t.
The review finds failures of procedure and culture, together with the lack of an “expert and experienced eye” being cast over the draft publication.
It suggests that senior managers are not sufficiently sensitive to the external environment – what commentators are saying, what relevance the statistics have to markets and to government policy, and what the implications of errors are. Savvy managers, in other words, would make sure that high-profile statistics didn’t contain errors gross enough to be spotted within a few minutes by a journalist.
Outdated spreadsheet technology was a contributory factor, but past efforts to replace spreadsheets haven’t been too successful. It will take time, UKSA admits. There were also too few staff available. Many of these criticisms echo those in yesterday’s post about the delay in production of the Blue Book.
Once the error was pointed out, it took ONS four hours to withdraw the statistics. That should have been done at once, the review says, because they were market-sensitive. ONS should also have left the original flawed bulletin on its website alongside the corrected version, to enable a proper audit trail. Replacing it in toto can cause confusion and suspicion.
All in all, there wasn’t much that went right with this particular set of statistics.