This morning’s Times carries a story with the headline “Go figure: nation’s number crunchers red-faced again”. The story describes two more data problems for the ONS: one on the measurement of people on zero-hours contracts and the other on a delay in publishing the balance of trade statistics. The article goes on to describe the ONS’s recent history and to report political criticism of its performance. A City economist is quoted as saying the ONS “must try harder to get a grip”. And the paper suggests that senior staff losses after the ONS relocated some of its operations from London to Newport were damaging.
I’ve some sympathy with the last point – I left the ONS in 2006 just before the decision was taken to almost completely close the London Office. This decision did result in a big loss of knowledge and experience – as well as day-to-day personal contact with key data users in the Bank of England and the Treasury.
But the ONS’s troubles go back much further. In his 1956 Budget speech, Harold Macmillan talked of the difficulties of managing the economy using “last year’s Bradshaw”. In the 1980s the then Central Statistical Office mis-measured the growth of the economy, leading – according to Nigel Lawson – to policy errors. And I remember well a member of the Treasury Select Committee telling the National Statistician, Len Cook, in 2003: “I should suggest you wake up and smell the coffee because the commentaries suggest there is a catalogue of revisions upwards, downwards and sideways.” This followed a major revision to the estimate of GDP growth on the back of revised construction data.
Official statistics seem to go through a cycle of change: criticism of the output, a review, additional resources, some stability, a relative decline in resources and then criticism again. The missed Lawson Boom led to the Pickford review, which led to a major reorganisation and the so-called Chancellor’s Initiative, then some stability, then a decline in quality and the ONS modernisation project. And now we have another review of economic statistics, which is likely to recognise the difficulty of the job the ONS does – but not necessarily get the right answer for improvement.
Measuring the economy is tough – it’s big and complex and requires lots of data. I’ll give some views on that later. But here are some prescriptions for the ONS.
- Focus on what is important. Much of the ONS’s output is required by EU regulation. But they don’t have to produce everything to the same quality. Identify the key statistics and put the resources on those – develop cheap and simple methods for the rest. And try not to waste resources on political vanity projects like the measurement of happiness, which add nothing to the sum of human knowledge.
- Be creative in the use of data. It’s getting ever harder to get companies and individuals to respond to surveys. Multiple attempts have been made to use other data sources – but the big data is there and can be used. It might just take a change in mindset to recognise that modelled estimates are at least as good as direct measurement in some areas.
- Present better. The ONS website remains a standing joke in the user community: it’s hard to navigate, has a dreadful search function, and data releases are often turgid and impenetrable. Focus on getting good-quality information out there. And don’t sweat the technicalities on errors and so on. Yes, the ONS should understand its data quality – but frankly most users believe the ONS should only publish data which can be trusted.
This may not be enough – but it has to be a start. At root, official statisticians (and probably statisticians more generally) often have a producer’s attitude: they know what they are doing and the user doesn’t understand how hard the job is. The ONS needs to shift its attitude and understand and address the concerns of users rather than relying on technical defences of its current approach. Things will go wrong because measurement is difficult – but thinking about the user will reduce the reputational damage when that time comes.