Ed Humpherson, Director General for Regulation, on the Office for Statistics Regulation’s role in safeguarding official statistics.
Statistics frame public debate. They inform public understanding of what’s going on in the world. They give people a basis for making decisions – whether as policy makers, citizens, professionals or businesses. And they create a common ground for debate – about what’s working, what isn’t working, and what needs to change.
So it’s essential that people can have confidence in the statistics produced by Government. It’s the role of the Office for Statistics Regulation to uphold this public confidence across all areas of policy and all Government departments.
We can do this in high profile ways. For example, in October we wrote publicly to the Department for Education expressing concern about the way the Department was using statistics. We were concerned about a mixture of mistakes (for example, on the improvement in PIRLS scores for schools in England); poor presentation (for example, in a tweet on increases in school funding); and excessive weight placed on facts that did not give the complete picture (for example, on increases in the number of children in good or outstanding schools since 2010).
Taken together, we were concerned that this indicated that the Department did not always live up to the standards we expect in using and communicating statistics. There was a risk that a misleading impression was given of the performance of the school system in England, one which clashed with lived experience of professionals and parents. And this is a pity, not least because there are some genuine areas of success in education in England over the last decade.
But it’s important to place this high-profile intervention in context, in three ways.
First, this is only the high-profile part of what we do. Most of our work is less eye-catching but absolutely fundamental: reviewing the work of the Department, and all other Government Departments, in producing regular official statistics. We review whether the department producing the statistic has a set of trustworthy processes; whether the statistics are of sufficient quality; and whether they are valuable to users. Recently this has included reviewing the Department for Education’s phonics screening check and key stage 1 assessments statistics, and the Welsh Government’s examination results statistics – in both cases we found lots of evidence of great work by the teams producing the statistics, but also some areas for improvement. These three pillars – trustworthiness, quality and value – form the bedrock of our day-to-day work.
Second, DfE has responded well to our criticism. It has put in place new processes and re-emphasised the importance of trustworthiness, quality and value. Of course, the acid test is whether the outside world sees a meaningful change in the way the Department uses statistics. For now, we are still hearing concerns from those in the sector, and so are continuing to monitor the situation closely.
Third, we are happy to comment on how others use statistics. It’s important that all participants in public debate use statistics fairly. For example, in January we highlighted the risk that the School Cuts website’s use of statistics might give a misleading impression of changes to school budgets.
All of this revolves around our core aim: ensuring that statistics serve their purpose as an essential public asset.