This is a blog post based on the Annual Lecture for the Centre for Science and Policy that Ed Humpherson delivered in March 2019. A link to the full lecture is here.
Why do people think that we’re facing threats to the role played by data and evidence?
Let’s start with Thomas Gresham. He was a banker in Tudor times. He is known for his concerns about the “debasing” of a currency. Debasing a currency is when the composition of a coin is deliberately changed – in other words, precious metal is replaced by base metal. Gresham’s Law states that “bad money drives out the good”: confidence collapses, people don’t know which money to trust, and they stop using money altogether. (In fact, I understand that Gresham never actually used the phrase “the bad money drives out the good”. It was a reinterpretation of his writing by a later economist.)
Gresham’s law is often used as an analogy – to a situation where people lose confidence in the value of something. For data, the analogy is: the bad data – the fake news, the misused statistics – drive out the good. People no longer know what to have confidence in. They mistrust everything. They see the world of quantification and evidence as inherently untrustworthy; they think it’s all made up.
Is there any evidence of this happening? Well, it’s certainly possible to find evidence of people worried about the debasing of the currency of data and evidence. There are famous, iconic examples – insert your own favourite misuse here.
There’s also a broader sense of a threat to public reason. In a lecture at the British Academy on Ethical Communication in a Digital Age last October, Onora O’Neill covered this. By public reason, she meant the notion that public life should be based on and protect certain standards and norms. She outlined how digital technologies can undermine these standards. Communication technologies are not used to inform or communicate, but to grab attention, and use it to misinform, nudge and manipulate. She added that we don’t spend enough time thinking about the ethics of public reason – what good public reasoning looks like.
To see this, think about any number of articles or reports on misinformation. They tend to focus on the legal and regulatory responses to the risks to democracy of misinformation. Yet they often say little about what it actually means to misinform, and next to nothing on what good looks like – what does it mean to inform?
Our work highlights plenty of examples of misuse of statistics. These examples include issues like the presentation of school funding data, or unreliable claims of success for initiatives to reduce rough sleeping. We also see problems of verifiability: what we call “naked numbers”. Numbers with no source. No context. No way of finding out what they are based on.
What all these examples do is reduce statistics to just A Number. A misleading, naked or decontextualised number. And they do represent a debasing of the coin – but not the kind done in a backstreet workshop by an anonymous faker. This debasing happens in plain sight, by an official body – as if, in the coinage analogy, the Royal Mint were doing it itself.
However, it would be wrong to give the impression that these problems – the naked, misleading numbers – are the norm. As the UK Statistics Authority’s Chair Sir David Norgrove often points out, the norm is responsible production and use of statistics. And more generally, our Code of Practice is all about how the pillars of trustworthiness, quality and value support public confidence in statistics.
Examples of government bodies living up to these concepts abound. To give just three examples:
Life expectancy statistics show a steady increase in life expectancy at birth – an unbroken succession of improvements stretching back to the 1970s. Yet over the last couple of years, the rate of increase has started to stall. That’s what the statistics-as-numbers say. And if the statisticians were stuck in the Just The Numbers mindset, there it would end. However, ONS and Public Health England are looking behind the data. They have published excellent analysis that unpicks the factors that may be causing the slowdown.
On the productivity puzzle, it’s sufficient to highlight the great work done by statisticians at the ONS to analyse and unpick it.
Cabinet Office’s Race Disparity Unit spent a year developing a website that collates data from a huge number of different sources across Government. The website is exemplary in terms of support for public reason. It is incredibly clear on the process which underlies its creation. It is based on extensive user engagement: what do people want to know, what questions are they interested in?
In these cases, analysts go beyond just publishing The Numbers. In all three cases, they recognise that the world is complex. They recognise that the statistics are estimates, and that they need careful explanation.
So, to summarise this: Statistics are not just technocratic measurements. They should never be just The Numbers.
Statistics are also social tools. Just as a coin is not just a piece of metal but a symbol of socially constructed value – which is why we worry about it being debased – so it is with statistics. They are numerical estimates, but they can also be part of a social interaction, a dialogue between producer (who understands the data) and user (who wants explanations and insight). They should be an invitation to conversation. That’s what the Code’s pillars of trustworthiness, quality and value are all about.
So, I don’t think it’s inevitable that the bad data will drive out the good. As long as producers have the right approach, based on trustworthiness, quality and value, the good statistics can thrive.