Five rules for fact-checking shaky statistics in the news

You can’t have a functioning democratic state without a literate population. That’s been recognised since at least the mid-Victorian era. The 1867 Reform Act expanded the franchise to many working-class men, not all of whom were literate, and the elite were concerned that it would make it harder to run the country: “I believe it will be absolutely necessary to compel our future masters to learn their letters,” said one politician. These worries partly drove the 1870 and 1880 Education Acts, which made elementary education compulsory for all. Voters needed, in short, to be able to read the newspapers.
These days, though, you can’t have a functioning democratic state without a numerate population. We need to understand not just the written word, but numbers.

That’s been especially true over the last 12 months – suddenly we have all had to know why infection fatality rates are different from case fatality rates, or what an exponential curve is, or why the R value matters. But it’s true all the time. If politicians tell us that crime rates or poverty rates have gone up or down, or that the NHS is getting a budget increase, how can we trust them if we don’t know how to read numbers?
And there’s an additional problem. Journalists themselves are not always especially good with statistics. So a lot of numbers, by the time they reach the reader, have already been mangled quite badly.
In our new book, How to Read Numbers: A Guide to Stats in the News (and Knowing When to Trust Them), my economist cousin David and I want to help readers understand the numbers in the media a bit better: to talk about common ways in which they go wrong, and how to spot them. Here are five of the most important.
Is that a big number?
You’ll often see numbers given in the news without further context, and often they’ll sound big and impressive. Take one very famous example: £350 million, as written on the side of a bus and apparently given to the EU each week. Don’t worry, I’m not going to relitigate the argument over whether that was the right number. What I want to do instead is ask: is it a big number?

Compared to the average Briton’s salary, of course it is. But that’s not what we should be comparing it to. A better yardstick is the government’s total annual budget, which was expected to be £928 billion in 2020-2021. At £350 million a week, the payments would add up to about £18 billion a year – roughly two per cent of £928 billion.
Is that a large amount? Well, it’s certainly not negligible. (If you’re still annoyed about the £350 million figure, after the rebate it was about £250 million a week, or 1.4 per cent of the budget.) But “we give about two per cent of our annual budget to the EU” might not have sounded so dramatic. If you see a number in the news – “X people died of disease Y this year!” – ask yourself: is that a big number? How can I work that out?
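If you want to check that arithmetic yourself, here’s a minimal sketch in Python, using only the figures quoted above (and 52 weeks to the year):

```python
# Sanity-checking the "is that a big number?" comparison, using the
# figures quoted in the text: £350m/week claimed (£250m/week after the
# rebate) against an expected £928bn annual budget for 2020-2021.
ANNUAL_BUDGET = 928e9  # £928 billion

for label, weekly in [("claimed", 350e6), ("after rebate", 250e6)]:
    annual = weekly * 52                   # scale a weekly figure to a year
    share = annual / ANNUAL_BUDGET * 100   # as a share of total spending
    print(f"{label}: £{annual / 1e9:.1f}bn a year = {share:.1f}% of the budget")

# Output:
# claimed: £18.2bn a year = 2.0% of the budget
# after rebate: £13.0bn a year = 1.4% of the budget
```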
What’s causing it?
Does a refreshing glass of Fanta Orange make you want to glass someone in the face? According to newspaper reports, it might: “Fizzy drinks make teenagers violent,” said headlines in 2011.
Note the causal language: the drinks make teenagers more violent. But the actual study the stories were based on didn’t say that: it said that teenagers who drank fizzy drinks were more likely to be violent. There is a correlation, but that doesn’t mean that it’s a causal relationship.

For example: if lots of people eat ice cream on a given day, then it’s more likely that someone will drown. But that doesn’t mean that ice cream causes drownings; instead, on hot days, more people eat ice cream, and more people go swimming, and a number of them drown.
If two numbers go up and down together – for instance, the number of fizzy drinks consumed and the number of people stabbed – it might be that A causes B, or it might be that B causes A, or some third factor C causes both. It’s very hard to tease out which is which, unless you’re doing a randomised controlled trial, which this study (like most other research into people’s lifestyles) wasn’t. So if you see causal language – “fizzy drinks cause violence”, “vaping makes children take drugs” – be aware that it’s often not justified.
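To see how easily a hidden third factor can manufacture a correlation, here’s a toy simulation of the ice-cream example. Every number in it is invented for illustration: temperature drives both variables, and neither affects the other.

```python
# Toy simulation: a hidden confounder (temperature) drives both ice-cream
# sales and drownings, producing a strong correlation with no causal link.
# All coefficients are invented purely for illustration.
import random

random.seed(42)
temps = [random.gauss(15, 8) for _ in range(1000)]  # daily temperatures, °C
ice_creams = [max(0, 10 * t + random.gauss(0, 30)) for t in temps]
drownings = [max(0, 0.2 * t + random.gauss(0, 1)) for t in temps]

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Strongly positive (around 0.8), even though neither variable appears
# anywhere in the other's formula – the association comes entirely from
# the shared dependence on temperature.
print(correlation(ice_creams, drownings))
```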
50 per cent more than what?
Here’s a scary number for would-be parents. Children born to fathers over 45 are 18 per cent more likely to suffer seizures than children born to fathers under 35, according to headlines a couple of years ago.
That sounds frightening. But what does it mean?
Unless you know how many children have seizures anyway, you can’t know. If you’re only given “relative risk” like this – how much your risk has gone up, compared to what it was before – you have no sense of how much it matters, unless you’re also told what the original risk was in absolute terms. In this case, the study being discussed found that about 0.024 per cent of children born to younger fathers have seizures, compared with about 0.028 per cent of children born to older ones. In real terms, that means 28 babies in every 100,000, instead of 24: the absolute risk goes up by about four in 100,000.
You’ll see this a lot: eating bacon “raises cancer risk by 20 per cent”, for instance. But unless you’re also told the absolute risk – 20 per cent more than what? – it’s almost entirely unhelpful.
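Converting between the two is simple arithmetic. Here’s a quick sketch using the rounded seizure rates above (which give roughly 17 per cent rather than the headlines’ 18, because the underlying rates have been rounded):

```python
# Turning a relative-risk headline into absolute terms, using the
# (rounded) seizure rates quoted above.
baseline = 24 / 100_000   # risk for children of younger fathers
exposed = 28 / 100_000    # risk for children of fathers over 45

relative = (exposed - baseline) / baseline * 100   # what headlines report
absolute = (exposed - baseline) * 100_000          # what actually matters

print(f"Relative increase: {relative:.0f}%")              # ~17%
print(f"Absolute increase: {absolute:.0f} per 100,000")   # 4 per 100,000
```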
What are we actually measuring?
Over the last half-century, diagnoses of autism have gone up about a hundredfold. In the 1960s and 1970s, the incidence of autism was estimated at one in 5,000; now it’s estimated at one in 54.
What’s happened? Is it bad parenting? Is it pesticides in the water? Is it Bill Gates putting microchips in our vaccines?
No: it’s just that what we call “autism” has changed. It wasn’t recognised as a separate disorder until 1980, and the diagnostic criteria have been revised several times since – in 1987, 1994 and 2000, then again in 2013. Each revision widened them, to include later-diagnosed children, children with less extreme symptoms, and children with previously separate conditions. It may well be that there has been no change in the distribution of the set of traits we now call “autism”.
When you see headlines like “Hate crimes double in five years”, it’s worth asking whether something similar is going on. As it turns out, mercifully, it probably is: the public have got better at reporting hate crimes, and the police have got better at recording them. Crime surveys suggest that most hate crimes have actually become less, not more, common in recent years. It’s always worth asking whether what we’re measuring has changed.
Is the research any good?
A lot of the time, it’s not fair to blame journalists for the bad numbers in the news. They get them from surveys and research papers, and not all research is born equal.
For instance, a study of hydroxychloroquine as a treatment for Covid-19 got some attention last year; it found that the drug had an effect. But another trial found no such effect. How should readers – and journalists – know which to trust?
It’s a hard question. In this case, there’s a simple answer: the first was a simple observational study of 42 patients; the second was a full randomised controlled trial of 11,000 patients. But often it’s hard to know. A couple of rules of thumb, though: all else being equal, small trials are usually less reliable than big ones; if a study returns surprising findings that are out of line with the rest of the literature, be wary; and studies which are preregistered – making it harder for scientists to cherry-pick their findings – are usually more trustworthy than those that aren’t.
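As a rough illustration of the small-trials rule of thumb, here’s a simulation sketch (all numbers invented): give a completely useless treatment to trials of different sizes, and see how often chance alone produces an impressive-looking effect.

```python
# Why small trials are noisy: simulate a treatment with NO real effect
# and count how often a trial reports a big difference by chance alone.
# All numbers are invented for illustration.
import random

random.seed(0)

def observed_effect(n, true_rate=0.3):
    """Recovery-rate difference between two arms of size n, where both
    arms share the same true recovery rate (i.e. the drug does nothing)."""
    treated = sum(random.random() < true_rate for _ in range(n))
    control = sum(random.random() < true_rate for _ in range(n))
    return treated / n - control / n

for n in (21, 5_500):   # roughly a 42-patient vs an 11,000-patient trial
    sims = [abs(observed_effect(n)) for _ in range(1_000)]
    fooled = sum(d > 0.10 for d in sims) / len(sims)
    print(f"arm size {n}: {fooled:.0%} of trials show a >10-point difference")

# The tiny trial regularly throws up large spurious effects;
# the big one essentially never does.
```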