Three-quarters of police forces in England and Wales now have access to mobile fingerprint scanners issued by the Home Office, new data reveals. In total, 28 of 43 police forces have started using the Strategic Mobile solution technology since it was first trialled, four are conducting their own pilot tests, and seven others are in the process of rolling out the devices.
Between September 2018 and May 2020, police forces conducted more than 126,800 scans, or approximately 6,000 per month. The use of fingerprint scans has increased dramatically during the pandemic, figures obtained through Freedom of Information requests show, and the devices disproportionately target ethnic minorities.
The Strategic Mobile devices, first trialled in February 2018, are small electronic scanners that clip onto smartphones and let police officers capture a person’s fingerprint at a higher resolution than the sensors built into phones. They were introduced to help officers check the identity of an unknown person and can return results in under 60 seconds. Once a person’s fingerprint has been scanned it is checked against two government databases: IDENT1, which holds the fingerprints of people previously taken into police custody; and IABS, which holds the fingerprints of non-UK citizens who have entered the country.
Of the 32 police forces in England and Wales that have access to these devices, 19 provided data on the number of scans completed. As of July 2020, eight forces have performed at least 100 scans per 100,000 people living in their respective areas.
The police force with the highest number of scans is London’s Metropolitan Police Service, which employs its own mobile fingerprinting technology, INK Biometrics. Between November 2018 and July 2020, the Met conducted 51,048 scans, an average of 2,431 per month. The figures show the Met conducts more scans in a single month than most police forces have conducted in over two years.
While fewer people have been on Britain’s streets during lockdown, Home Office data shows a dramatic increase in fingerprint scans. Between March and May 2020, scans across all police forces that have access to mobile fingerprinting technology increased by 44 per cent year on year. In London, scans rose by 88 per cent between March and May.
The Met says its increased use of INK devices in recent months was due to a greater police presence on London’s streets, as lockdown resulted in a reduction in emergency calls and operational demand. “This meant that the Met was even more proactive, resulting in an increase in the use of police tactics, including stop and search,” a spokesperson says. The news of the increase in police use of fingerprint scanners follows a recent report from the Independent Office for Police Conduct that showed the Met disproportionately uses stop-and-search.
Fingerprint scans by the Met spiked in May at 3,566, or approximately 115 a day. It’s the highest number the Met has performed since November 2018. However, the Met blames tech problems for the sudden rise. “There was an issue for a short period of time in May with the Home Office gateway when using the INK device,” the spokesperson says. “This meant that officers didn’t receive a response to a scan, often leading them to submit the search again.”
In some constabularies, black Britons are between three and 23 times more likely to be stopped and scanned than their white counterparts. Seven police forces – Surrey, North Hampshire, Derbyshire, City of London, Leicestershire, Devon and Cornwall, and West Yorkshire – provided data on scans broken down by ethnicity. Other police forces said they could not provide the data or did not record it.
In all seven areas the data shows that among communities of colour, and especially black people, the volume of scans per capita was significantly higher than that of white communities within the same police force area.
Devon and Cornwall Police has the worst record, with black people 23 times more likely to be scanned than white people. The data is limited, however: a racial breakdown was provided for only 208 scans. Matthew Longman, the force’s chief superintendent, acknowledged the racial disparities seen in stop-and-scan practices in the force’s area and said it is actively working to rectify this, including by piloting a new training package that addresses unconscious bias.
“What the world is telling us right now, and these community voices are telling us right now, is we have to explore the unconscious bias. I don’t think – there may be, in an organisation of 5,500 people, some bad apples – but on the whole I don’t believe my officers are going out and making decisions purely based on race,” Longman says. “But we might live within systems or processes that create that unconscious bias. And we need to make that a conscious bias so that we can address this and adjust our behaviours accordingly.”
In the Surrey Police force area, black people were 18 times more likely to be stopped than white people – the second greatest disparity in the country. In West Yorkshire, which had the smallest disparity, black people were still 3.4 times more likely to be stopped and scanned than white people.
“This technology can only be used where an offence (or suspected offence), has been committed, and where identity is doubted,” says a spokesperson for Surrey Police. They add the force is “acutely aware of, and very sensitive to” concerns about racism within policing.
A spokesperson for West Yorkshire Police says that in 2019 the force began working, on a voluntary basis, with the Racial Justice Network – a West Yorkshire-based community organisation founded to “proactively promote racial justice” – to assure the public that the devices were being used fairly. As part of that work, the force began recording suspect ethnicity data in an effort to monitor the use of its biometric devices.
Many other police forces say that while race-based data for the fingerprint scans does exist within their jurisdiction, they would be unable to provide it because the information is not aggregated centrally and is instead held in officer pocketbooks.
From the first Privacy Impact Assessment, published in 2017, the Home Office has tried to assure the public that the fingerprint scanners it developed would discard biometric information once a scan has been completed. But many British civil rights groups objected to the rollout of these devices because, they argued, it gave frontline police officers mobile access to IABS, the UK’s immigration database, effectively turning police officers into border guards. They were concerned that the devices would be used to disproportionately target ethnic minorities.
Despite condemnations from groups such as Liberty and the Racial Justice Network’s Stop the Scan campaign, the Home Office says scanners are a means of positively identifying people when no other means exists. “In order to check fingerprints, officers must suspect someone of committing a crime or need to urgently identify them for medical reasons,” says a spokesperson for the Home Office. “We are absolutely clear that no one should be targeted because of their race or ethnicity.”
Patrick Williams, a senior lecturer in criminology at Manchester Metropolitan University who has written about racial discrimination and data-driven policing, says that there’s an important distinction to be made between police effectiveness – when the police have worked successfully to reduce crime – and police efficiency, when the police have accomplished the tasks they have set out for themselves.
One October 2019 press release from the Met said it had made 13,000 identifications using the technology without bringing people into custody to confirm their identities. “Rather than focusing on the police who’ve been effective at tackling crime or reducing levels of crime – irrespective of its form – the police are almost presenting themselves as being effective because they’ve managed to use a fingerprint scanner,” Williams says.
Having fingerprints in a government database is not an indicator of criminality, he adds, and a positive identification is not proof that a crime has been committed. “That’s not indicative of success. It’s indicative of police policing who they know. And that often tends to be those same communities who are always subjected to over-policing.”
There is currently no race-based data available on the Met’s use of fingerprint scanners. What we do know is that of more than 51,000 scans conducted by the Met, approximately 44 per cent resulted in a match. This is higher than in other police forces. Home Office data shows that approximately 30 per cent of all scans across the country result in a positive identification.
Williams believes that the fact that we have this figure and not more figures on race-based data reveals the “hardwiring” of discriminatory policing practices. “My concern over the encroachment of tech into policing is that it increasingly allows the police to speak in an objective, independent and almost scientifically informed way,” he says. “We don’t know who the objects of policing are, we don’t know what’s driving the encounters, but the tech is saying: ‘44 per cent correct’.”
As with stop-and-search, under the Police and Criminal Evidence Act 1984 (PACE) a police officer may scan someone’s fingerprints without their permission so long as the officer “reasonably suspects that the person is committing or attempting to commit an offence, or has committed or attempted to commit an offence”. The officer also needs to believe that there is no other way of working out who a person is, or that they may not be telling the truth about their name.
Rebekah Delsol, senior managing policy officer for ethnic profiling at the Open Society Justice Initiative, says that the necessity to provide a name raises further questions about the use of fingerprint scanners. “How are they choosing to scan people?” she says. “If the suspicion is that someone is here illegally, how would you determine that just from appearance?”