A police officer watches a television monitor looking at London’s CCTV camera network. Photograph: Daniel Berehulak/Getty Images

UK police use of computer programs to predict crime sparks discrimination warning


Human rights group claims the algorithms threaten a ‘tech veneer to biased practices’

The rapid growth in the use of computer programs to predict crime hotspots and people who are likely to reoffend risks locking discrimination into the criminal justice system, a report has warned.

Amid mounting financial pressure, at least a dozen police forces are using or considering predictive analytics. Leading police officers have said they want to make sure any data they use has “ethics at its heart”.

But a report by the human rights group Liberty raises concern that the programs encourage racial profiling and discrimination, and threaten privacy and freedom of expression.

Hannah Couchman, a policy and campaigns officer at Liberty, said that when decisions were made on the basis of arrest data it was “already imbued with discrimination and bias from the way people policed in the past” and that was “entrenched by algorithms”.

She added: “One of the key risks with that is that it adds a technological veneer to biased policing practices. People think computer programs are neutral but they are just entrenching the pre-existing biases that the police have always shown.”

Using freedom of information data, the report finds that at least 14 forces in the UK are using algorithmic prediction programs for policing, have done so previously, or have conducted research and trials into them.

The campaign group StopWatch said it had “grave concerns around the effectiveness, fairness and accountability of these programs”. Its chief executive, Katrina Ffrench, said: “We cannot be sure that these programs have been developed free of bias and that they will not disproportionately adversely impact on certain communities or demographics. For proper accountability there needs to be full transparency.”

These programs are often referred to as “black boxes” because the role each piece of data plays in the program’s decision-making process is not made public.

“This means the public can’t hold the programs to account – or properly challenge the predictions they make about us or our communities. This is exacerbated by the fact that the police are not open and transparent about their use,” the Liberty report concludes.

The programs used by police work in two main ways. Firstly, predictive mapping looks at police data about past crimes and identifies “hotspots” – areas on a map that are likely to experience more crime. Police officers are then directed to patrol these areas.

Secondly, “individual risk assessment” tries to predict the likelihood of a person committing, or even being the victim of, certain crimes.
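For illustration only, below is a minimal sketch of the first approach, hotspot mapping. It is not the code of any system named in this article; it simply bins historical incident coordinates into grid cells and flags the busiest cells, which is the basic pattern such mapping tools follow. The function name, grid size and example coordinates are all hypothetical.

```python
# Toy "hotspot" mapping sketch: bin past incident locations into grid cells
# and report the cells with the most recorded crime. Illustrative only.
from collections import Counter

def hotspot_cells(incidents, cell_size=0.005, top_n=5):
    """incidents: iterable of (latitude, longitude) pairs from past crime records."""
    counts = Counter(
        (lat // cell_size, lon // cell_size) for lat, lon in incidents
    )
    # The most frequently hit cells become the "hotspots" officers are sent to patrol.
    return counts.most_common(top_n)

# Hypothetical example data: three incidents clustered together, one elsewhere.
past_incidents = [
    (51.5074, -0.1278), (51.5076, -0.1279), (51.5073, -0.1277),
    (51.5200, -0.1000),
]
print(hotspot_cells(past_incidents, top_n=2))
```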

Durham is among the forces using such programs; it has a system called the Harm Assessment Risk Tool (Hart), the report says. Hart uses machine learning to decide how likely a person is to commit a violent or non-violent offence over the next two years. It gives an individual a risk score of low, medium or high, and is designed to overestimate the risk. The program bases its prediction on 34 pieces of data, 29 of which relate to someone’s past criminal history.
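Again purely for illustration, here is a rough sketch of what an individual risk-assessment tool of the kind described above might look like. This is not Hart itself, whose model and data are not public; it assumes a generic classifier (scikit-learn’s RandomForestClassifier), randomly generated stand-in data with 34 features, and made-up probability cut-offs set low so that borderline cases are pushed into higher bands, mirroring the reported design choice to overestimate risk.

```python
# Illustrative sketch, not Durham's actual Hart code: a classifier trained on
# past records that outputs a low/medium/high risk band.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical training data: 34 features per person (standing in for the
# 34 pieces of data, most of them criminal history), with a binary label
# for reoffending within two years.
X_train = rng.random((500, 34))
y_train = rng.integers(0, 2, 500)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

def risk_band(features):
    """Map a predicted reoffending probability to a band. The cut-offs are
    deliberately low so borderline cases are nudged upwards ("overestimate")."""
    p = model.predict_proba([features])[0, 1]
    if p >= 0.5:
        return "high"
    if p >= 0.25:
        return "medium"
    return "low"

print(risk_band(rng.random(34)))
```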

West Midlands police are also leading on a £48m project funded by the Home Office called National Data Analytics Solution (NDAS). The long-term aim of the project is to analyse vast quantities of data from force databases, social services, the NHS and schools to calculate where officers can be most effectively used. An initial trial combined data on crimes, custody, gangs and criminal records to identify 200 offenders “who were getting others into a life on the wrong side of the law”.

Supt Iain Donnelly, who is the project manager for NDAS, said: “[The project] seeks to use advanced analytics, otherwise known as data science techniques, to generate new insights from existing data already in the possession of police.”

He said the datasets being used were crime recording, incident logs, custody records, crime intelligence and conviction history from the police national computer (PNC) system. “We are not using data from non-police agencies,” he said.

Tom McNeil, strategic adviser to the West Midlands police and crime commissioner, said: “We are determined to ensure that any data science work carried out by West Midlands police has ethics at its heart … These projects must be about supporting communities with a compassionate public health approach.” He said they had adopted a “transparent approach”, working with human rights charities.

Until last March, Kent police used PredPol, a mapping program widely deployed in the US. The force is looking to invest in a similar predictive policing program available at a lower cost, or may develop its own. Kent said the £100,000 a year system was part of its focus on “finding innovative ways of working resourcefully” and that it was under ongoing analysis.

Avon and Somerset police use both mapping programs and a broad range of controversial risk assessment programs. They use the latter to explore, among other things, a person’s likelihood of reoffending, of being a victim of a crime and of being reported missing.

“With so many predictive analytics programs or algorithms now in use it’s even more important than ever to be asking questions about how an individual’s risk is calculated, which factors are included and what is the margin of error when using these factors, [and] is someone asking whether the ‘risk factors’ are as accurate for black or BME people as they are for white people?” said Zubaida Haque, the deputy director at the Runnymede Trust.
