Mark Wilding and Daniel Boffey 

UK police forces lobbied to use biased facial recognition technology

Exclusive: System more likely to suggest incorrect matches for images of women and Black people
  
  

Cameras on top of a vehicle driving along Oxford Street in London.
Documents reveal police have known for more than a year that the system was biased. Photograph: Leon Neal/Getty Images

Police forces successfully lobbied to use a facial recognition system known to be biased against women, young people, and members of ethnic minority groups, after complaining that another version produced fewer potential suspects.

UK forces use the police national database (PND) to conduct retrospective facial recognition searches, whereby a “probe image” of a suspect is compared to a database of more than 19 million custody photos for potential matches.

The Home Office admitted last week that the technology was biased, and said it “had acted on the findings”, after a review by the National Physical Laboratory (NPL) found it misidentified Black and Asian people and women at significantly higher rates than white men.

Documents seen by the Guardian and Liberty Investigates reveal that the bias has been known about for more than a year – and that police forces argued to overturn an initial decision designed to address it.

Police bosses were told the system was biased in September 2024, after a Home Office-commissioned review by the NPL found the system was more likely to suggest incorrect matches for probe images depicting women, Black people, and those aged 40 and under.

The National Police Chiefs’ Council (NPCC) ordered that the confidence threshold required for potential matches be increased to a level where the bias was significantly reduced.

That decision was reversed the following month after forces complained the system was producing fewer “investigative leads”. NPCC documents show that the higher threshold reduced the number of searches resulting in potential matches from 56% to 14%.

Though the Home Office and NPCC refused to say what threshold is now in use, the recent NPL study found the system could produce false positives for Black women almost 100 times more frequently than for white women at certain settings.

When publishing those results, the Home Office said: “The testing identified that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.”

Describing the impact of the brief increase to the system’s confidence threshold, the NPCC documents state: “The change significantly reduces the impact of bias across protected characteristics of race, age and gender but had a significant negative impact on operational effectiveness”, adding that forces complained that “a once effective tactic returned results of limited benefit”.

The government has opened a 10-week consultation on its plans to widen the use of facial recognition technology.

Sarah Jones, the policing minister, has described the technology as the “biggest breakthrough since DNA matching”.

Prof Pete Fussey, a former independent reviewer of the Met’s use of facial recognition, said he was concerned by the apparent priorities of police forces.

He said: “This raises the question of whether facial recognition only becomes useful if users accept biases in ethnicity and gender. Convenience is a weak argument for overriding fundamental rights, and one unlikely to withstand legal scrutiny.”

Abimbola Johnson, chair of the independent scrutiny and oversight board for the police race action plan, said: “There was very little discussion through race action plan meetings of the facial recognition rollout despite obvious cross-over with the plan’s concerns.

“These revelations show once again that the anti-racism commitments policing has made through the race action plan are not being translated into wider practice. Our reports have warned that new technologies are being rolled out in a landscape where racial disparities, weak scrutiny and poor data collection already persist.

“Any use of facial recognition must meet strict national standards, be independently scrutinised, and demonstrate it reduces rather than compounds racial disparity.”

A Home Office spokesperson said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation.

“Our priority is protecting the public. This gamechanging technology will support police to put criminals and rapists behind bars. There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results.”
