Rajeev Syal, home affairs editor

Home Office admits facial recognition tech issue with black and Asian subjects

Calls for review after technology found to return more false positives for ‘some demographic groups’ on certain settings
  
  

Facial recognition cameras being used near Arsenal’s Emirates Stadium in north London before a match last month. Photograph: Hannah McKay/Reuters

Ministers are facing calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it is more likely to incorrectly identify black and Asian people than their white counterparts on some settings.

Following the latest testing by the National Physical Laboratory (NPL) of the technology’s use within the police national database, the Home Office said it was “more likely to incorrectly include some demographic groups in its search results”.

Police and crime commissioners said publication of the NPL’s findings “sheds light on a concerning inbuilt bias” and urged caution over plans for a national expansion.

The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the “biggest breakthrough since DNA matching”.

Facial recognition technology scans people’s faces and then cross-references the images against watchlists of known or wanted criminals. It can be used to examine live footage of people passing cameras, comparing their faces with those on wanted lists, or by officers to target individuals as they walk past mounted cameras.

Images of suspects can also be run retrospectively through police, passport or immigration databases to identify them and check their backgrounds.

Analysts who examined the police national database’s retrospective facial recognition technology tool at a lower setting found that “the false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)”.

The testing went on to find that the number of false positives for black women was particularly high. “The FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),” the report said.
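By way of illustration only, a false positive identification rate is the share of test searches for people who are not on a watchlist that nonetheless return a match. The sketch below uses hypothetical numbers, not figures from the NPL report, to show how such a rate is calculated.

```python
# Illustrative sketch only: how a false positive identification rate (FPIR)
# is typically calculated. The figures here are hypothetical and are not
# taken from the NPL report.

def false_positive_identification_rate(searches: int, false_matches: int) -> float:
    """Share of searches of people NOT on the watchlist that still return a match."""
    return false_matches / searches

# Hypothetical example: 1,000 test searches for one demographic group,
# 55 of which wrongly return a watchlist match -> an FPIR of 5.5%.
fpir = false_positive_identification_rate(searches=1_000, false_matches=55)
print(f"FPIR: {fpir:.1%}")  # FPIR: 5.5%
```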

The Association of Police and Crime Commissioners said in a statement that the findings showed an inbuilt bias. It said: “This has meant that in some circumstances it is more likely to incorrectly match black and Asian people than their white counterparts. The language is technical but behind the detail it seems clear that technology has been deployed into operational policing without adequate safeguards in place.”

The statement, signed off by the APCC leads Darryl Preston, Alison Lowe, John Tizard and Chris Nelson, questioned why the findings had not been released at an earlier opportunity or shared with black and Asian communities.

It said: “Although there is no evidence of adverse impact in any individual case, that is more by luck than design. System failures have been known for some time, yet these were not shared with those communities affected, nor with leading sector stakeholders.”

The government announced a 10-week public consultation that it hopes will pave the way for the technology to be used more often. The public will be asked whether police should be able to go beyond their records to access other databases, including passport and driving licence images, to track down criminals.

Civil servants are working with police to establish a new national facial recognition system that will hold millions of images.

Charlie Whelton, a policy and campaigns officer for the campaign group Liberty, said: “The racial bias in these stats shows the damaging real-life impacts of letting police use facial recognition without proper safeguards in place. With thousands of searches a month using this discriminatory algorithm, there are now serious questions to be answered over just how many people of colour were falsely identified, and what consequences this had.

“This report is yet more evidence that this powerful and opaque technology cannot be used without robust safeguards in place to protect us all, including real transparency and meaningful oversight. The government must halt the rapid rollout of facial recognition technology until these are in place to protect each of us and prioritise our rights – something we know the public wants.”

The former cabinet minister David Davis raised concerns after police leaders said the cameras could be placed at shopping centres, stadiums and transport hubs to hunt for wanted criminals. He told the Daily Mail: “Welcome to Big Brother Britain. It is clear the government intends to roll out this dystopian technology across the country. Something of this magnitude should not happen without full and detailed debate in the House of Commons.”

Officials say the technology is needed to help catch serious offenders. They say there are manual safeguards, written into police training, operational practice and guidance, that require all potential matches returned from the police national database to be visually assessed by a trained user and investigating officer.

A Home Office spokesperson said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation.

“Given the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of the mitigations, which the National Police Chiefs’ Council supports.”

 
