UK police forces lobbied to use biased facial recognition technology

Exclusive: System more likely to suggest incorrect matches for images of women and Black people

By Mark Wilding for Liberty Investigates and Daniel Boffey for the Guardian


Police forces successfully lobbied to use a facial recognition system known to be biased against women, young people, and members of ethnic minority groups, after complaining that a more accurate version produced fewer potential suspects.

UK forces use the Police National Database (PND) to conduct retrospective facial recognition searches, whereby a ‘probe image’ of a suspect is compared to a database of more than 19 million custody photos for potential matches.

Last week, the Home Office admitted the technology was biased and said it "had acted on the findings", after a review by the National Physical Laboratory (NPL) found the system misidentified Black and Asian people and women at significantly higher rates than white men. But documents seen by the Guardian and Liberty Investigates reveal that the bias has been known about for over a year – and that police forces argued to overturn a measure that addressed it.

Police chiefs were told the system was biased in September 2024, after a Home Office-commissioned review by the NPL found the system was more likely to suggest incorrect matches for probe images depicting women, Black people, and those aged 40 and under.

The National Police Chiefs’ Council (NPCC) then ordered that the confidence threshold required for potential matches be raised to a level at which the bias was significantly reduced. However, that decision was reversed the following month after forces complained the system was producing fewer ‘investigative leads’. NPCC documents show the higher threshold cut the proportion of searches returning potential matches from 56% to 14%.
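In systems of this kind, a search returns candidate matches only where the algorithm’s similarity score clears a configurable confidence threshold: raising the threshold yields fewer ‘leads’ but fewer false positives, lowering it does the reverse. The sketch below illustrates that trade-off in general terms only; the function, scores and threshold values are hypothetical and do not reflect the PND’s actual algorithm, scoring scale or settings, which have not been disclosed.

```python
# Illustrative sketch only: hypothetical scores and thresholds,
# not the PND's real algorithm, scale or configuration.

def candidate_matches(probe_scores: dict[str, float], threshold: float) -> list[tuple[str, float]]:
    """Return custody-image IDs whose similarity score clears the threshold, best first."""
    return sorted(
        ((image_id, score) for image_id, score in probe_scores.items() if score >= threshold),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical similarity scores for one probe image against a gallery.
scores = {"custody_001": 0.91, "custody_002": 0.78, "custody_003": 0.62}

# Low threshold: more investigative leads, more risk of false positives.
print(candidate_matches(scores, threshold=0.60))
# High threshold: fewer leads, fewer false positives.
print(candidate_matches(scores, threshold=0.85))
```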

Though the Home Office and NPCC refused to say what threshold is currently in use, the recent NPL study also found that, at certain settings, the system produces false positives for Black women almost 100 times more frequently than for white women. When publishing those results, the Home Office said: “The testing identified that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.” The system remains in use while the department develops a replacement.

Describing the impact of the brief increase to the system’s confidence threshold, the documents state: “The change significantly reduces the impact of bias across protected characteristics of race, age and gender but had a significant negative impact on operational effectiveness”, adding that forces complained that “a once effective tactic, returned results of limited benefit”.

Professor Pete Fussey, a former independent reviewer of the Met’s use of facial recognition, said: “This raises the question of whether facial recognition only becomes useful if users accept racial and gender biases. Convenience is a weak argument for overriding fundamental rights, and one unlikely to withstand legal scrutiny.

“Incorrect matches not only bring consequences for those wrongly identified but also diverts police resources and raises questions over the most effective delivery of public safety.”

According to the NPCC documents, as of November this year, police forces had not reported any wrongful arrests resulting from PND facial recognition searches. However, it is not known how many people have been questioned after being wrongly identified as suspects. Shaun Thompson, an anti-knife crime activist from London, is bringing a legal case against the Metropolitan Police after he was wrongly identified by facial recognition cameras. Thompson was not arrested but says he was detained for 30 minutes while he proved his identity.

Data previously obtained by Liberty Investigates revealed that police forces had conducted more than half a million facial recognition searches on the PND between 2018 and 2024, suggesting thousands of suspects could have been misidentified.

Earlier versions of the PND facial recognition system, which has been available to police forces since 2014, performed worse and showed similar problems with bias. In 2021, the system’s search algorithm was updated. Home Office tests the following year reported a seven-fold increase in the number of correct matches returned, but found that “false positives were higher in women than men” and that performance varied across racial groups.

The Home Office said it has procured a new facial recognition algorithm, which is expected to become operational in 2026.

Biometrics and Surveillance Camera Commissioner William Webster said he noted “with interest” last week’s Home Office report confirming the bias.

“It seems to me that this offers a reminder that meaningful oversight requires transparency and the timely publication of relevant research, and that the Home Office should consider as part of its response to the consultation findings, how future oversight and regulators are kept informed by prompt and relevant research data and any knowledge of risks of bias or similar.”

A Home Office spokesperson said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and procured, which has no statistically significant bias. It will be tested early next year and will be subject to evaluation.

“Our priority is protecting the public. This gamechanging technology will support police to put criminals and rapists behind bars. There is human involvement in every step of the process and no further action would be taken without trained officers carefully reviewing results.”

A version of this story was published with The Guardian.