Essex police pauses facial recognition camera use after study finds racial bias

Academics find Black people ‘significantly more likely’ to be identified compared with other ethnic groups

By Mark Wilding for Liberty Investigates and Robert Booth for the Guardian


Essex police has paused its use of live facial recognition (LFR) technology after a study found cameras were significantly more likely to target Black people than people of other ethnicities.

The move to suspend use of the AI-enabled systems was revealed by the Information Commissioner’s Office (ICO), which regulates the use of the technology deployed so far by at least 13 police forces in London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.

The ICO said Essex police had paused its LFR deployments “after identifying potential accuracy and bias risks” and had warned other forces to ensure mitigations are in place. LFR systems are either mounted at fixed locations or deployed in vans. In January, the home secretary, Shabana Mahmood, announced the number of LFR vans would increase fivefold, with 50 available to every police force in England and Wales.

Essex police commissioned University of Cambridge academics to conduct a study, which involved 188 actors walking past cameras deployed from marked police vans in Chelmsford. The results, published last week, showed that about half of the people on a watchlist were correctly identified, and that incorrect identifications, known as ‘false positives’, were extremely rare. But the system was more likely to correctly identify men than women, and it was “statistically significantly more likely to correctly identify Black participants than participants from other ethnic groups”.

This “raises questions about fairness that require continued monitoring”, the report concluded. One of its authors, Matt Bland, a criminologist, told the Guardian and Liberty Investigates: “If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re Black. To me, that warrants further investigation.”

Although the Cambridge report says false positives were rare, it also notes that four of the six people misidentified in the study were Black, despite Black people representing only one in four of the sample, and says this was “unlikely to be due to chance alone”.
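As a rough sanity check of that claim (a back-of-the-envelope sketch, assuming each misidentification is an independent draw from a sample that is one-quarter Black, an assumption the study itself does not spell out), the probability of such a skew arising by chance works out at under 4%:

```python
from math import comb

# Probability that 4 or more of 6 misidentifications fall on Black
# participants, if each misidentification were an independent draw
# from a sample in which Black people make up one in four (p = 0.25).
p, n = 0.25, 6
prob = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(4, n + 1))
print(f"P(4+ of 6 are Black by chance): {prob:.3f}")  # ~0.038, i.e. under 4%
```

That falls just inside the conventional 5% threshold for statistical significance, consistent with the report’s wording that the pattern was “unlikely to be due to chance alone”.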

Last month Liberty Investigates and the Guardian revealed that police arrested a man for a burglary committed 100 miles (161km) away in a city he had never visited, after retrospective face-scanning software confused him with another person of south Asian heritage.

Possible reasons for the latest issue with LFR include overtraining of the algorithm on the faces of Black people. Experts believe it could be rectified by adjusting system settings. A separate study of the same technology by the government’s National Physical Laboratory found Black men were most likely to be correctly matched by the system and white men least likely, but the effect was not statistically significant.

The Home Office has said LFR cameras deployed in London from January 2024 to September 2025 led to more than 1,300 arrests of people wanted for crimes including rape, domestic abuse, burglary and grievous bodily harm. But opponents of facial recognition technology said the latest research showed warnings about bias in LFR technology were being borne out.

“Police across the country must take note of this fiasco,” said Jake Hurfurt, the head of research and investigations at Big Brother Watch. “AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”

A spokesperson for Essex police said: “Based on the fact there was potential bias the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software. We then sought further academic assessment.

“As a result of this work we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community.”