Facial recognition error prompts police to arrest Asian man for burglary 100 miles away

Exclusive: Alvi Choudhury is claiming damages against Thames Valley police after biased technology confused him with a man looking ‘10 years younger’

By Mark Wilding for Liberty Investigates and Robert Booth for the Guardian


Police arrested a man for a burglary in a city he had never visited after nationally deployed face-scanning software confused him with another person of south Asian heritage.

Alvi Choudhury, 26, a software engineer, was last month working at the home he shares with his parents in Southampton when police knocked on his door, handcuffed him and held him in custody for close to 10 hours before releasing him at 2am.

Thames Valley Police had used automated facial recognition software which matched him with footage of a suspect in a £3,000 burglary 100 miles away in Milton Keynes, according to documents shared with the Guardian by Liberty Investigates.

But the CCTV footage showed a noticeably younger man with different features apart from similar curly hair, said Choudhury, who was left baffled and suspecting racism in the arrest decision.

“I was very angry, because the kid looked about 10 years younger than me,” said Choudhury, who wears a beard. “Everything was different. Skin was lighter. Suspect looked 18 years old. His nose was bigger. He had no facial hair. His eyes were different. His lips were smaller than mine.”

“I just assumed that the investigative officer saw that I was a brown person with curly hair, so I was a brown person with curly hair, and decided to arrest me,” he said.

UK police use an algorithm procured by the Home Office from Cognitec, a German company. It runs some 25,000 monthly searches against around 19 million police mugshots held on the UK-wide police national database. Facial matches should be treated as intelligence, not fact, according to the National Police Chiefs’ Council, and Thames Valley Police said the decision to arrest Choudhury also followed a human visual assessment.

But the technology was revealed in December to produce a far higher rate of false positives for black (5.5%) and Asian (4.0%) faces than for white faces (0.04%) at certain settings, according to Home Office-commissioned research. Police and crime commissioners warned of “concerning in-built bias”, and said that while “there is no evidence of adverse impact in any individual case, that is more by luck than design”.

Since December Thames Valley Police has also been deploying live facial recognition technology to scan the public in locations in Oxford, Slough, Reading, Wycombe and Milton Keynes. It has captured around 100,000 faces, leading to six arrests.

Given the differences between the man on the CCTV and his own face, Choudhury assumed he would be quickly freed. He offered evidence of work meetings in Southampton on the day of the crime but he was instead taken into custody.

Choudhury has filed a complaint against Thames Valley Police. His neighbours saw him being led away in handcuffs, his father was very anxious about him being held and he was unable to work the following day, he said. He is also calling for greater transparency about the number of wrongful arrests involving facial recognition technology.

Choudhury’s mugshot was held on the police system only because he had previously been wrongly arrested in 2021, after he was attacked on a night out while at university in Portsmouth. The police released him with no further action. Now that he has had a second mugshot taken, he is afraid the automated system may trigger more wrongful arrests.

“In my head, if a brown person in Scotland robs a bank are they going to come and arrest me?” he said.

He sometimes needs security clearance to work for government clients and is asked about arrests. “This makes me look dodgier and dodgier,” he said.

Thames Valley Police admitted to Choudhury the arrest “may have been the result of bias within facial recognition technology”. But an officer told him that “as the use of facial recognition is already subject to review at a strategic level, I do not feel the need to raise this issue as part of wider organisational learning”.

A Thames Valley Police spokesperson denied the arrest was unlawful and said: “While we apologise for the distress caused to the complainant in this case, their arrest was based on the investigating officers’ own visual assessment that the individual matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling.”

But Choudhury said officers at the Hampshire police station laughed when he asked: “Does this look anything like me?” And he said the Thames Valley Police officers who arrived to interview him said “they knew I wasn’t the suspect after looking at footage of the suspect and looking at my picture”.

Warnings have been repeatedly raised about the use of automated facial recognition technology. In December 2024 the UK’s biometrics and surveillance camera commissioner, William Webster, voiced concern that police continue to retain and use images of people who, having been arrested, have never subsequently been charged or summonsed. Last month, South Wales police paid damages to a black man who was wrongfully arrested and held for 13 hours after being misidentified by facial recognition technology.

Choudhury’s lawyer, Iain Gould, a partner at DPP Law, said police “must ensure that artificial intelligence is not substituted for human intelligence and due diligence, but instead is used in careful partnership with it”.

The Home Office said guidance and training to minimise error and maintain public confidence in retrospective facial recognition are under review by the Police Inspectorate. It said a new national facial matching system is under development, with an improved, independently tested algorithm.

Habib Kadiri, executive director at the police accountability group StopWatch, said: “This incident exposes exactly the sort of problem we and other organisations have warned about for years.

“Wrongful arrests from misidentifying individuals will continue to happen, too often to believe police claims that facial recognition technologies are significantly better at detecting criminals. This is because police error, data mismanagement, and racial bias are still baked into the system.”

A version of this story was published with The Guardian.

Facial recognition technology. Credit: Alamy / Sergei Babenko