Hundreds of thousands of innocent people on police databases as forces expand use of facial recognition tech

David Warren / Alamy Stock Photo

CCTV control room for Birmingham City Centre.
Police chiefs claim retrospective facial recognition searches show ‘immense potential’. But campaigners say databases contain ‘hundreds of thousands’ of innocent individuals

Mark Wilding reports for Liberty Investigates and Cahal Milmo for the i.


The use of facial recognition technology that deploys algorithms to scour a database of millions of custody images, including large numbers of individuals never charged with a crime, has rocketed, despite warnings that it is opaque and open to abuse.

An investigation by i and Liberty Investigates has revealed that there was a 330 per cent increase last year in the number of searches using a form of facial recognition to match images of suspects against the Police National Database (PND), which holds more than 16 million images of arrested individuals. These are compared with pictures of suspects taken from CCTV, mobile phone, dashcam or doorbell footage.

Campaigners told i that there is a disturbing risk that innocent individuals could be pinpointed by the system because the PND and other databases kept by individual forces have retained hundreds of thousands of images of Britons who were never charged, or were later cleared of a criminal offence following their arrest.

There have been concerns about secrecy surrounding the use of the forensic tool – known as retrospective facial recognition (RFR) technology – and the extent to which the public are aware of it.

Earlier this year, 13 of the UK’s 45 territorial police forces denied having used RFR in 2022, despite Home Office figures showing that they had carried out thousands of searches between them.

Chris Philp, the Policing Minister, last week acknowledged that – despite previous police denials – all 45 UK territorial police forces were now using the technology, adding that he expects it to have “an enormous impact on our ability to lock up criminals”.

Senior officers have also extolled the virtues of the technology – with the Commissioner of the Metropolitan Police, Sir Mark Rowley, this month saying the technique has “immense potential” and could transform crime-fighting in the same way as DNA some three decades ago.

But human rights groups and the Government’s own surveillance watchdog have raised serious concerns about the rapid rollout of RFR, saying there is a lack of transparency in its use.

Home Office data disclosed to i and Liberty Investigates under freedom of information rules shows that the number of RFR searches of the national database carried out by forces last year reached a little over 85,000 – more than three times as many as in 2021. Figures for the first four months of this year suggest that this year’s total is on course to exceed 130,000 – a further 52 per cent annual increase.

Scotland Yard is by far the most prolific user of the technology, boosting its number of searches last year by more than 700 per cent from 3,030 to 24,677 – more than five times greater than the next highest figure of 4,727 searches by the British Transport Police. In Merseyside, police RFR searches rose by nearly 2,000 per cent, from 190 to 3,896.

85,000+ Retrospective facial recognition searches by UK police forces in 2022

Despite the enthusiasm of law enforcement leaders, the effectiveness of RFR remains unclear. When i and Liberty Investigates this week asked the Met to detail the number of arrests made using RFR, the force said such data was not available.

At the same time, the Yard revealed that it had already extended its use of the technology to so-called “cold cases” and is reviewing “historical casework” to see if hitherto unidentified individuals can now be traced.

The sharp increase in the use of the technology is understood to be at least in part due to the introduction since November 2021 of new algorithms allowing the PND to be searched more efficiently.

In a speech to the Police Superintendents Association last week, Mr Philp said he had seen the system being used to match suspects with poor-quality CCTV camera footage which he had assumed could yield no leads. Mr Philp told officers: “I would strongly encourage anyone to try and recover every image [and] run [it] through the PND database, and it’s quite likely you’ll get a hit.”

The Home Office is leading attempts to expand the use of facial recognition technologies within policing and earlier this summer ministers called for new ideas from tech companies on the use of facial features to “resolve identity”.

The department said facial recognition is helping police to tackle serious crimes including murder, rape, child sexual exploitation and terrorism. A spokesperson added: “New technology is key to more effective policing and we are supportive of more police forces using facial recognition, in a fair and proportionate way.”

But campaigners and other experts have cast doubt on whether such fairness is sufficiently enshrined within the existing system.

Retrospective facial recognition is different from its better-known and more controversial cousin, live facial recognition, which uses cameras set up in public places to scan passers-by in real time against a police watch list in order to identify wanted individuals. With RFR, images such as CCTV footage of a suspect committing an offence, or of an individual wanted for an offence, are compared with the PND or separate intelligence databases held by forces to try to identify the wanted person.

This retroactive version of the technology nonetheless raises its own battery of concerns, not least the extent to which the public are aware that it goes on and how it is used.

Humberside Police, one of the forces which had denied using the technology, said its denial had mistakenly referred to use of “live” facial recognition, adding that while it had conducted retrospective searches it had no “in-house” capability for such queries and was reliant on another force.

“It’s almost impossible for the citizen to understand who is using what to watch whom”

Fraser Sampson, biometrics and surveillance camera commissioner

Fraser Sampson, the Government’s biometrics and surveillance camera commissioner, who has a mandate to ensure the use of facial recognition is lawful and proportionate, said the level of transparency around the technology “falls far short” of what is needed, adding: “It’s almost impossible for the citizen to understand who is using what to watch whom.”

Daragh Murray, senior law lecturer at Queen Mary University and a leading expert on surveillance technology, raised concerns that police forces may be breaching their legal obligations by failing to publish policies on just how and when they are making use of facial recognition. He said: “This raises serious questions regarding the police’s commitment to human rights and their respect for the court system.”

A key issue with the retrospective technology is that the PND and other law enforcement databases continue to contain the images of significant numbers of individuals whose details have been retained despite the fact they were never charged, or were subsequently cleared of any charge.

Despite the Court of Appeal ruling more than a decade ago that the retention of these images is unlawful, forces have been repeatedly found to be keeping this data indefinitely. Under existing rules, the only way for an innocent individual to be sure their image does not feature on the PND is to formally request that it be removed.

But with fewer than an estimated 5,300 such requests being made each year, campaigners argue that tens of thousands of images of innocent people are instead being added to the PND annually, because few forces proactively delete custody photos where no further action is taken against an individual. Writing in 2021, Mr Sampson said that UK police forces continue to “retain the vast majority of their custody images indefinitely, regardless of whether the individual has been convicted of an offence”.

In a report earlier this year, Big Brother Watch, a privacy rights campaign group, pointed to an assessment by one of Mr Sampson’s predecessors that the PND holds “hundreds of thousands” of custody pictures of innocent people, adding that the Home Office is unable to say how many unlawfully held images remain on the system. The campaign group said it believes “millions” of such images have been wrongfully retained, and that glitches in the RFR technology could lead to those individuals being wrongly matched as suspects by the system.

Madeleine Stone, senior advocacy officer at Big Brother Watch, said there was a dangerous lack of safeguards and parliamentary oversight in the way RFR is being used. She told i: “Police forces’ ongoing failure to comply with the legal requirement to delete the custody images of unconvicted people means that innocent people could find themselves wrongly labelled as criminal.

“We urgently need a democratic, lawful approach to the role of facial biometrics in Britain, and without this, police forces should not be using this Orwellian technology at all.”

Police forces have long insisted that legacy computer systems mean custody images have to be removed individually and that they do not have the resources to weed out all wrongly retained data, despite a private warning from the National Police Chiefs’ Council (NPCC) last year that failure to do so presents a “challenge in our use of these images for technologies such as facial recognition”.

The NPCC insisted that RFR technology allows police to both identify and eliminate suspects with greater speed and accuracy, adding that it had developed guidance for forces to explain to communities their approach to retaining and disposing of images from the custody database. Lindsey Chadwick, the body’s lead on facial recognition, added: “The database itself remains an effective tool from which we can identify harmful offenders and keep the public safe.”

The Metropolitan Police said facial recognition provided “fantastic opportunities” for more effective policing. A spokesperson added: “As the technology has improved, the Met has increased its use of retrospective facial recognition technology as part of its investigative tool kit. We are also reviewing historical casework of unidentified individuals.”

This article was published in partnership with the i.

This article was corrected to read “all 45 UK territorial police forces” rather than 43 as originally stated.