Police are unlawfully storing personal data of suspects after they’re cleared

Police scientist checking fingerprint records (Sendo Serra / Alamy Stock Photo)

Mark Wilding reports for Liberty Investigates and Anita Mureithi for openDemocracy.

Police forces are unlawfully storing sensitive data of potentially millions of former suspects who have never been charged with a crime, an investigation has found.

Reports obtained by openDemocracy and Liberty Investigates reveal the government’s biometrics watchdog has repeatedly raised concerns about police breaching rules by retaining information of people who had been arrested and then released.

Fraser Sampson, the biometrics and surveillance camera commissioner, highlighted data protection issues in 17 inspections over the past two years. He told us police forces have failed to get a grip on the problem, in part because their ageing computer systems don’t allow them to delete data entries in bulk.

“It really isn’t good enough,” Sampson said. “Not only do you have potentially millions of people whose images are in police records, even though there are no guilty findings against them, but you can’t even know how many there are… It is an intractable problem.”

Campaigners and police monitoring groups say the revelations are yet another example of why public confidence in police is so low, particularly in communities that continue to be overpoliced.

More than half the forces examined by the watchdog were found to be indefinitely retaining custody images, including those of people who were never charged or convicted of a crime – despite a 2012 ruling finding the practice unlawful.

Several other forces were warned about their handling of DNA samples. They included Staffordshire Police, which was found to have adopted a blanket retention policy going far beyond what should only be used as an “exceptional power”.

Sampson also found at least four forces were routinely searching the fingerprints of all arrestees against databases including the Immigration and Asylum Biometrics System, and questioned whether it was proportionate to run immigration checks on people who haven’t even been charged with a crime.

He told openDemocracy and Liberty Investigates: “If you have no reason to believe there is any immigration or asylum issue involved in an arrest, why would you check [fingerprints] against the database? I haven’t had a convincing response to that question.”

And in one of the most significant data breaches, the Metropolitan Police was found during an inspection in July 2021 to be unlawfully holding almost 300,000 fingerprint records on a counter-terrorism database. The Met had received the records from foreign law enforcement agencies but had not carried out the checks required to retain them lawfully.

Nour Haidar, lawyer and legal officer at Privacy International, said the findings show police are “not taking data protection seriously”.

She added: “It also shows there is little institutional understanding of the potential harms to individuals and our communities from the widespread collection and storage of their biometrics, and the impact this has on the right to privacy as well as the right to be free from surveillance while participating in civic life.”

Kojo Kyerewaa of Black Lives Matter UK drew parallels with the Met’s Gangs Matrix – a list of alleged gang members that disproportionately targeted Black men and was last year found to be unlawful after a legal challenge by Liberty.

Kyerewaa said our investigation “reinforces the belief that Black communities should not put their trust in police… and gives further reasoning to our calls that the police do not make us feel safe”.

He added: “This shows that there is no regard for protecting people who have not been found guilty.”

According to the Institute of Race Relations, there’s no guarantee automated or data-driven systems do not reinforce discrimination. As of March this year, more than 82% of people on the Gangs Matrix are Black, Asian or from other minority ethnic backgrounds, and 73% are Black. Black people are also seven times more likely to be stopped and searched than white people.

A member of the Copwatch police monitoring network said the revelations came as “no surprise”, adding: “It just goes to illustrate that they can’t manage themselves, and they can’t be trusted to be handling the data in that way.”

A 2017 Home Office review of the use and retention of custody images advised police forces to regularly review and delete photographs.

However, at 11 of the 17 forces inspected in the past two years, the commissioner found “custody images are not proactively reviewed and deleted where appropriate, unless an individual makes a specific request for deletion”. Concerns about this were also raised by both of Sampson’s predecessors in the biometrics commissioner role.

Many of these images go on to be uploaded to the Police National Database, which currently holds more than 16 million images that can be searched using facial recognition technology.

The inspections found many forces were “having difficulty” with deleting custody images because “the process is largely manual and very time consuming”.

Katrina Ffrench, founder and director of UNJUST UK, a group that challenges discrimination in the criminal justice system, described the findings as “deeply troubling” and suggested there was a “systemic effort to undermine data protection laws”.

“We know that overpolicing, especially for racialised communities, is an ongoing reality, so we’re concerned that the images being retained will disproportionately be of Black, Brown and dual heritage people,” she added.

Police chiefs have recognised the legal and reputational risk posed by unlawfully held custody images. In a letter to forces last year, obtained by openDemocracy and Liberty Investigates, the National Police Chiefs’ Council warned the retention of custody images “poses a significant risk in terms of potential litigation, police legitimacy and wider support and challenge in our use of these images for technologies such as facial recognition”.

“People may assume that because they’re not found guilty and nothing’s happened, the information will be disposed of, not realising that actually, it’s been shared,” Ffrench said.

Evidence from the Institute of Race Relations shows disproportionality in policing goes beyond stop and search and racial profiling. It extends to the use of force, arrests, strip-searches, fines, sentencing, imprisonment and the use of artificial intelligence and automated systems which rely on data collection.

Human rights campaigners have called on the government to take action. Damini Satija, head of the Algorithmic Accountability Lab and deputy director at Amnesty Tech, said: “These findings raise worrying concerns about the UK police forces’ continued exercise of blanket and indiscriminate retention of biometric data. We call for an in-depth review of existing laws, policies and practices on retention and use to understand the full extent of their impact on human rights.”

A Met spokesperson said: “Since the 2021 inspection, the Met has deleted all fingerprint data which was deemed to be held unlawfully. We were unable to delete this data in bulk, and it therefore had to be deleted manually over a period of time. However, as an interim solution, the data was made unsearchable and inaccessible as soon as the issue was identified.”

They added the force “takes the processing, retention and deletion of data very seriously and we always endeavour to adhere to all legal and regulatory requirements.”

Chief constable Jo Farrell, the NPCC’s data lead, said: “This is an incredibly complicated area for law enforcement to manage and the law has changed repeatedly. As such, we have been working to clarify the system under which retention, and deletion, can be undertaken and a programme team has been set up to accelerate this activity.”

This article was published in partnership with openDemocracy.