Live facial recognition cameras may become ‘commonplace’ as police use soars

CCTV control room for Birmingham City Centre.
An investigation by Liberty Investigates and the Guardian has revealed police scanned nearly 5m faces with facial recognition cameras last year

Mark Wilding reports for Liberty Investigates and Daniel Boffey for the Guardian.


Police believe live facial recognition cameras may become “commonplace” in England and Wales, according to internal documents, with the number of faces scanned having doubled to nearly 5m in the last year.

A joint investigation by Liberty Investigates and the Guardian today highlights the speed at which the technology is becoming a staple of British policing, with documents released following freedom of information (FOI) requests revealing a series of projects that seek to ramp up officers’ facial recognition capabilities.

Major funding is being allocated and hardware purchased while the state is also looking to enable police forces to more easily access the full spread of its image stores, including passport and immigration databases, for retrospective facial recognition searches.

Live facial recognition (LFR) involves the matching of faces caught on surveillance camera footage against a police watchlist in real time, in what campaigners liken to the continuous fingerprinting of members of the public as they go about their daily lives.

Retrospective facial recognition software is used by the police to match images on databases with those caught on CCTV and other systems.

According to one funding document drawn up by the Metropolitan Police as part of a now-scaled back plan to put London’s West End under surveillance with fixed live facial recognition cameras, it is believed “the use of this technology could become commonplace in our city centres and transport hubs around England and Wales”.

The first fixed live facial recognition cameras will instead be fitted for a trial in Croydon, south London, later this summer, though the Met has said the West End plans may be revived.

The expansion comes despite facial recognition not being referenced in any act of Parliament.

Campaigners claim the police have so far been allowed to “self-regulate” their use of the technology. Officers have previously used facial recognition at a setting that has been shown to have a bias towards misidentifying black people, rather than at a more precise threshold.

Following a Court of Appeal judgment in 2020 which found that South Wales police’s use of live facial recognition cameras had been unlawful, the College of Policing provided guidance that “the threshold needs to be set with care to maximise the probability of returning true alerts while keeping the false alert rate to an acceptable level”.

There remains nothing in law to direct forces on the threshold or technology used.

The policing minister, Diana Johnson, told Parliament earlier this month that she recognised “a need to consider whether a bespoke legislative framework governing the use of live facial recognition technology for law enforcement purposes is needed”, but the Home Office is yet to bring forward any bill.

4.7 million faces scanned by police facial recognition cameras last year

Facial recognition cameras were first trialled in London and south Wales in 2016 and 2017 respectively, but the speed at which police forces are rolling out the technology has accelerated over the last 12 months. It can be revealed:

  • Police forces scanned nearly 4.7m faces with live facial recognition cameras last year, according to analysis of data published by forces using the technology — more than twice as many as in 2023. Live facial recognition vans were deployed at least 256 times in 2024, official deployment records show, up from 63 the year before
  • Documents released under FOI laws show a Home Office-funded roving unit of 10 live facial recognition vans that can be sent anywhere in the country will be made available within days — increasing national capacity. Until now, only eight of the UK’s 48 police forces are known to have deployed the technology – Bedfordshire, Essex, Hampshire, Northamptonshire, London Metropolitan, North Wales, South Wales and Suffolk. The Met, one of the main early adopters of live facial recognition, currently has four vans
  • Two police forces – South Wales and the Met – bid for £1.6m from the Home Office to install permanent fixed infrastructure creating a ‘zone of safety’ by covering the West End of London and Cardiff train station with a network of live facial recognition cameras. Met officials told reporters the West End plan remains a possibility
  • Forces almost doubled the number of retrospective facial recognition searches made last year using the Police National Database (PND) from 138,720 in 2023 to 252,798, according to figures released in response to a separate FOI request. The PND contains custody mug shots, millions of which have been found to be unlawfully retained images of people who have never been charged with or convicted of an offence
  • More than a thousand facial recognition searches using the UK passport database were carried out in the last two years, and officers are increasingly searching for matches on the Home Office immigration database, with requests rising to 110 last year. Officials have concluded that using the passport database for facial recognition is “not high risk” and “is not controversial”, according to internal documents, noting that the department had not sought advice from the Information Commissioner
  • When Liberty Investigates first revealed the practice of using photographs in the passport database last year, the Commissioner said he would be raising the matter with the Home Office — but has since declined to confirm what action was taken. A spokesperson for the Commissioner said: “We are working with police forces to ensure that [facial recognition] technology is effective, and people’s rights are protected. Our conversations with the Home Office on the use of the passport database are ongoing and form part of this work.”
  • The Home Office is currently working with the police to establish a new national facial recognition system, known as Strategic Facial Matcher. The platform will be capable of searching a range of databases including custody images and immigration records.

Lindsey Chiswick, director of intelligence at the Met and the National Police Chiefs’ Council (NPCC) lead on facial recognition, said surveys showed that four in five Londoners were in support of the police using innovative technology, including facial recognition cameras.

This week, a registered sex offender, David Cheneler, 73, from Lewisham, was jailed for two years after he was caught alone with a six-year-old girl by a live facial recognition camera. He had previously served nine years for 21 offences against children.

Chiswick said: “Where there’s limited amounts of money and there’s fewer officers, but there’s more demand, and we see criminals exploiting technology to a really grand scale… We’ve got to do something different.

“There’s an opportunity out there. So policing needs to start operating a little bit differently. People talk about harnessing AI [artificial intelligence] like it’s some crazy horse we want to saddle but we do need to harness the opportunities that technology and data can bring us.”

Chiswick said that the Met’s policy was to take “really quite small steps and review them at every stage” but that there would be a “benefit in potentially some sort of framework or statutory guidance”.

The Met is currently deploying its facial recognition cameras at a setting that testing suggests avoids any statistically significant gender or ethnicity bias when it comes to cases of misidentification.

Chiswick said: “I don’t want to use a biased algorithm in London. There’s no point on all counts. I think for government, there’s a question, isn’t there around artificial intelligence? And I think clearly the public sector is going to use, and want to use AI more and more.

“I think the questions around who then decides where algorithms are purchased from, what training data is used, what countries might this technology come from and then, when you use it, are you obliged to test it and if you’re obliged to test it, are you then obliged to operate at a certain setting? That’s not really questions for law enforcement.”

The Home Office declined a request for comment.

A version of this article was published in partnership with the Guardian.