UK police working with controversial tech giant Palantir on real-time surveillance network

The Nectar project offers 'advanced data analysis' using a wide range of sensitive personal information

Mark Wilding reports for Liberty Investigates, with Cahal Milmo for the i.


A controversial US spy tech firm has landed a contract with UK police to develop a surveillance network that will incorporate data about citizens’ political opinions, philosophical beliefs, health records and other sensitive personal information.

Documents obtained by i and Liberty Investigates show Palantir Technologies has partnered with police forces in the East of England to establish a “real-time data-sharing network” that includes the personal details of vulnerable victims, children and witnesses alongside suspects.

Trade union membership, sexual orientation and race are among the other types of personal information being processed.

The project has sparked alarm among campaigners who fear it will trample on Britons’ human rights, “facilitate dystopian predictive policing” and enable indiscriminate mass surveillance.

Numerous police forces have previously refused to confirm or deny their links with Palantir, citing risks to law enforcement and national security. However, forces in Bedfordshire and Leicestershire have recently confirmed working with the firm.

Liberty Investigates and i have learned that those projects involve processing data from more than a dozen UK police forces and will serve as a pilot for a potential national rollout of the tech giant’s data mining technology — which has reportedly been used by police forces in the US to predict future crimes.

Concerns have been mounting in recent years about Palantir’s growing influence in the UK’s public sector; in 2023 a government decision to award it a secretive £330 million contract to build and manage a new data platform for the NHS triggered a backlash over its access to sensitive patient records.

Documents now disclosed by Bedfordshire Police in response to a freedom of information request show Palantir is working with the force alongside police in Hertfordshire and Cambridgeshire on a pilot project dubbed ‘Nectar’, aiming to provide “a single, unified view” of data drawn from nine forces initially, with the ambition to “eventually apply it nationally”. A similar project led by Leicestershire Police, involving data from five forces across the East Midlands, is also underway, Palantir has since told reporters.

The document indicates Bedfordshire Police will use the software to “assist with decision making” and “aid in the prevention, detection, and investigation of crimes”. Crime records are combined with other intelligence sources such as financial information to create profiles of individuals including suspects and those “about to commit a criminal offence”, although Palantir denied its software was being used for predictive policing.

The platform also provides intelligence on people “who are or may be victims of crime”, witnesses, children, and vulnerable people.

Senior MPs expressed concerns about the project. David Davis, the former Conservative shadow home secretary, said he would like to see the new system and its legal underpinnings examined by MPs, adding that it raised “multiple concerns” about issues including data deletion and the risk that systems used to plot criminal networks end up flagging individuals unconnected with any wrongdoing.

He said: “There is a real problem with technology being applied to policing without the necessary statutory underpinning and police simply appropriating the powers they want. There are lots and lots of reasons to be concerned by this software and it should be scrutinised by Parliament.”

Labour MP Chi Onwurah, chair of the House of Commons technology select committee, said: “For the digital transformation of government to be successful, people must be able to have confidence in public sector technology. Improving the access and use of data can make public services more effective, but this must be accompanied by the appropriate safeguards and transparency.”

A data protection impact assessment prepared by Bedfordshire Police identifies potential disadvantages of the project including “skepticism or resistance from the public regarding data sharing and privacy concerns” and notes that any data breaches “could have significant consequences”. A cyberattack on the Ministry of Justice last month saw hackers claim to have accessed 2.1m pieces of personal data relating to legal aid applicants.

Palantir – which was founded by US billionaire Peter Thiel in 2003 and received early funding from the CIA’s venture capital arm – has proved highly controversial in its partnerships with law enforcement agencies overseas. In 2017 it was reported to be embroiled in a dispute with the New York Police Department over the transfer of data when its contract ended.

The following year, the City of New Orleans ended a six-year relationship with Palantir after the Verge revealed the firm had been operating a secret “predictive policing” programme, sparking a public backlash. (Palantir disputed this characterisation of its work.) In 2023, police forces in two German states were forced to review their work with Palantir after a court ruled that data processing laws governing some uses of the firm’s software were unconstitutional.

David Nolan, a senior investigative researcher in the algorithmic accountability lab at Amnesty International, said: “The establishment and provision of data-driven law enforcement and predictive policing technologies by companies such as Palantir […] raises severe human rights concerns, particularly given such companies have a history of blatant contempt for human rights. Technologies used for ‘crime prediction’ must be banned.

“The development of a ‘real-time data-sharing network’ across UK law enforcement agencies, that creates a 360 profile of individuals using sensitive personal data, violates people’s right to privacy and establishes a system of indiscriminate mass surveillance.”

In recent years Palantir has drawn criticism for providing software to the Israel Defense Forces for its offensive in Gaza, and for its work supporting US Immigration and Customs Enforcement with deportations. Meanwhile, the company has been rapidly expanding its footprint in the UK’s public services, winning contracts with agencies including the Cabinet Office, the Royal Navy, and the Ministry of Housing, Communities & Local Government.

The Good Law Project, which took legal action against NHS England to force disclosure of the details of its contract with the firm, described news of Palantir’s adoption by UK police forces as “alarming”.

Duncan McCann, tech and data policy lead at the Good Law Project, said: “It’s worth remembering that a majority of NHS trusts have rejected their patient data software as it isn’t up to scratch. However, this pilot now gives the spy-tech firm an open door to roll out controversial software in the future that has been used before in the United States to facilitate dystopian predictive policing which entrenches racial profiling.”

A Palantir spokesman said the firm’s software had helped Bedfordshire Police identify more than 120 young people at risk of abuse or exploitation in its first eight days of operation. “We’re proud our software is helping police improve how they tackle crime, including domestic violence, which is a key part of the ‘Nectar’ pilot,” he said.

“Use of the software does not involve the acquisition of any data that the customer in question doesn’t already hold – it simply organises that data in a way that enables faster, better decision-making.”

He added: “As a matter of company policy, Palantir does not permit its software to be used for racial profiling.”

A spokesperson for the Information Commissioner said: “Any software solution must comply with data protection law. This means it will need to process personal data fairly and lawfully and with due regard to individuals’ rights.

“The ICO works with UK policing to encourage them to embed privacy by design when developing or procuring any new data processing system. Anyone who has concerns about how their personal data may have been handled can complain to us.”

A Bedfordshire Police spokesman said: “[The force] is committed to enhancing productivity and delivering an effective service to the communities we serve. As the landscape of policing evolves, it’s imperative that we evolve with it, which means taking an innovative approach to systems and procedures to allow us to be more efficient.

“Currently, the force is trialling a proof of concept which uses technology developed by Palantir. Unique to Bedfordshire Police, this explorative exercise will assess whether technology can accurately and efficiently analyse data.”

A version of this article was published with the i paper.