Raphael Boyd 

Met investigates hundreds of officers after using Palantir AI tool

Met says AI software unearthed rule-breaking ranging from work-from-home violations to suspected corruption

[Image: line of Metropolitan police officers seen from behind in high-vis jackets]
The Met said corruption was the most consistent offence detected, with misconduct related to ‘abuse of the IT system that rosters shifts by police officers for personal or financial gain’. Photograph: Cliff Hide General News/Alamy

The Metropolitan police have launched investigations into hundreds of officers after using an AI tool built by the controversial tech company Palantir to root out rogue cops.

The software was deployed by the Met over the course of a week, surveilling staff members using data the force already has ready access to. It unearthed rule-breaking ranging from work-from-home violations to suspected corruption and even alleged criminal offences such as rape.

The Met said the software had uncovered evidence tying a small number of officers to serious misconduct and criminality, leading to the arrest of three officers for offences including abuse of authority for sexual purposes, fraud, sexual assault, misconduct in public office and misuse of police systems.

According to figures cited by the Met, corruption was the most common offence detected by the AI software: 98 officers are being assessed for misconduct related to “abuse of the IT system that rosters shifts by police officers for personal or financial gain”, while a further 500 have received prevention notices over the same offence.

The Met said that 42 senior officers, ranging in rank from chief inspector to chief superintendent, were “being assessed for misconduct for serious noncompliance”. They had sometimes falsely claimed to be in the office when they were working from home, or had otherwise been away from the office for excessive periods, despite the Met’s guidelines stating that in-office attendance must not dip below 80%.

The software also identified officers who had failed to declare that they were Freemasons – now a declarable interest within the force – with 12 officers under investigation for gross misconduct for keeping their membership of the group private, and a further 30 receiving prevention notices for suspected but uncorroborated undeclared membership.

The implementation of the software is the latest example of the Met embracing AI, with the force recently entering negotiations to buy technology from Palantir to aid criminal investigations.

Palantir has connections to ICE, the US agency carrying out Donald Trump’s immigration enforcement programme, and to the Israeli military. Earlier this month, MPs demanded that a £330m contract between Palantir and the NHS be scrapped.

The Met said the software would help “build trust, reduce crime and raise standards” in the UK, citing the introduction of other technologies such as drones and live facial recognition (LFR) as having helped to keep people safe and reduce crime.

The Met commissioner, Mark Rowley, said: “Criminals are constantly adapting how they use technology and policing has to keep pace, not just on the streets but within our own organisation.

“This is the Met using technology, data and stronger legal powers to confront poor behaviour, raise standards and fix our foundations as our communities would expect.

“The vast majority of our officers and staff serve London with dedication and integrity and rightly expect us to act firmly against those who abuse their position or undermine public trust, particularly in leadership roles.

“By bringing together the information we already lawfully hold, we can identify risk earlier, act faster and be fairer and more consistent. Alongside new vetting powers, this gives us the tools we need to remove those who should not be in policing and strengthen culture for the future.”
