
Palantir uses AI to check the Metropolitan Police for corruption. Ethically indefensible but legally airtight.

The Metropolitan Police monitored its own people without telling them. That's called oversight.

In the space of one week, Palantir combed through the absence records, overtime logs, access logs, and complaint histories of 46,000 officers. No consent. No notification. The result is called trust restoration. Someone coined that phrase, someone signed off on it, someone sent it to the press, and the press ran it. Nobody in that chain laughed. That is the only genuinely surprising thing here.

Palantir already held £500 million in public contracts before any of this happened. This was not a pilot. It was an invoice with a free demonstration attached, carried out on the people footing the bill.

How Palantir flagged law enforcement as a liability

An algorithm linking overtime to misconduct flagged 615 officers, because that kind of algorithm does what the client needs it to do, and the client needs evidence, not truth. The false-positive question (how many of those 615 were simply burnt out from overwork?) is not missing from the public communication by accident. It's missing because it serves no one. Palantir has its proof of concept. The Metropolitan Police has its narrative of self-cleansing. The British government has its justification for the next contract. Three parties, one story, and 615 people caught inside it without anyone checking whether it held up.

The data was already lawfully held. Legally airtight. Of course it was. The system always makes sure it’s legally airtight before doing something that makes no ethical sense.

The data was the product, the officers the raw material

The 46,000 officers who went through this and the 615 who came out the other end both disappear from the story once they’ve served their purpose: moral lubricant for the opening paragraph, raw material for everything after. Palantir now holds something money can’t buy: public legitimacy as proof of concept, delivered by the largest police force in the United Kingdom, funded by the people whose data was the product. We call this process a system.