Anthropic refused the military its AI, after the Pentagon had already flown with it
The contract had red lines. The AI model already had wings. Which one did Venezuela see?
The contract that didn't matter
A company that supplies weapons systems to the military refuses to supply weapons systems to the military. Lucky we have a president who personally sorts out that kind of nuance on Truth Social.
Anthropic sold its model to the Pentagon in July 2025. Not in secret, because that would be inappropriate. With a press release, two hundred million dollars, classified networks, the highest levels of national security. The company was proud. The military was satisfied. There was probably cake.
Venezuela didn't call, Palantir did
The model turned out to have been used in military operations while the contractual restrictions sat neatly on paper, where they belong. Anthropic heard about this from Palantir, which is a pleasant way to find out where your product has been flying. Then came the refusal. That refusal is now called principled stance. A phrase that sounds considerably better than: we didn't know and now we can't go back.
A company does something, has regrets, can't go back, draws a line just before the next step. The line is called ethics. The timing is called coincidence. Nobody did anything wrong, because the system delivered exactly what it was designed to deliver.
Everyone is right, which is convenient. The military is right that it wants to use a system it paid for. Anthropic is right that autonomous lethal weapons are dangerous, a conviction that apparently hardens after the second conversation with Palantir. The court is right that the government didn't follow its own procedures. An outcome with no one responsible. That's what the system looks like when it's grown up.
Contracts are for people who need to say later that it wasn't them
Contractual restrictions are not technical restrictions. They determine who gets to say later that it wasn't them. When the system is deployed in ways the words prohibit, the relevant question isn't whether that's bad. The relevant question is who picks up the tab. That's not in the contracts, because who puts something like that in a contract.
The question of who sets the values in military AI gets answered by the party with the most money, the fewest scruples, and the best lawyers. That party changes its name. The nature stays the same.
Your AI has already flown over Venezuela. What comes after that is called principled stance. Or contract law. Depending on who writes the press release.