Palantir reported record revenue of $1.4bn last quarter, with US commercial sales more than doubling in twelve months. The stock has fallen roughly 25% since January. Alex Karp's Q4 2025 shareholder letter explains why he believes most of the AI industry is building the wrong thing.
# THE LETTER
**Palantir Technologies | Alex Karp | Q4 2025**

⏱️ THE 60-SECOND VERSION
📈 Palantir generated $1.4bn in Q4 revenue (up 70% year-on-year) and $609m in profit. US commercial revenue hit $507m, a 137% increase.
🤖 Karp argues that large language models without operational grounding are worthless. Most of the AI industry, he says, is building science projects. Palantir's commercial growth proves enterprises agree.
📜 The letter ventures well beyond financials into surveillance, the Fourth Amendment, cultural identity, and the divide between those who build and those who merely criticise.
🕊️ He closes with an argument for "magnanimity towards adversaries," quoting Lorne Michaels on finding "a drop of humanity" in opponents. Not your typical shareholder letter.
✉️ FROM THE LETTER
The sharpest section is Karp's thesis on why AI adoption is splitting into haves and have-nots. While much of the letter reads like political philosophy (he quotes Christopher Lasch, Alasdair MacIntyre, and Justice Potter Stewart across six numbered sections), this passage is where his argument meets the real world:
"A gulf is emerging between those who perceive the preconditions for harnessing the power of artificial intelligence and others who are far more vulnerable and adrift. Indeed, the world of A.I. adoption is increasingly divided between haves and have-nots.
Some institutions as well as nations are thriving; others have been left naked on the beach.
The large language models alone will not lead us to salvation. They require a means of reliably and efficiently interacting with the byzantine complexity of the modern enterprise—its tangle of datasets and operations and personnel.
The strings of text produced by the language models are little without a software architecture that can lend a grammar and structure to the output of these probabilistic prediction engines. The models must be tethered to objects in the real world, and it is that tether, that means of grounding and orientation, that we have built.
As a result, our U.S. commercial business, once a promising yet mostly theoretical backwater of our company, is now growing at an astonishing rate, generating $507 million last quarter—a 137% increase over the same period the year before.
To be clear, our U.S. commercial business has more than doubled in the space of twelve months.
Such a brisk rate of expansion, all while maintaining record levels of profitability, would be remarkable for a newly hatched startup. It is essentially unprecedented, we believe, for a company entering its third decade of operations.
We have arrived at this moment as a result of a fierce pragmatism—a nearly ruthless focus on results and outcomes at the expense of the performative and the theatrical."
🧠 WHAT WE THINK
The core argument is that LLMs without a software layer to ground them in operations are, as Karp puts it, "little." The market agrees in principle but cannot decide what that insight is worth. A 100x forward earnings multiple prices in a future where Palantir becomes the default operating system for enterprises that want AI to produce results, not demos.
The 137% US commercial growth suggests that future is arriving faster than most expected. Yet the stock sells off because investors cannot separate Palantir from the broader AI trade. That is exactly the category error Karp is warning about. He is not building an AI company. He is building an operations company that happens to use AI.
🔮 WHY IT MATTERS NOW
Pentagon AI spending hit a record $13.4bn in the 2026 budget request. DOGE-driven reviews are reshaping how defence technology contracts get allocated. Palantir's argument, backed by a $10bn Army contract and expanding NATO partnerships, is that budget austerity makes software-defined operations more necessary, not less. When you need to do more with fewer people, you need better software.
If Karp is right, the pullback in defence stocks is a buying opportunity. If he is wrong, the 100x multiple unravels fast.
💬 ONE GOOD QUOTE
"Anything lacking a zealous focus on the value being created by these technical systems, the mice that the cat actually catches, will ultimately fade to gray and be forgotten."
Karp borrows Deng Xiaoping's famous test to draw a line between AI that produces results and AI that produces headlines.
Read the full letter
We’ve recently launched, so we want as much feedback as we can get. Share a comment, hit like, drop us an email, or forward to a friend. Whatever takes your fancy.