Who Really Makes Money in AI? Palantir’s CFO has an Answer

Model developers face margin pressure as the value shifts to software companies that own data and customers

In an interview at Palantir’s AIPCon, CFO David Glazer declared that “LLMs are commodity cognition”. His point was that the raw output of an LLM is getting cheaper and better (“ELO scores [are] up, tokens are cheaper”), so the real question is how companies derive value from that output. “We are the product, we have the data, we have the customer relationship and we can vend in whatever intelligence sources we need in order to accomplish the task,” he said.

This framing highlights a crucial divide in AI. Foundation models are improving fast, but as their costs fall and supply expands, they look increasingly like interchangeable infrastructure. The companies that turn those models into workflow tools and integrate them with proprietary data are the ones showing durable margins and higher valuations.

Commoditization in Motion

There is evidence that LLMs are already sliding toward commodity status. Analysts at Summit Partners describe the foundation model market as “fiercely competitive, as companies push for better performance at lower costs, often with shrinking margins”. IoT Analytics reaches a similar conclusion: “the availability of more and cheaper models will ultimately lower the cost of including GenAI features in applications”.

That dynamic means LLMs look more like cloud servers or GPUs: vital, but fungible. Investors are already pricing this in. In February this year, leading model developers such as OpenAI and Anthropic were valued at lower revenue multiples than they were a year earlier.

Contrast that with companies that package AI into products. In May, Perplexity, which builds LLMs into a search and chat application, raised $500 million at a $14 billion valuation. Investors are rewarding its specialization and defensibility, not its access to models.

Where Profits Accrue

Embedding AI into existing software ecosystems tends to sustain high profitability. Palantir’s latest results show ~80% gross margins and 46% operating margins on more than $1 billion in quarterly revenue. Microsoft uses the same strategy, layering Copilot into Office and Azure, while Google treats its AI as a lever for ads and cloud. Analysts estimate Google can run AI workloads at a fraction of OpenAI’s compute cost, underscoring how a cost-advantaged incumbent can treat LLMs as inputs rather than products.

The valuation gap shows up in investor commentary too. Citron Research argued that if Palantir were valued at the same ~17× revenue multiple as OpenAI, its stock would drop by more than 70 percent. Whether fair or not, that comparison underscores that markets view workflow software companies differently from model vendors.

Outside experts agree. Summit Partners predicts that “foundation models will continue to improve and pricing will continue to be a race to the bottom. The most significant opportunities will emerge in how AI is applied”. Palantir’s Chief Revenue Officer Ryan Taylor went further, claiming that “LLMs simply don’t work in the real world without Palantir”.

The Nuance

Glazer is right about the direction of travel. Over time, LLMs will behave like commodity infrastructure. Falling token prices, open-source releases, and rapid performance gains ensure that no single provider can hold on to monopolistic margins. The payoff will flow to the companies that integrate those models with proprietary data, workflows, and user relationships.

But the picture is more complex today. Building a frontier model still requires billions of dollars, privileged access to Nvidia hardware, and elite research talent. That scarcity gives companies like OpenAI and Anthropic strategic leverage, at least in the short term. Their ability to move up the stack (into enterprise solutions, developer platforms, and integrations) will determine whether they escape commoditization.

Bottom line: Glazer’s “commodity cognition” line captures a real trend. LLMs are sliding toward infrastructure status, even as frontier labs still hold sway today. The durable profits in AI will come not from the models themselves but from how well companies turn them into products that customers cannot live without.


Mukundan Sivaraj
Mukundan covers the AI startup ecosystem for AIM Media House. Reach out to him at mukundan.sivaraj@aimmediahouse.com.
14 November 2025