Nathan Fennel

Ads Are Coming to AI

Anthropic's Super Bowl ad mocks ChatGPT for adding ads. But the real question is whether ads can actually fix AI's economics. A look at the numbers.

Anthropic just dropped a set of Super Bowl ads. The first one, titled "Betrayal," opens on a woman chatting with her AI assistant about dinner recipes. Mid-conversation, a banner ad slides in for a fast food chain. Her face falls. The tagline: "Ads are coming to AI. But not to Claude."

Anthropic Super Bowl Ad: Betrayal

The ads are a direct shot at OpenAI's announcement that ChatGPT will begin showing ads to free users. Sam Altman called the Anthropic spots "clearly dishonest." Two more ads in the series lean into the same theme:

Anthropic Super Bowl Ad: Deception

Anthropic Super Bowl Ad: Treachery

The comedy lands. But set the marketing aside for a moment. The real question: do the economics of ads in AI actually work?

The profitability problem

AI companies are not profitable. OpenAI projects a $14 billion loss in 2026 on roughly $30 billion in revenue. Cumulative projected cash outflow through 2029 is $143 billion. They have 20 million paying subscribers against 800 million weekly active users: fewer than 3% of users pay, roughly 95% of usage comes from the free tier, and every free query costs money.
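A back-of-envelope version of that split, as a quick Python sketch (inputs are the projections cited above, not official disclosures):

```python
# User split implied by the figures above (cited projections, not
# official OpenAI disclosures).
paying_subscribers = 20e6
weekly_active_users = 800e6

paying_user_share = paying_subscribers / weekly_active_users
print(f"Paying users: {paying_user_share:.1%} of weekly actives")  # 2.5%
print(f"Free users:   {1 - paying_user_share:.1%}")                # 97.5%
# Usage skews slightly less free (~95% of query volume), presumably
# because paying users query more heavily.
```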

The cost of a query

Estimates put average inference cost at roughly 1 cent per query. With 2.5 billion queries per day, that is about $9 billion per year in inference alone, before training, headcount, or any other overhead. Free users account for roughly 95% of that volume. Serving free users costs an estimated $8.7 billion per year just in inference.
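The same arithmetic as a script, so the assumptions are explicit (cost per query and query volume are third-party estimates, not disclosed figures):

```python
# Annual inference bill implied by the estimates above.
cost_per_query = 0.01          # ~1 cent of inference per query (estimate)
queries_per_day = 2.5e9        # estimated total query volume
free_share_of_volume = 0.95    # free tier's share of queries

annual_inference = cost_per_query * queries_per_day * 365
free_tier_inference = annual_inference * free_share_of_volume

print(f"All queries: ${annual_inference / 1e9:.1f}B per year")    # ~$9.1B
print(f"Free tier:   ${free_tier_inference / 1e9:.1f}B per year") # ~$8.7B
```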

That number deserves a pause. $8.7 billion per year to serve people who pay nothing. This is the core tension in consumer AI economics: the product only works if lots of people use it, but lots of people using it is ruinously expensive.

The ad math

OpenAI's ads launch at $60 CPM with a $200,000 minimum buy. For context, Meta's average CPM is under $20. OpenAI is projecting $1 billion in free-user monetization for 2026.

Run the numbers. $1 billion at $60 CPM means roughly 16.7 billion ad impressions. Spread across about 867 billion free-tier queries per year, that is only 1.9% of free queries showing an ad. As a user experience, that is actually quite restrained.
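Here is that math as a script, using OpenAI's reported ad pricing and the volume estimates from earlier:

```python
# Ad impressions and ad load implied by the 2026 ad projections.
ad_revenue = 1e9                        # projected free-user ad revenue
cpm = 60.0                              # dollars per 1,000 impressions
queries_per_day = 2.5e9
free_share_of_volume = 0.95

impressions = ad_revenue / (cpm / 1000)                      # ~16.7B
free_queries = queries_per_day * free_share_of_volume * 365  # ~867B
ad_load = impressions / free_queries

print(f"Impressions:  {impressions / 1e9:.1f}B")
print(f"Free queries: {free_queries / 1e9:.0f}B per year")
print(f"Ad load:      {ad_load:.1%} of free queries")        # ~1.9%
```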

But $1 billion covers only about 11.5% of the estimated $8.7 billion it costs to serve those free users. To break even on inference alone at that ad load, the effective CPM would need to be around $520. For reference, the most expensive digital ad inventory in the world (Super Bowl streaming spots) runs around $500 CPM. The alternative is volume: hold the $60 CPM and show an ad on roughly one in six free queries, and that still only covers inference, before training, headcount, or anything else behind the projected $14 billion loss. The math does not work.
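And the break-even version of the same calculation, under the same estimated inputs:

```python
# What covering free-tier inference with ads alone would require.
free_tier_inference = 8.7e9    # estimated annual free-user inference cost
ad_revenue = 1e9               # projected ad revenue
impressions = 16.7e9           # implied by $1B at $60 CPM
cpm = 60.0
free_queries = 867e9           # estimated free-tier queries per year

coverage = ad_revenue / free_tier_inference
breakeven_cpm = free_tier_inference / impressions * 1000
breakeven_ad_load = (free_tier_inference / (cpm / 1000)) / free_queries

print(f"Ads cover {coverage:.1%} of free-tier inference")          # ~11.5%
print(f"Break-even CPM at planned ad load: ${breakeven_cpm:.0f}")  # ~$521
print(f"Break-even ad load at $60 CPM: {breakeven_ad_load:.0%}")   # ~17%
```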

Even in an optimistic scenario where ad load roughly doubles and CPMs hold, ads might cover 20-25% of free-user inference costs. That helps, but it is not a business model. It is a subsidy reduction.
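Working that coverage target backwards makes the scale concrete (same estimated inputs as above):

```python
# Ad load needed for ads to cover 20-25% of free-tier inference at $60 CPM.
free_tier_inference = 8.7e9
cpm = 60.0
free_queries = 867e9

for cover in (0.20, 0.25):
    impressions_needed = cover * free_tier_inference / (cpm / 1000)
    print(f"Cover {cover:.0%}: ads on "
          f"{impressions_needed / free_queries:.1%} of free queries")
# Cover 20%: ads on 3.3% of free queries
# Cover 25%: ads on 4.2% of free queries
```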

The numbers at a glance

OpenAI 2026 Projections ($ billions)

  Projected revenue                ~30
  Projected loss                   ~14
  Free-user inference cost         ~8.7
  Free-user ad revenue             ~1

Sources: eMarketer, SemiAnalysis, Sacra. Ad revenue covers roughly 11.5% of free-user inference costs.

What this means

Ads are not a path to profitability for AI. They are a small revenue supplement. The real business model for consumer AI is still subscriptions and enterprise contracts. OpenAI knows this. The ad push is about reducing the subsidy on free users, not about building an advertising business.

Anthropic's ad is funny, and it scores a marketing point. But Anthropic has the same fundamental economics problem: serving free users at a loss while trying to convert them to paid. Neither company has solved this yet. The difference is that OpenAI is trying ads as one piece of the puzzle, while Anthropic is betting that not having ads is worth more in brand trust. Both are reasonable strategies. Neither addresses the underlying cost structure.

The real question

The interesting question is not whether AI should have ads. It is whether any consumer AI business model works at this scale of compute cost. The ads are a band-aid. The real challenge is making inference cheap enough that it does not matter.