Amazon Is Spending $200 Billion on AI. Andy Jassy Just Explained Why That Is Not Crazy

"We're not investing approximately $200 billion in capex in 2026 on a hunch. We already have customer commitments for a substantial portion of it."
Amazon CEO Andy Jassy used his annual shareholder letter, published on April 9, to make the most detailed public case yet that the company's massive AI infrastructure spending is not speculative but underwritten by real, fast-growing customer demand.
The letter disclosed that AWS AI services have crossed a $15 billion annualized revenue run rate in Q1 2026. Three years after AWS launched commercially, its total revenue run rate was $58 million.
Three years into the current AI wave, AWS AI revenue alone is over $15 billion, a figure Jassy described as "nearly 260 times larger" than AWS at the comparable point in its own history. AWS overall is now running at a $142 billion annualized revenue run rate, with 24% year on year growth.
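As a quick arithmetic check, the "nearly 260 times" multiple follows directly from the two run rates disclosed in the letter (a sketch using those figures at face value; the variable names are illustrative):

```python
# Figures disclosed in Jassy's letter
aws_ai_run_rate = 15e9     # AWS AI services, annualized, Q1 2026
aws_early_run_rate = 58e6  # AWS total run rate three years after its 2006 launch

multiple = aws_ai_run_rate / aws_early_run_rate
print(f"{multiple:.0f}x")  # roughly 259x, which Jassy rounds to "nearly 260"
```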
The $200 Billion Defence
The headline number in the letter is Amazon's planned capital expenditure for 2026: approximately $200 billion, with the majority directed at AI infrastructure.
Jassy acknowledged directly that this spending has already compressed free cash flow, which fell from $38 billion to $11 billion in 2025 as capital spending rose by $50.7 billion.
His argument against the bubble concern rests on two pillars. The first is demand visibility. "We're not investing approximately $200 billion in capex in 2026 on a hunch," Jassy wrote. "We already have customer commitments for a substantial portion of it."
He cited OpenAI's commitment of over $100 billion to AWS as a single example of the demand backing the investment. He also disclosed that two large customers had asked to purchase all of Amazon's available Graviton chip capacity for 2026, a request Amazon declined in order to serve other customers, but which Jassy used to illustrate the scale of unmet demand.
The second pillar is capacity constraint. AWS added 3.9 gigawatts of new power capacity in 2025 and expects to double total power capacity by the end of 2027.
Jassy said the company is monetising that capacity as fast as it is being installed, and that capacity constraints are currently limiting growth, meaning demand is outrunning supply, not the other way around.
"We're not going to be conservative in how we play this," he wrote. "We're investing to be a meaningful leader, and our future business, operating income, and free cash flow will be much larger because of it."
Why the Chip Business Changes the Economics
Amazon's chip business, encompassing Graviton (its custom CPU), Trainium (its AI training chip), and Nitro (its networking chip), is generating over $20 billion in annualized revenue and growing at triple-digit rates year on year.
That figure represents a doubling from the $10 billion annualized run rate Amazon disclosed alongside its fourth-quarter results, according to Reuters.
Jassy made a pointed comparison to the broader chip industry. Most AI workloads have historically run on Nvidia chips.
Amazon's Graviton, launched in 2018, is now used by 98% of the top 1,000 EC2 customers after offering up to 40% better price-performance than competing x86 processors. Jassy said the same shift is underway in AI.
Trainium2 offered approximately 30% better price-performance than comparable GPUs and has largely sold out.
Trainium3, which began shipping at the start of 2026, is 30 to 40% more price-performant than Trainium2 and is nearly fully subscribed. Trainium4, still roughly 18 months from broad availability, already has significant reservations.
Jassy said Amazon expects Trainium to save the company tens of billions of dollars in capital expenditure per year and provide several hundred basis points of operating margin advantage versus relying on third-party chips for inference.
If the chip business were sold as a standalone company, Jassy estimated its annual revenue run rate would be approximately $50 billion.
He also signalled a potential new revenue stream saying, "There's so much demand for our chips that it's quite possible we'll sell racks of them to third parties in the future."
Google has already pursued a similar strategy, striking a deal to supply Anthropic with one million of its custom AI chips worth tens of billions of dollars.
What This Means for Enterprise AI
The $15 billion AWS AI revenue figure is the most concrete data point yet on where enterprise AI spending actually is. AWS launched commercially in 2006. It took years to convert sceptical enterprises and governments.
The AI wave is moving on a fundamentally different timeline: $15 billion in three years, versus decades for comparable cloud adoption milestones, according to Jassy.
For context, Microsoft disclosed in January that its AI business had crossed an annual revenue run rate of $13 billion in late 2024, according to Reuters.
That puts AWS AI revenue ahead of Microsoft's last disclosed figure, though the two numbers are not directly comparable: they measure different periods and use different calculation methods.
Jassy drew the comparison to electricity. When Thomas Edison opened his first commercial power station in 1882, he said, most people saw it as a better way to light a room.
What they could not see was that electricity would eventually reorganise every factory, home, and industry on Earth. "AI may have a comparable impact," Jassy wrote. "The difference is that electricity took 40 years to get where it was going. AI appears to be moving ten times faster."