When everyone's talking about AI bubbles, Nvidia Corp. (NVDA) CEO Jensen Huang has a different story to tell. At the U.S.-Saudi Investment Forum in Washington this week, he sat down alongside Tesla Inc. (TSLA) and SpaceX CEO Elon Musk and made his case: What looks like irrational exuberance is actually a rational response to computing's biggest transformation in decades.
The Moore's Law Problem
Huang's argument starts with something computer scientists have known for a while but most people haven't internalized: Moore's Law is dead. For decades, chipmakers could count on transistor counts roughly doubling every couple of years, and CPU performance climbed almost automatically with them. That era is over.
But here's the thing—demand for computing power hasn't stopped growing. If anything, it's accelerating. So if traditional processors can't keep up, what fills the gap? GPUs. Graphics processing units, originally designed to render video games, turn out to be exceptionally good at the kind of parallel processing modern computing demands.
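To make "parallel processing" concrete, here is a minimal sketch of the idea: the same arithmetic applied to millions of values at once. Everything below is invented for illustration and is not something Huang presented. The NumPy version runs on the CPU; the commented lines show how a GPU array library such as CuPy would dispatch the identical expression across thousands of cores.

```python
# Illustrative sketch of data-parallel work: score a large batch of values at once.
# All data here is synthetic; this is not code from Nvidia or the forum.
import numpy as np

rng = np.random.default_rng(0)
prices = rng.random(10_000_000)                      # ten million synthetic prices
quantities = rng.integers(1, 10, size=10_000_000)    # matching synthetic quantities

# A plain Python loop touches one element at a time -- the serial style a single
# CPU core is stuck with.
def revenue_serial(p, q):
    total = 0.0
    for i in range(len(p)):
        total += p[i] * q[i]
    return total

# The vectorized form expresses the same math as one bulk operation.
def revenue_vectorized(p, q):
    return float(np.dot(p, q))

print(revenue_vectorized(prices, quantities))

# On a GPU the same expression would look like this with CuPy (assumed setup):
#   import cupy as cp
#   cp.dot(cp.asarray(prices), cp.asarray(quantities))
# Every multiply-add is independent, so thousands of GPU cores can work in parallel.
```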
Huang pointed to a stat that crystallizes the shift: Six years ago, CPUs powered 90% of the world's top 500 supercomputers. Today? Less than 15%. The rest now run on GPUs. That's not a gradual evolution; it's a revolution that already happened while most people weren't paying attention.
It's Not About ChatGPT
Here's where Huang's pitch gets interesting. He argues that the bulk of GPU adoption has nothing to do with the generative AI tools everyone's obsessed with. Before ChatGPT became a household name, the world's largest computing workloads had already migrated to GPUs.
Banks processing transactions. Credit networks verifying purchases. E-commerce platforms managing inventory. Advertisers targeting campaigns. All of this runs on data processing infrastructure that consumes hundreds of billions of dollars annually. "Every name, address, age and transaction lives in a data frame—and computing those data frames fuels the economy," Huang explained.
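For readers who want to see what "computing data frames" means in practice, here is a hedged sketch of the kind of tabular workload Huang is describing: grouping transactions and aggregating them. The column names and figures are invented for illustration. The pandas code below runs on a CPU; GPU data-frame libraries such as RAPIDS cuDF expose a near-identical interface, which is what makes migrating this sort of workload to GPUs practical.

```python
# Illustrative data-frame workload: aggregate synthetic transactions by customer.
# Column names and amounts are made up for this sketch; they are not from the article.
import pandas as pd

transactions = pd.DataFrame({
    "customer_id": [101, 102, 101, 103, 102, 101],
    "merchant":    ["grocer", "airline", "grocer", "cafe", "grocer", "cafe"],
    "amount":      [42.50, 310.00, 18.75, 6.40, 55.20, 4.95],
})

# Group, aggregate, and rank -- the bread-and-butter operations behind fraud checks,
# inventory management, and ad targeting that Huang lumps under "data processing."
summary = (
    transactions
    .groupby("customer_id")["amount"]
    .agg(total="sum", purchases="count")
    .sort_values("total", ascending=False)
)
print(summary)

# A GPU version with cuDF would typically change little more than the import:
#   import cudf
#   transactions = cudf.DataFrame({...})
# and a similar groupby/agg chain runs on the GPU.
```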
Then there are recommender systems—the algorithms that decide what you see in your social media feed, what products Amazon suggests, what videos YouTube queues up next. Huang calls them "the engine of the internet today," and they've been running on GPUs for years.
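A rough sketch of why recommender systems map so naturally onto GPUs: at serving time they largely reduce to comparing a user vector against a huge matrix of item vectors and keeping the best matches, which is dense linear algebra. The embedding sizes and scoring rule below are assumptions chosen for illustration, not a description of any production system.

```python
# Toy embedding-based recommender: score all items for one user, return the top k.
# Dimensions and data are synthetic; real systems score millions of items per request.
import numpy as np

rng = np.random.default_rng(42)
n_items, dim = 100_000, 64

item_embeddings = rng.standard_normal((n_items, dim)).astype(np.float32)
user_embedding = rng.standard_normal(dim).astype(np.float32)

# One matrix-vector product scores every item at once -- exactly the kind of
# massively parallel arithmetic GPUs are built for.
scores = item_embeddings @ user_embedding

# Keep the 10 highest-scoring items (argpartition avoids a full sort).
k = 10
top_k = np.argpartition(scores, -k)[-k:]
top_k = top_k[np.argsort(scores[top_k])[::-1]]
print(top_k, scores[top_k])
```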
Agentic AI Is Just the Cherry on Top
Only after you account for this massive foundational shift does the new wave of AI from OpenAI, xAI, Anthropic, and Alphabet Inc. (GOOGL)'s Google (GOOG) enter the picture. And by Huang's math, once you subtract all the computing power already justified by existing workloads, what's left over for cutting-edge agentic AI is "substantially less than you thought and all of it justified."
It's a clever framing. Huang's essentially saying: Don't look at $400 billion in tech capex and assume it's all betting on robots that can write poetry. Most of it is infrastructure spending that would have happened anyway, just accelerated by the CPU-to-GPU transition.
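Huang's framing boils down to a subtraction, sketched below. The only figure taken from the article is the roughly $400 billion capex total; every line of the split is a hypothetical placeholder for illustration, not a number Huang or anyone else cited.

```python
# Back-of-the-envelope version of Huang's framing. All splits are hypothetical
# placeholders; only the ~$400B total capex ballpark comes from the article.
total_capex_bn = 400              # reported ballpark for U.S. tech giants this year

existing_workloads_bn = {          # hypothetical allocation, not Huang's numbers
    "cpu_to_gpu_refresh": 150,     # replacing aging CPU fleets with accelerated systems
    "data_processing": 100,        # transactions, data frames, analytics
    "recommenders": 80,            # feeds, ads, search ranking
}

already_justified_bn = sum(existing_workloads_bn.values())
incremental_agentic_ai_bn = total_capex_bn - already_justified_bn

print(f"Spending attributable to new agentic AI: ~${incremental_agentic_ai_bn}B "
      f"of ${total_capex_bn}B (hypothetical split)")
```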
The Skeptics Aren't Buying It
Not everyone shares Huang's optimism. Earlier this month, billionaire investor Peter Thiel made headlines by dumping his entire Nvidia stake—the company most synonymous with the AI boom. He also slashed his Tesla position, signaling broader concerns about where machine intelligence is headed and how it's being valued.
Michael Burry, the investor who famously predicted the 2008 housing crash, has been even more direct. He argues that the current "Cloud then AI buildout" has triggered capital spending levels comparable to the Dot-Com bubble and pre-crisis housing market—periods that didn't end well.
The numbers support their concern. U.S. tech giants are on pace to spend nearly $400 billion on capital expenditures this year. McKinsey projects total AI investment will hit $5.2 trillion by 2030. Those are staggering figures, and history suggests that when everyone piles into the same trade, someone usually gets hurt.
Who's Right?
The honest answer? It's too early to know. Huang's technical argument about the CPU-to-GPU shift is grounded in real infrastructure changes that have been underway for years. The supercomputer data doesn't lie, and neither does the fact that major workloads have already migrated.
But that doesn't mean every dollar being spent today is justified, or that valuations perfectly reflect underlying economics. Thiel and Burry aren't worried about whether GPUs are useful—they're worried about whether the current spending pace is sustainable and whether expectations have gotten ahead of reality.
What's clear is that Nvidia is sitting at the center of this debate. The company ranks in the 98th percentile for Growth and the 92nd percentile for Quality compared to industry peers, which explains why it became one of the most valuable companies in the world. Whether that position holds depends on whether Huang's revolution thesis or the skeptics' bubble warning proves correct.
For now, the spending continues, the infrastructure gets built, and everyone watches to see if this is the foundation of a new computing era—or just an expensive bet that won't pay off.