If you thought AI was already big, Advanced Micro Devices Inc. (AMD) CEO Lisa Su has news for you: we haven't seen anything yet. Speaking at CES 2026 on Monday night, Su laid out her vision for what she's calling the "yottascale" era of computing—a future where AI models grow so powerful and pervasive that they'll demand a fundamental reimagining of how we think about computing infrastructure.
Welcome To The Yottascale Era
"This moment in tech not only feels different, AI is different," Su told the audience. "AI is the most powerful technology that has ever been created and it can be everywhere for everyone."
That "everywhere" part is key. Su's vision isn't just about building bigger data centers (though there's plenty of that too). She's talking about a much broader computing footprint that spans "solutions from the largest systems in the cloud, to AI PCs, to embedded computing." In other words, AI compute is going to spread out across the entire technology landscape, not just concentrate in warehouse-sized server farms.
The scale of this transformation, according to Su, demands an industry-wide effort. "It takes an open ecosystem built on industry standards," she said, positioning this as a long-term structural shift rather than just another product-cycle hype wave. "The world's most important challenges can only be solved by bringing the industry ecosystem together."
Can The AI Spending Spree Continue?
Su's bullish comments arrive at an interesting moment. There's growing chatter in the market about whether the multi-billion-dollar AI infrastructure buildout can maintain its torrid pace. The big tech companies keep insisting they're not slowing down on AI capital expenditures, but analysts are pointing to real constraints—particularly power availability and the sheer financial scale required to keep deploying new infrastructure at this rate.
Interestingly, Perplexity AI CEO Aravind Srinivas recently touched on a related theme that actually supports Su's distributed computing vision. Srinivas suggested that as AI models become more efficient and capable of running locally on devices, the traditional data-center-centric model could face serious disruption. "The biggest threat to a data center is if the intelligence can be packed locally on a chip that's running on the device," he said.
That's not contradicting Su's vision—it's actually reinforcing it. Both are pointing toward a future where AI compute happens everywhere, not just in centralized cloud facilities. The question is whether companies can build that infrastructure fast enough to keep pace with demand, or whether physical and financial realities will force a slowdown.
AMD shares closed down 1.07% on Monday at $221.08, then ticked up 0.4% in after-hours trading. The stock continues to show strong price momentum over medium- and long-term trends, according to market data.