Here's an uncomfortable thought for anyone investing in the data center boom: what if we don't actually need all those massive facilities we're building?
That's the contrarian warning from Aravind Srinivas, CEO of Perplexity AI, a company backed by Jeff Bezos and Nvidia Corp. (NVDA). In a recent podcast interview with Prakhar Gupta, Srinivas laid out a scenario where artificial intelligence running directly on your devices could make centralized data centers far less relevant than the industry currently assumes.
The Local Intelligence Threat
"The biggest threat to a data center is if the intelligence can be packed locally on a chip that's running on the device and then there's no need to inference all of it on like one centralized data center," Srinivas explained.
It's a fascinating reversal of the current trajectory. Right now, the AI industry is betting enormous sums on centralized infrastructure. We're talking hundreds of billions, possibly trillions of dollars in data center investments worldwide. But what if AI gets good enough to run efficiently on the device in your pocket or on your desk?
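To make the idea concrete, here is a minimal sketch of what on-device inference can already look like with a small quantized model, using the open-source llama-cpp-python bindings. The model file and prompt are purely illustrative placeholders, not any real release; the point is simply that nothing in this loop ever calls out to a remote data center.

```python
# Illustrative sketch of local (on-device) inference.
# Assumes a small quantized model file is already on the device;
# "models/small-model-q4.gguf" is a hypothetical path, not a real checkpoint.
from llama_cpp import Llama

llm = Llama(model_path="models/small-model-q4.gguf", n_ctx=2048)

# The prompt is processed entirely on local hardware; no network round trip.
result = llm("Summarize my last three calendar invites.", max_tokens=128)
print(result["choices"][0]["text"])
```

Whether models small enough to fit this pattern can also complete real tasks reliably is exactly the open question Srinivas points to.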
Srinivas, who previously worked at OpenAI, Google Brain, and DeepMind, didn't mince words about the stakes: "That really disrupts the whole data center industry like it doesn't make sense to spend all this money $500 billion, $5 trillion whatever on building all the centralized data centers across the world." He called it a "$10 trillion question, hundred trillion dollar question."
The vision he describes is compelling. Imagine AI that runs locally on your device, learns from your repeated tasks, and adapts over time to automate your activities. "It adapts to you and over time starts automating a lot of the things you do. That way you don't have to repeat it. That's your intelligence. You own it. It's your brain," Srinivas said.
Who Wins in an On-Device World?
If this shift happens, the likely winners would be chip companies and device manufacturers. Apple Inc. (AAPL) has "a massive advantage," according to Srinivas, thanks to its M1 chips and power-efficient devices. Qualcomm Inc. (QCOM) would also be well positioned, along with other device makers such as Samsung (SSNLF), Lenovo (LNVGF), and HP Inc. (HPQ).
But there's a catch: the technical barriers remain significant. Srinivas acknowledged that no AI model released so far can run efficiently on a local chip while completing tasks reliably. The technology isn't quite there yet.
When it does arrive, Srinivas expects early adoption on MacBooks or iPads before the technology makes its way to smartphones. That makes sense given the power and thermal constraints of smaller devices.
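Some rough arithmetic shows why. Just holding a model's weights in memory takes roughly (parameters × bits per weight) ÷ 8 bytes, before counting the context cache or activations, and laptops have far more unified memory to spare than phones. The numbers below are illustrative estimates, not benchmarks of any particular device or model.

```python
# Back-of-envelope weight footprint for a few model sizes and quantization levels.
# Ignores KV cache, activations, and runtime overhead, so real usage is higher.
def weight_footprint_gb(params_billion: float, bits_per_weight: int) -> float:
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

for params in (3, 7, 70):
    for bits in (16, 8, 4):
        print(f"{params}B params @ {bits}-bit ≈ {weight_footprint_gb(params, bits):.1f} GB")
```

A 7-billion-parameter model at 4-bit quantization works out to roughly 3.5 GB of weights alone, a comfortable fit for a laptop with tens of gigabytes of unified memory but a tight squeeze on a phone sharing around 8 GB with everything else it runs.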
Broader Economic Implications
Beyond the immediate question of where AI processing happens, Srinivas touched on the potential for AI in robotics and the physical world. He suggested AI could fundamentally transform the labor market by automating tasks currently performed by humans, echoing concerns raised by Geoffrey Hinton, often called the "Godfather of AI."
There's also a systemic risk angle worth considering. The U.S. economy is becoming increasingly dependent on AI infrastructure. If centralized data centers are the foundation of that infrastructure and an AI bubble bursts, those facilities could become a "single point of failure" with widespread economic consequences.
Srinivas's argument raises a fundamental question for the tech industry: will centralized data centers remain the backbone of the digital economy, or will intelligence distributed across billions of personal devices radically reshape the sector? It's not a question with an obvious answer, but it's certainly one worth asking before we commit trillions of dollars to infrastructure that might become less essential than we think.
The irony, of course, is that Srinivas runs a company backed by Nvidia, which profits handsomely from selling chips to those very data centers. But perhaps that's what makes his warning worth taking seriously.