When Demand Runs Wild
Here's the problem keeping Nvidia Corp. (NVDA) CEO Jensen Huang up at night: artificial intelligence is hungry, and it's getting hungrier faster than anyone can feed it.
Speaking with CSIS President John J. Hamre earlier this month, Huang laid out an uncomfortable truth. Yes, Nvidia keeps pushing computing performance forward by roughly an order of magnitude each year. That's impressive. But AI demand? It's accelerating even faster than that.
The issue isn't just that more companies want AI. It's that AI workloads themselves are becoming more compute-intensive even as adoption spreads across industries. Multiple exponential growth curves are stacking on top of each other, and however fast Nvidia innovates, there's a hard limit nobody can engineer around: energy.
The 1% Argument
Futurum Group CEO Daniel Newman thinks anyone calling this an AI bubble is fundamentally misunderstanding what's happening. In a post on X, Newman argued that we're "about 1% into the AI revolution," and focusing on tools like OpenAI's ChatGPT as some kind of endpoint is "insanely short sighted."
Newman's point is that the real work hasn't even started yet. We're talking about scaling the physical backbone of AI: computing capacity, network connectivity, manufacturing facilities, and most critically, energy production. Those aren't software problems you can patch with clever code. They require building actual things in the actual world, which takes time, money, and resources.
The Coming Power Crunch
The numbers back this up. According to a November analysis from The Kobeissi Letter, global electricity demand is expected to jump about 30% by 2035. Data centers, which currently consume roughly 1.5% of total power, are forecast to more than double their share to about 3.5%.
That's a massive shift in the energy landscape, and it's why power providers capable of delivering dependable, large-scale electricity are positioning themselves as major beneficiaries of the AI boom.
But it's also why tech companies are getting creative, sometimes wildly so. Reports suggest OpenAI and Samsung Electronics Co. (SSNLF) are exploring floating data centers that would use seawater for cooling, addressing power and heat management challenges at once.
Then there's the really ambitious stuff. Amazon.com, Inc. (AMZN) founder Jeff Bezos and Tesla Inc. (TSLA) CEO Elon Musk have floated the idea of deploying data centers in space, where they could tap virtually unlimited solar energy. Yes, space data centers. That's where we're at.
Why This Matters
Huang's warning and Newman's response highlight the same fundamental tension: AI isn't slowing down, but the infrastructure needed to support it is struggling to keep pace. This isn't about whether AI is real or useful anymore. It's about whether we can physically build enough capacity to meet demand.
The companies that figure out how to solve the energy equation, whether through conventional power generation, innovative cooling solutions, or yes, even floating or orbital data centers, are going to have a serious advantage. Because if we really are only 1% into this revolution, the infrastructure gap is about to get a lot more obvious.