If you thought building artificial intelligence was expensive, wait until you hear about the numbers being thrown around for artificial general intelligence. International Business Machines Corp. (IBM) CEO Arvind Krishna recently sat down with the "Decoder" podcast and did some back-of-the-envelope math that might make even the most ambitious tech investor wince.
The headline figure? A 100-gigawatt AGI effort could run about $8 trillion. Krishna's calculation starts with what he calls "today's number": filling a 1-gigawatt data center costs roughly $80 billion. Scale that up a hundredfold, and the numbers get astronomical fast.
Here's where it gets interesting. Krishna told host Nilay Patel that a company committing to 20–30 gigawatts of capacity could be looking at approximately $1.5 trillion in spending based on current cost structures. That estimate, according to Krishna, reflects what companies are actually announcing in their AI infrastructure plans, not some theoretical exercise.
And there's a catch: AI hardware has a brutally short shelf life. Krishna pointed out that these chips have a useful life of roughly five years before they're essentially obsolete and need replacing. So you're not just spending $80 billion once; you're on a perpetual upgrade treadmill.
The Math Behind the Madness
Krishna laid out the financial reality in stark terms. An $8 trillion investment would require about $800 billion in annual profit just to cover interest payments. That's not paying down principal, not generating returns for investors—just servicing the debt.
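To make the arithmetic concrete, here's a minimal sketch of the back-of-the-envelope math as reported above. It assumes Krishna's roughly $80 billion per gigawatt, the five-year hardware shelf life he described, and a 10% cost of capital; that last rate is an assumption chosen to reproduce his $800 billion figure, not something he specified on the podcast.

```python
# Back-of-the-envelope reproduction of the figures reported above.
# The 10% rate is an assumption used to recover the $800B interest figure.

COST_PER_GW_USD = 80e9        # "today's number": ~$80B to fill a 1 GW data center
HARDWARE_LIFE_YEARS = 5       # chips are roughly obsolete within ~5 years
ASSUMED_INTEREST_RATE = 0.10  # implied by $800B/yr on $8T; an assumption


def buildout_cost(gigawatts: float) -> float:
    """Up-front cost of filling `gigawatts` of data center capacity."""
    return gigawatts * COST_PER_GW_USD


def annual_refresh_cost(gigawatts: float) -> float:
    """Straight-line hardware replacement spend implied by a five-year shelf life."""
    return buildout_cost(gigawatts) / HARDWARE_LIFE_YEARS


def annual_interest(gigawatts: float) -> float:
    """Yearly interest on the build-out if it were fully debt-financed."""
    return buildout_cost(gigawatts) * ASSUMED_INTEREST_RATE


for gw in (1, 20, 30, 100):
    print(f"{gw:>3} GW: build-out ${buildout_cost(gw) / 1e12:.2f}T, "
          f"refresh ${annual_refresh_cost(gw) / 1e9:.0f}B/yr, "
          f"interest ${annual_interest(gw) / 1e9:.0f}B/yr")
```

At 100 gigawatts the sketch lands on the $8 trillion build-out and $800 billion in annual interest quoted above, and the 20-gigawatt line comes out near the roughly $1.5 trillion Krishna attributed to current announcements.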
Patel brought up comments from OpenAI CEO Sam Altman, who has expressed confidence that OpenAI can generate returns on its capital spending. OpenAI has certainly put its money where its mouth is, announcing roughly $1.4 trillion in long-term buildout agreements with partners including Nvidia (NVDA), Broadcom (AVGO), Oracle (ORCL), and Alphabet (GOOGL), according to CNBC.
But Krishna suggested that expectations around AGI spending remain more in the realm of belief than confirmed financial models. He characterized the current wave of infrastructure commitments as illustrating how rapidly companies are scaling AI capacity, though he stopped short of saying this guarantees a path to actual AGI. The pace reflects industry activity and ambition—not necessarily a roadmap with a guaranteed destination.
The Skeptics Speak Up
Krishna himself isn't exactly bullish on current technology reaching AGI. He gives today's large language models only a "zero to 1%" chance of achieving artificial general intelligence. His reasoning? Current LLMs simply don't operate at the level associated with AGI, and significant additional breakthroughs will be necessary. That said, he emphasized that current tools already deliver substantial value in enterprise settings—they don't need to be AGI to be useful.
Other industry leaders have voiced similar doubts. Palantir Technologies Inc. (PLTR) Chief Technology Officer Shyam Sankar told the "Interesting Times with Ross Douthat" podcast in late October that extreme AGI narratives often function as "a fundraising shtick."
Even some of the architects of today's AI are pumping the brakes. OpenAI co-founder Ilya Sutskever said in an interview last month on the "Dwarkesh Podcast" that scaling current models by "another 100x" wouldn't be transformative. He described the next phase of progress as a return to the "age of research"—suggesting brute-force scaling isn't the answer.
Researcher Gary Marcus put it bluntly at the Axios AI+ Summit earlier this month: today's large language models are "a dress rehearsal" for AGI and fall short of the capabilities sometimes implied by their boosters.
What Comes Next?
Krishna told the "Decoder" podcast that current AI tools can absolutely deliver major productivity gains for businesses. But reaching AGI? That's going to require combining large language models with forms of hard knowledge—structured, reliable information rather than statistical pattern matching.
Even with that hybrid approach, Krishna said he remains "a maybe" on whether it would be sufficient to achieve AGI. Which is a remarkably candid admission from someone running one of the world's major technology companies actively investing in AI development.
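Krishna didn't spell out what that combination would look like in practice. Purely as an illustration of the idea, and not IBM's or Krishna's design, one way to think about "hard knowledge" is routing questions through a store of structured, verified facts before falling back on the model's statistical guess. Every name below is hypothetical.

```python
# Illustrative only: a toy "hard knowledge first" routing pattern.
# VERIFIED_FACTS and statistical_model_guess are hypothetical stand-ins.

VERIFIED_FACTS = {
    "cost_per_gigawatt_usd": 80e9,   # structured, auditable figure
    "hardware_life_years": 5,
}


def statistical_model_guess(question: str) -> str:
    """Stand-in for an LLM call: fluent but unverified output."""
    return f"(model guess for: {question!r})"


def answer(question: str, fact_key: str | None = None) -> str:
    """Prefer structured knowledge when it exists; otherwise fall back to the model."""
    if fact_key is not None and fact_key in VERIFIED_FACTS:
        return f"{VERIFIED_FACTS[fact_key]} (from verified store)"
    return statistical_model_guess(question)


print(answer("How much does 1 GW of data center capacity cost?", "cost_per_gigawatt_usd"))
print(answer("Will scaling alone reach AGI?"))
```

The point is only that answers drawn from the structured store have auditable provenance, in contrast to a statistical guess.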
The bottom line is that AI companies are pouring extraordinary sums into massive data centers, and the price tags keep climbing. Whether those investments lead to AGI or just very sophisticated narrow AI systems remains an open question—one that $8 trillion may or may not answer.