A Better Way of Thinking About the AI Bubble

The Nuances of a Potential AI Bubble
The concept of tech bubbles is frequently discussed in dire terms; however, the reality doesn’t have to be catastrophic. Economically speaking, a bubble represents an overestimation of potential returns, resulting in a surplus of supply relative to demand.
The key takeaway is that outcomes aren’t always binary; even well-considered investments can falter if not executed with sufficient prudence.
The Challenge of Assessing the AI Landscape
Determining whether an AI bubble exists is particularly complex due to the disparity in development speeds. AI software is evolving at an incredibly rapid rate, while the construction and provisioning of power for data centers proceed at a considerably slower pace.
Given the multi-year timelines associated with data center construction, significant changes are likely to occur between project initiation and operational readiness. The AI supply chain is characterized by its intricacy and volatility, making accurate forecasting of future supply needs exceptionally difficult.
Predicting AI demand in, say, 2028 isn’t enough; understanding how AI will be utilized and anticipating potential advancements in areas like energy efficiency, semiconductor technology, and power delivery are equally crucial.
The sheer scale of these investments introduces numerous potential failure points, and AI-related investments are reaching substantial levels.
Massive Infrastructure Investments
Recent reports from Reuters indicate that a data center campus linked to Oracle in New Mexico has secured up to $18 billion in credit from a group of 20 banks.
Oracle has already committed to $300 billion in cloud services for OpenAI, and the two companies, in collaboration with SoftBank, are planning a total of $500 billion in AI infrastructure development under the “Stargate” initiative.
Meta is also making significant investments, with a planned expenditure of $600 billion on infrastructure over the next three years. Tracking these commitments has proven challenging due to their sheer volume.
Uncertainty Surrounding Demand
Despite these large investments, the growth rate of demand for AI services remains uncertain.
A recent McKinsey survey examined the adoption of AI tools by leading companies. The findings were mixed; while nearly all businesses surveyed are experimenting with AI in some capacity, widespread implementation remains limited.
AI has enabled cost reductions in specific applications, but has yet to significantly impact overall business performance. Many companies are currently adopting a “wait and see” approach. Relying on these companies to immediately fill data center capacity may prove optimistic.
Infrastructure Bottlenecks
Even with robust AI demand, infrastructure limitations could present obstacles. Satya Nadella recently expressed greater concern about the availability of data center space than about chip shortages.
He noted that the primary constraint isn’t chip supply, but rather the lack of suitable facilities to house them. Furthermore, existing data centers are often unable to meet the power requirements of the latest chip generations.
While Nvidia and OpenAI are progressing rapidly, the electrical grid and construction industries operate at their traditional pace. This discrepancy creates opportunities for costly bottlenecks, even under ideal circumstances.
A more detailed discussion of these issues can be found in this week’s Equity podcast, available for listening below.