Altman and Nadella need more power for AI, but they’re not sure how much

The Growing Power Demand of Artificial Intelligence
The precise amount of power required to sustain the advancement of Artificial Intelligence (AI) remains unknown, even to industry leaders such as Sam Altman, CEO of OpenAI, and Satya Nadella, CEO of Microsoft.
This uncertainty presents a significant challenge for software-centric companies like OpenAI and Microsoft. Much of the technology sector has identified compute capacity as the primary obstacle to AI deployment, yet efforts to secure sufficient power have not kept pace with GPU purchases, and Microsoft has reportedly acquired more chips than its contracted power supply can support.
Power Constraints Outweigh Compute Availability
“Predicting the cycles of supply and demand in this specific scenario is exceptionally difficult,” Nadella stated during an appearance on the BG2 podcast. “Currently, the most pressing issue isn’t an oversupply of compute, but rather the availability of power and the speed at which we can complete [data center] construction near adequate power sources.”
Nadella further explained, “Without the ability to do so, a considerable number of chips may remain unused due to a lack of suitable infrastructure. Indeed, this is the predicament I face today – not a shortage of chips, but a deficiency of ‘warm shells’ ready for deployment,” referencing the real estate term for move-in ready buildings.
A Shift in Focus for Tech Companies
The situation highlights the challenges faced by companies more accustomed to managing silicon and software, technologies that scale and deploy rapidly, as they now navigate the complexities of expanding energy infrastructure.
For over a decade, electricity demand in the United States remained relatively flat. Over the past five years, however, demand from data centers has surged, outpacing the capacity additions utilities had planned. This has pushed data center developers toward ‘behind-the-meter’ arrangements, in which electricity is delivered directly to their facilities, bypassing the traditional power grid.
Potential Risks and Future Investments
Altman, also a guest on the podcast, cautioned about potential future difficulties: “Should a significantly cheaper energy source become available at scale in the near future, many will find themselves burdened by existing contracts.”
He continued, “If we maintain the current rate of cost reduction per unit of intelligence – averaging around 40x annually – the infrastructure requirements will become increasingly daunting.”
Altman has made personal investments in various energy technologies, including fission startup Oklo, fusion startup Helion, and Exowatt, a solar company specializing in concentrated solar power with thermal storage.
Challenges in Scaling Energy Production
However, none of these technologies are currently ready for widespread implementation. Traditional fossil fuel-based power plants, such as those utilizing natural gas, require years for construction, and new gas turbine orders may not be fulfilled until later in this decade.
This is a key reason why tech companies are rapidly increasing their adoption of solar energy, attracted by its low cost, emission-free operation, and relatively quick deployment capabilities.
Parallels Between Semiconductors and Solar Technology
A subconscious parallel may underlie this trend. Photovoltaic solar technology resembles the semiconductor industry in that it has been progressively de-risked and commoditized. Both technologies are built on silicon substrates and manufactured as modular components that are assembled into parallel arrays to increase overall power output.
The modularity and rapid deployment of solar energy align more closely with the construction timelines of data centers.
Demand Fluctuations and Jevons Paradox
Despite these advantages, both energy sources require significant build times, and demand can shift more quickly than either a data center or a solar project can be completed. Altman acknowledged the possibility of companies being left with underutilized power plants if AI efficiency improves or demand growth slows.
However, he expressed skepticism about this scenario, appearing to subscribe to Jevons paradox, which posits that increased resource efficiency leads to greater resource consumption and overall demand.
“A 100-fold reduction in the cost of compute per unit of intelligence would result in a usage increase exceeding 100-fold, unlocking numerous applications currently economically unfeasible at existing costs,” Altman explained.
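To make the arithmetic behind that claim concrete, here is a minimal sketch that assumes demand for compute follows a constant price elasticity (the elasticity values are hypothetical, not figures from the podcast): only when elasticity exceeds 1 does a 100-fold cost reduction produce a more-than-100-fold rise in usage, the Jevons-style outcome Altman describes.

```python
# Toy illustration of the Jevons-paradox arithmetic described above.
# Assumption (not from the article): demand for compute responds to its unit
# cost with a constant price elasticity; the elasticity values are hypothetical.

def total_spend(unit_cost: float, elasticity: float, baseline_demand: float = 1.0) -> float:
    """Demand scales as unit_cost ** (-elasticity); spend = unit_cost * demand."""
    demand = baseline_demand * unit_cost ** (-elasticity)
    return unit_cost * demand

# Let the unit cost of compute fall 100-fold (from 1.0 to 0.01).
for elasticity in (0.8, 1.0, 1.2):
    usage_growth = 0.01 ** (-elasticity)          # how much usage multiplies
    spend_ratio = total_spend(0.01, elasticity) / total_spend(1.0, elasticity)
    print(f"elasticity={elasticity}: usage x{usage_growth:,.0f}, spend ratio {spend_ratio:.2f}")
```

With an elasticity of 1.2, for example, a 100-fold price drop yields roughly a 250-fold increase in usage and total spending still grows, whereas an elasticity below 1 would leave capacity underused.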