While OpenAI races to build AI data centers, Nadella reminds us that Microsoft already has them

Microsoft Deploys First Large-Scale AI System
Microsoft CEO Satya Nadella announced on Thursday, in a video shared on Twitter, that the company has deployed its first large-scale AI system. He said it is the first of many Nvidia AI “factories” that will be rolled out across Microsoft Azure’s global data center network, specifically to support OpenAI’s computational needs.
System Architecture and Components
Each of these systems is a cluster of more than 4,600 Nvidia GB300 rack computers equipped with the highly sought-after Blackwell Ultra GPU and wired together with Nvidia’s high-speed InfiniBand networking technology.
Nvidia CEO Jensen Huang locked up a dominant position in the InfiniBand market, and extended the company’s reach beyond AI chips, by acquiring Mellanox for $6.9 billion in 2019.
Deployment Scale and Timing
Microsoft says it intends to deploy “hundreds of thousands of Blackwell Ultra GPUs” as it rolls these systems out globally, and it has published detailed technical specifications for hardware specialists who want the full breakdown.
The timing of the announcement is notable: it follows recent data center deals that OpenAI, both a key Microsoft partner and a competitor, struck with Nvidia and AMD.
OpenAI’s Infrastructure Investments
In 2025, OpenAI has announced an estimated $1 trillion worth of deals to build its own data centers, and CEO Sam Altman has signaled that more infrastructure commitments are coming.
Microsoft’s Competitive Position
Microsoft is emphasizing its existing infrastructure capabilities, highlighting its network of over 300 data centers spanning 34 countries. The company asserts it is “uniquely positioned” to address the demands of cutting-edge AI applications.
These powerful AI systems are designed to handle next-generation models featuring “hundreds of trillions of parameters.”
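For a rough sense of what “hundreds of trillions of parameters” implies, here is a back-of-the-envelope sketch (our own illustration, not figures from the announcement): assuming 8-bit weights and roughly 288 GB of memory per Blackwell Ultra GPU, the weights of a 100-trillion-parameter model alone would occupy about 100 TB, spread across hundreds of GPUs before activations, optimizer state, or any redundancy are counted.

```python
# Back-of-the-envelope arithmetic: how many GPUs it takes just to hold the
# weights of a model at this scale. All figures below are illustrative
# assumptions, not numbers from Microsoft's or Nvidia's announcements.
PARAMS = 100e12            # "hundreds of trillions" -> assume 100 trillion parameters
BYTES_PER_PARAM = 1        # assume 8-bit (FP8) weights, 1 byte each
GPU_MEMORY_BYTES = 288e9   # assumed HBM capacity per Blackwell Ultra GPU

weight_bytes = PARAMS * BYTES_PER_PARAM
gpus_for_weights = weight_bytes / GPU_MEMORY_BYTES

print(f"Weights alone: {weight_bytes / 1e12:.0f} TB of memory")
print(f"Minimum GPUs just to hold them: {gpus_for_weights:.0f}")
```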
Future Announcements
Further details regarding Microsoft’s preparations for supporting AI workloads are anticipated later this month. Kevin Scott, Microsoft CTO, is scheduled to speak at TechCrunch Disrupt, taking place from October 27th to 29th in San Francisco.