
Microsoft shows off its first Nvidia-powered "AI factory" for OpenAI workloads
Editorial Staff
October 21, 2025
Microsoft CEO Satya Nadella has reminded the tech world that while OpenAI races to build AI data centers, Microsoft already has them up and running. On Thursday, Nadella shared a video of the company's first deployed large-scale AI system, calling it the first of many Nvidia-powered "AI factories" that will run OpenAI workloads across Azure's global network. Each system contains more than 4,600 Nvidia GB300 rack computers equipped with Blackwell Ultra GPUs, linked by Nvidia's InfiniBand networking technology. Microsoft says it plans to deploy hundreds of thousands of these GPUs as it rolls the systems out worldwide.
The announcement comes shortly after OpenAI secured major data center partnerships with Nvidia and AMD, backed by an estimated $1 trillion in commitments, and OpenAI CEO Sam Altman has indicated that more such deals are on the way. Microsoft's message is clear: it already operates more than 300 data centers in 34 countries and is positioned to handle the most demanding AI workloads today. The company emphasized that these facilities can support the next generation of AI models with hundreds of trillions of parameters, underscoring the scale and readiness of its infrastructure.
The timing highlights the competitive tension between Microsoft and OpenAI, which are partners but also rivals in the AI race. By showcasing its operational advantage, Microsoft is signaling to both investors and customers that it has the hardware muscle to stay ahead. More details on Microsoft's AI infrastructure push are expected later this month, when CTO Kevin Scott speaks at TechCrunch Disrupt in San Francisco.