Microsoft initially planned to launch Braga by 2025 to power its Azure cloud servers, offering a cheaper alternative to Nvidia's market-leading GPUs. Yet unforeseen obstacles, including last-minute design changes, staffing shortages, and high turnover, have slowed progress.
Early benchmarks indicate Braga's performance will fall short of Nvidia's Blackwell chip, which debuted in late 2024. That would put Microsoft at a serious disadvantage in the AI hardware arms race.
Microsoft's partnership with OpenAI added another complication. According to high-level sources, OpenAI pushed for feature changes late in development, which created instability in simulations. Microsoft also quietly canceled a dedicated AI training chip earlier this year, throwing more sand in the gears of its plans.
Rivals Are Pulling Ahead
As Microsoft scrambles to regain parity, its rivals keep moving forward and aren't going to sit still in the meantime.
- Google is expanding deployment of its custom Tensor Processing Units (TPUs) in its data centers.
- Amazon is preparing its next-generation Trainium3 chip for AI workloads.
Both stand to reap big dividends in performance and cost savings, leaving Microsoft to play catch-up.
The delay also threatens Azure's profitability. Remaining locked into Nvidia's chips translates to higher costs and less leverage: miserable math for a company betting heavily on AI-fueled cloud services.
The setback highlights just how difficult the new AI chip gold rush has become. Nvidia is off to a blazing start, and its rivals are increasingly developing their own silicon. Microsoft's road to hardware autonomy has grown significantly steeper.