
Microsoft acquires twice as many Nvidia AI chips as tech rivals


Microsoft bought twice as many of Nvidia’s flagship chips as any of its largest rivals in the US and China this year, as OpenAI’s biggest investor accelerated its investment in artificial intelligence infrastructure.

Analysts at Omdia, a technology consultancy, estimate that Microsoft bought 485,000 of Nvidia’s “Hopper” chips this year. That put Microsoft far ahead of Nvidia’s next-biggest US customer, Meta, which bought 224,000 Hopper chips, as well as its cloud computing rivals Amazon and Google.

With demand for Nvidia’s most advanced graphics processing units (GPUs) outstripping supply for much of the past two years, Microsoft’s chip hoard has given it an edge in the race to build the next generation of AI systems.

This year, Big Tech companies have spent tens of billions of dollars on data centres running Nvidia’s latest chips, which have become the hottest commodity in Silicon Valley since the debut of ChatGPT two years ago kick-started an unprecedented surge of investment in AI.

Microsoft’s Azure cloud infrastructure was used to train OpenAI’s latest o1 model, as the two companies race against a resurgent Google, start-ups such as Anthropic and Elon Musk’s xAI, and rivals in China for dominance of the next generation of computing.

Omdia estimates ByteDance and Tencent each ordered about 230,000 of Nvidia’s chips this year, including the H20 model, a less powerful version of Hopper that was modified to meet US export controls for Chinese customers.

Amazon and Google, which along with Meta are stepping up deployment of their own custom AI chips as an alternative to Nvidia’s, bought 196,000 and 169,000 Hopper chips respectively, the analysts said.

Omdia analyses companies’ publicly disclosed capital spending, server shipments and supply chain intelligence to calculate its estimates.

[Chart: 2024 shipments of Nvidia Hopper GPUs ('000), showing Microsoft's spending on Nvidia's AI chips has far outstripped rivals']

The value of Nvidia, which is now starting to roll out Hopper’s successor Blackwell, has soared to more than $3tn this year as Big Tech companies rush to assemble increasingly large clusters of its GPUs.

However, the stock’s extraordinary surge has waned in recent months amid concerns about slower growth, competition from Big Tech companies’ own custom AI chips and potential disruption to its business in China from Donald Trump’s incoming administration in the US.

ByteDance and Tencent have emerged as two of Nvidia’s biggest customers this year, despite US government restrictions on the capabilities of American AI chips that can be sold in China.

Microsoft, which has invested $13bn in OpenAI, has been the most aggressive of the US Big Tech companies in building out data centre infrastructure, both to run its own AI services such as its Copilot assistant and to rent out to customers through its Azure division.

Microsoft’s orders this year are more than triple the number of Hopper chips it purchased in 2023, when Nvidia was racing to scale up production of the processor following ChatGPT’s breakout success.

“Good data centre infrastructure, they’re very complex, capital intensive projects,” Alistair Speirs, Microsoft’s senior director of Azure Global Infrastructure, told the Financial Times. “They take multi-years of planning. And so forecasting where our growth will be with a little bit of buffer is important.”

Tech companies around the world will spend an estimated $229bn on servers in 2024, according to Omdia, led by Microsoft’s $31bn in capital expenditure and Amazon’s $26bn. The top 10 buyers of data centre infrastructure — which now include relative newcomers xAI and CoreWeave — make up 60 per cent of global investment in computing power.

Vlad Galabov, director of cloud and data centre research at Omdia, said some 43 per cent of spending on servers went to Nvidia in 2024.

“Nvidia GPUs claimed a tremendously high share of the server capex,” he said. “We’re close to the peak.”

[Chart: Capital expenditure on servers, 2024 ($bn), showing Big Tech's biggest spenders in the AI data centre boom]

While Nvidia still dominates the AI chip market, its Silicon Valley rival AMD has been making inroads. Meta bought 173,000 of AMD’s MI300 chips this year, while Microsoft bought 96,000, according to Omdia.

Big Tech companies have also stepped up usage of their own AI chips this year, as they try to reduce their reliance on Nvidia. Google, which has for a decade been developing its “tensor processing units”, or TPUs, and Meta, which debuted the first generation of its Meta Training and Inference Accelerator chip last year, each deployed about 1.5mn of their own chips.

Amazon, which is investing heavily in its Trainium and Inferentia chips for cloud computing customers, deployed about 1.3mn of those chips this year. Amazon said this month that it plans to build a new cluster using hundreds of thousands of its latest Trainium chips for Anthropic, an OpenAI rival in which Amazon has invested $8bn, to train the next generation of its AI models.

Microsoft, however, is far earlier in its effort to build an AI accelerator to rival Nvidia’s, with only about 200,000 of its Maia chips installed this year.

Speirs said that using Nvidia’s chips still required Microsoft to make significant investments in its own technology to offer a “unique” service to customers. 

“To build the AI infrastructure, in our experience, is not just about having the best chip, it’s also about having the right storage components, the right infrastructure, the right software layer, the right host management layer, error correction and all these other components to build that system,” he said.


December 18 2024
