Tech giant Meta Platforms has sent ripples through the market with its ambitious AI expansion plans, earmarking a staggering $115 billion to $135 billion for capital expenditure this year. The lion’s share of this investment will fuel its artificial intelligence infrastructure.
This massive financial commitment, representing an 87% year-on-year surge, has positioned chip manufacturers as key beneficiaries. Meta recently unveiled substantial agreements with both Nvidia and Advanced Micro Devices (AMD), securing access to their cutting-edge GPUs and CPUs for its expanding data centres. These multi-billion-dollar deals signal a strategic push to build out Meta's AI capabilities.

Meta’s AI endeavours are far-reaching. While its Llama large language model (LLM) grabs headlines, the company’s entire ecosystem relies on sophisticated AI algorithms to curate content for its 3.5 billion monthly active users. Optimising data centres for peak efficiency is therefore paramount. Meta believes different chips are best suited for different tasks.
The agreement with Nvidia encompasses its complete chip suite, including GPUs, CPUs and Ethernet switches. This multi-year collaboration involves co-designing Meta's next-generation AI models to harness the full potential of Nvidia's hardware. The primary focus appears to be building data centres tailored for LLM training.
By contrast, Meta's deal with AMD centres on deploying its MI450 GPUs and 6th-generation EPYC CPUs within AMD's Helios rack-scale architecture. According to Meta CEO Mark Zuckerberg, the partnership with AMD will focus on "efficient inference compute", suggesting that AMD's chips will be dedicated primarily to inference tasks.
AMD has made significant strides in AI inference performance relative to Nvidia. The MI450 GPUs, integrated into the Helios rack system, are projected to deliver better price-performance than Nvidia's Rubin platform, an advantage that stems from the platform's architecture and the larger high-bandwidth memory capacity within the chip package. Furthermore, AMD's MI450 chips will be customised for Meta's AI models, further enhancing performance.
Analysts predict that inference will constitute the bulk of AI computing this year, and AMD's long-term agreement with Meta positions it favourably to capitalise on this expanding market.
Here’s a breakdown of the key aspects of each deal:
| Feature | Nvidia | AMD |
|---|---|---|
| Chip Focus | GPUs, CPUs, Ethernet Switches | MI450 GPUs, 6th-Gen EPYC CPUs |
| Primary Application | LLM Training | AI Inference |
| Key Advantage | Co-designing next-gen AI models | Custom-designed chips for optimal inference |
| Architecture | N/A | Helios rack-scale |
