Nvidia server racks reportedly support third-party AI chips to drive networking revenue

Monday, March 23, 2026 at 09:58 PM

Nvidia's latest AI server rack designs are reportedly compatible with third-party accelerators, allowing the company to monetize its high-performance networking infrastructure even when non-Nvidia chips are deployed. This move signals a shift towards capturing value from the broader data center connectivity layer.

Context

Reports on March 24, 2026, indicate that Nvidia is shifting its data center strategy by allowing its latest server racks to support third-party AI chips. The change marks a departure from a strictly closed ecosystem: the company aims to decouple its high-margin networking hardware from its dominant GPU sales. By making its rack-scale infrastructure compatible with non-Nvidia accelerators, Nvidia seeks to capture a larger share of the global networking market; its networking business generated $31 billion in revenue in fiscal 2026. The pivot centers on driving adoption of Nvidia's high-performance interconnects, such as Spectrum-X Ethernet and NVLink, across diverse hardware environments. As hyperscalers increasingly develop in-house silicon, Nvidia's ability to provide the underlying fabric, including the GB200 NVL72 rack architecture recently contributed to the Open Compute Project, positions it as an essential provider of AI factory infrastructure. Investors are watching the shift because it casts Nvidia as a specialized networking powerhouse able to generate revenue even as the accelerator landscape grows more competitive.

Related Companies

Nvidia
NVDA
US