News
Morgan Stanley: Google's 2024 CAPEX on Nvidia to be Double Spending on TPUs
Wednesday, November 26, 2025 at 06:37 AM
An analyst report estimates Google's 2024 capital expenditure on Nvidia hardware at roughly $20 billion, about twice what it spent on its custom Tensor Processing Units (TPUs). The report anticipates this spending mix will partially rebalance next year, though competition among LLM providers will remain fierce.
Context
Recent analysis highlights a significant capital cost advantage for Google when building large-scale AI infrastructure. For clusters of approximately 9,000 accelerators, Google's custom networking is estimated to be up to 2.5x cheaper than systems using Nvidia's proprietary InfiniBand fabric. This cost differential is critical as networking becomes a dominant capital expense in massive AI supercomputers.
The advantage stems from Google's use of a highly optimized, large-scale Ethernet-based fabric for its 8,960-chip TPU pods, which avoids the premium hardware and support costs associated with InfiniBand. For investors, this suggests Google may achieve higher margins on its cloud AI services and possess a structural cost advantage. It also presents a long-term risk to Nvidia's high-margin networking business, as other major cloud providers are also exploring Ethernet to reduce their infrastructure spending.
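The claimed cost gap can be illustrated with a back-of-the-envelope calculation. The per-accelerator dollar figures below are purely hypothetical assumptions for illustration; only the roughly 9,000-accelerator cluster size and the "up to 2.5x cheaper" ratio come from the analysis above.

```python
# Hypothetical comparison of cluster networking cost at ~9,000 accelerators.
# Per-chip costs are illustrative assumptions, NOT figures from the report.

ACCELERATORS = 9_000

# Assumed per-accelerator fabric cost (switches, NICs/optics, cabling, support):
infiniband_per_chip = 5_000   # hypothetical premium InfiniBand cost, USD
ethernet_per_chip = 2_000     # hypothetical optimized Ethernet cost, USD

ib_total = ACCELERATORS * infiniband_per_chip
eth_total = ACCELERATORS * ethernet_per_chip
ratio = ib_total / eth_total

print(f"InfiniBand fabric: ${ib_total / 1e6:.0f}M")   # → $45M
print(f"Ethernet fabric:   ${eth_total / 1e6:.0f}M")  # → $18M
print(f"Cost ratio: {ratio:.1f}x")                    # → 2.5x with these inputs
```

With these assumed inputs the gap on a single 9,000-chip cluster runs into the tens of millions of dollars, which is why the fabric choice matters at hyperscale, where dozens of such clusters are deployed.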
Sources (39)
Google Vs. Nvidia: Inside The AI Hardware Showdown
Are Google chips really competitive? : r/NVDA_Stock
Nvidia's strengths are not in its X posts - Yahoo Finance
100000 H100 Clusters: Power, Network Topology, Ethernet ...
Nvidia-Google AI Chip Rivalry Escalates on Report of Meta ...
The $250 Billion Shock: How Google Just Shattered ...
Report: OpenAI Business Breakdown & Founding Story
Why Nvidia Stock Could Reach A $20 Trillion Market Cap ...
Related Companies
Nvidia (NVDA)
Google (GOOGL)