Counterpoint Research projects 35-fold growth in HBM demand for AI ASICs by 2028
News

Saturday, March 14, 2026 at 02:51 AM

A Counterpoint Research report forecasts a 35-fold increase in high-bandwidth memory (HBM) demand for AI ASICs between 2024 and 2028. The growth is driven by hyperscalers such as Google, Amazon, Meta, and Microsoft expanding their custom chip programs (TPU, Trainium, MTIA, and Maia, respectively). Average HBM capacity per ASIC is expected to grow fivefold by 2028, with HBM3E likely becoming the dominant standard at 56% of ASIC-related HBM demand.

Context

According to a Counterpoint Research report dated March 2026, high-bandwidth memory (HBM) demand for AI ASICs is projected to grow 35-fold between 2024 and 2028. The surge is primarily driven by hyperscalers including Google, Amazon, Meta, and Microsoft as they accelerate deployment of custom silicon such as the TPU, Trainium, MTIA, and Maia. These companies are shifting toward internal chip designs to optimize heterogeneous computing infrastructure and reduce reliance on general-purpose GPUs. The research highlights that average HBM capacity per chip will increase fivefold by 2028 to satisfy the intensifying requirements of AI training and inference. HBM3E is expected to become the dominant industry standard, accounting for 56% of ASIC-related demand thanks to its balance of performance and cost. While the specific 35-fold projection was shared via social media and attributed to Counterpoint, the underlying trend aligns with the firm's earlier 2026 analysis predicting that global AI ASIC shipments would triple by 2027.
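The two headline figures imply a rough decomposition: if total HBM demand grows 35-fold while average HBM capacity per chip grows only fivefold, implied ASIC unit shipments grow about sevenfold over the same window. This is a back-of-envelope illustration of the report's numbers, not a figure from Counterpoint itself:

```python
# Back-of-envelope decomposition of the report's projections (illustrative only;
# the per-unit split is an assumption, not a Counterpoint figure).
total_hbm_growth = 35.0   # projected HBM demand growth for AI ASICs, 2024-2028
per_chip_growth = 5.0     # projected growth in average HBM capacity per ASIC

# If total demand = unit shipments x capacity per unit,
# the implied unit-shipment growth is the ratio of the two factors.
implied_unit_growth = total_hbm_growth / per_chip_growth
print(f"Implied ASIC unit-shipment growth: {implied_unit_growth:.0f}x")

# Equivalent compound annual growth rate over the four-year window 2024 -> 2028.
years = 4
demand_cagr = total_hbm_growth ** (1 / years) - 1
print(f"Implied HBM demand CAGR: {demand_cagr:.0%}")
```

A 35-fold increase over four years corresponds to a compound annual growth rate of roughly 140%, which underlines how aggressive the projection is relative to typical memory-market cycles.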

Related Companies

Microsoft (MSFT, US)
Meta (META, US)
Google (GOOGL, US)
Amazon (AMZN, US)