News
AI demand projected to consume 20% of global DRAM capacity by 2026 as inference needs surge
Monday, December 1, 2025 at 09:54 AM
Experts and industry sources estimate that AI applications, with their appetite for high-bandwidth memory such as HBM and GDDR7, could consume nearly 20% of global DRAM capacity by 2026. This surge is attributed to the heavy memory requirements of long-context inference, which creates a "memory black hole" effect that may squeeze supply and raise prices across the PC, mobile, and server markets.
Context
AI demand is projected to consume nearly 20% of global DRAM capacity by 2026 as the industry pivots from raw compute power to memory-intensive inference. High-speed memory like HBM and GDDR7 is the new battlefield, with cloud demand expected to reach 3 EB (exabytes). Major players including Google, Amazon, Meta, and Apple are fueling this "memory black hole" as they scale long-context models that require massive amounts of intermediate data during real-time processing.
This shift creates a significant supply-side squeeze because producing 1 GB of HBM consumes roughly 4 GB of standard DRAM wafer capacity. With global capacity growing at only 10%–15% annually, the AI surge is expected to trigger shortages and price hikes for standard DDR5 in PCs and smartphones. The trend prioritizes high-margin AI production, potentially leaving traditional hardware markets facing restricted supply and higher costs through the end of the decade.
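The squeeze described above is, at heart, simple arithmetic: the article's 4:1 HBM-to-wafer ratio multiplied against a capacity base that compounds at only 10%–15% a year. A minimal sketch of that arithmetic, using the article's ratio and growth rate but entirely hypothetical volume figures:

```python
# Illustrative sketch of the article's supply-squeeze arithmetic.
# Figures taken from the article: 1 GB of HBM consumes ~4 GB of standard
# DRAM wafer capacity; global capacity grows ~10-15% per year.
# All absolute volumes below are hypothetical placeholders, not real data.

HBM_TO_DRAM_WAFER_RATIO = 4  # 1 GB HBM ~= 4 GB standard-DRAM wafer capacity


def wafer_equivalent_gb(hbm_gb: float) -> float:
    """Standard-DRAM wafer capacity consumed by a given amount of HBM."""
    return hbm_gb * HBM_TO_DRAM_WAFER_RATIO


def capacity_after_years(capacity_gb: float, annual_growth: float, years: int) -> float:
    """Project total DRAM wafer capacity under compound annual growth."""
    return capacity_gb * (1 + annual_growth) ** years


# A hypothetical 100 units of HBM demand ties up 400 units of wafer capacity.
print(wafer_equivalent_gb(100))  # -> 400

# At 12% annual growth, a hypothetical base of 1000 capacity units becomes
# roughly 1254 units after two years -- far slower than AI demand growth,
# which is what shrinks the share left for PCs and smartphones.
print(round(capacity_after_years(1000, 0.12, 2), 1))  # -> 1254.4
```

The point of the sketch: because HBM trades against standard DRAM at 4:1, every unit of AI memory demand removes four units from the pool that PC and mobile DDR5 draw on, while the pool itself compounds only slowly.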
Related Companies
SK Hynix
000660
Samsung Electronics
005930