The latest capital expenditure forecasts from the major tech players have stirred the market, but a closer look suggests the underlying growth in spending is more subdued than the headlines indicate.
That is the conclusion of analysts at RBC Capital Markets, who recently published their findings.
Leading companies like Amazon, Google, Meta, and Microsoft are projected to invest nearly $600 billion this year on data centers, chips, networking, and various related technologies to satisfy the escalating demand for AI capabilities.
While this may look like an unending acceleration in spending, RBC’s research suggests the headline growth figures are inflated largely by one factor: skyrocketing memory prices.
The analysts found that rising costs for memory chips—including DRAM, high-bandwidth memory (HBM), and NAND flash—could account for approximately 45% of the dollar increase in cloud capital expenditures anticipated for 2026. Notably, this surge is driven less by companies buying significantly more hardware than by their paying substantially more for the same components.
RBC forecasts that spending on data center memory among the top ten hyperscalers will leap from around $107 billion in 2025 to roughly $237 billion in 2026. This substantial $130 billion uptick would account for about 45% of total capital expenditure growth for these entities. Even more remarkably, around 75% of the rise in memory spending—approximately $98 billion—will be due solely to inflated prices, not increased unit volumes.
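As a rough back-of-the-envelope check on those figures, the short Python sketch below decomposes the memory-spend increase using the rounded RBC estimates quoted above; the implied total-capex-growth figure is derived from the stated 45% share, not quoted directly.

```python
# Back-of-the-envelope decomposition of RBC's memory-spend figures
# (rounded, in billions of USD). The implied total capex growth is
# derived from the stated 45% share, not quoted in the note.
memory_2025 = 107
memory_2026 = 237

memory_increase = memory_2026 - memory_2025          # ~ $130B uptick
price_driven = 0.75 * memory_increase                 # ~ $98B from higher prices
volume_driven = memory_increase - price_driven        # remainder from extra units

implied_total_capex_growth = memory_increase / 0.45   # ~ $289B (derived)

print(f"Memory spend increase:      ${memory_increase:.0f}B")
print(f"  driven by price:          ${price_driven:.0f}B")
print(f"  driven by volume:         ${volume_driven:.0f}B")
print(f"Implied total capex growth: ${implied_total_capex_growth:.0f}B (derived)")
```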
The price surge is stark. Projections from TrendForce, cited by RBC, indicate that DRAM prices could more than double in 2026, with NAND prices projected to increase by over 85%. Memory has emerged as one of the most constrained resources in AI infrastructure, particularly because advanced GPUs require vast amounts of high-performance DRAM and HBM, while AI data centers rely heavily on extensive flash storage.
When memory costs are stripped out, the picture of Big Tech’s spending growth shifts dramatically: capital expenditure growth excluding memory is expected to slow to around 40% in 2026 from an estimated 80% in 2025. That still reflects continued expansion, but it is far less explosive than the raw capital expenditure figures imply.
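To make the mechanics of that adjustment concrete, here is a minimal sketch of an ex-memory growth calculation. Only the 2025/2026 memory-spend figures come from the RBC estimates above; the total-capex inputs are hypothetical placeholders chosen so the memory increase matches the roughly 45% share cited earlier.

```python
# Ex-memory capex growth: strip memory spend from each year's total before
# computing the year-over-year growth rate. Memory figures are from the RBC
# estimates above; the total-capex inputs are hypothetical placeholders.
def ex_memory_growth(total_prev, total_next, memory_prev, memory_next):
    """Year-over-year capex growth excluding memory spend."""
    core_prev = total_prev - memory_prev
    core_next = total_next - memory_next
    return (core_next - core_prev) / core_prev

# Hypothetical totals (billions of USD), purely to illustrate the mechanics.
total_2025, total_2026 = 500, 790
memory_2025, memory_2026 = 107, 237

headline = (total_2026 - total_2025) / total_2025
core = ex_memory_growth(total_2025, total_2026, memory_2025, memory_2026)

print(f"Headline capex growth:  {headline:.0%}")   # ~58% with these placeholders
print(f"Ex-memory capex growth: {core:.0%}")       # ~41%, close to RBC's ~40%
```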
The analysts characterized this slowdown as “a notable deceleration,” but clarified that it is “not necessarily cause for alarm.”
RBC emphasized that underlying investments in AI remain vibrant, but cautioned that memory pricing poses significant uncertainty for capital expenditure trends heading into 2027.
In essence, Big Tech may be spending considerably more on certain equipment without expanding its infrastructure in proportion to that spending. The situation highlights the intersection of the AI development race and an overheated memory market.