
Supply Squeeze: NVIDIA Reportedly Switching to Mobile-Style Memory; Analysts Warn Server Memory Prices Could Double Next Year

A Counterpoint Research report states that NVIDIA is switching its AI server memory from DDR5 to mobile LPDDR chips to reduce power consumption, with demand equivalent to that of a major smartphone manufacturer. The memory market is already in short supply as manufacturers shift capacity to high-end AI chips, and NVIDIA's move exacerbates the situation. Analysts expect server memory prices to double by the end of 2026, raising costs for cloud service providers and AI developers.
NVIDIA's decision to adopt smartphone-style memory chips in its AI servers may double server memory prices by the end of 2026 and worsen the memory shortage already straining the global electronics supply chain.
On November 19, market research firm Counterpoint Research released a report stating that NVIDIA recently decided to switch the memory chips in its AI servers from traditional server-grade DDR5 to low-power LPDDR chips typically used in smartphones and tablets, in order to reduce server energy costs. The shift will make NVIDIA an LPDDR customer on par with a major smartphone manufacturer, a scale of demand the supply chain cannot easily absorb.
In the past two months, the global electronics supply chain has been under pressure from a shortage of traditional memory chips, as manufacturers have shifted their focus to high-end memory chips suited to AI applications. NVIDIA's move may transmit the supply tightness in the low-end market upstream, forcing chip manufacturers to weigh whether to allocate more capacity to LPDDR production.
Analysts point out that rising server memory prices will increase costs for cloud service providers and AI developers, further intensifying budget pressures on data centers. Currently, these companies are already facing financial strain due to record spending on graphics processors and power upgrades.
Technological Shift Triggers Supply Chain Shock
Counterpoint Research noted in the report that NVIDIA's technological route adjustment will cause a "seismic" impact on the memory supply chain. The company originally used DDR5 memory chips for AI servers but recently decided to switch to LPDDR low-power memory chips.
The core driver of this shift is reducing the power cost of AI servers. However, each AI server requires far more memory chips than a mobile device such as a smartphone, making NVIDIA's demand equivalent in scale to that of a major smartphone manufacturer and creating a sudden demand surge within a short period.
Counterpoint stated that the supply chain is not prepared to handle demand of this scale. The firm warned:
"The greater risk looming is in the advanced memory sector, as NVIDIA's recent shift to LPDDR means they become a customer on par with major smartphone manufacturers—this is a seismic shift for the supply chain."
Multiple Shortages Compound Pressure
The global memory market is facing supply tightness on multiple fronts. Over the past two months, a shortage of traditional memory chips has rippled through the global electronics supply chain, stemming from manufacturers shifting production toward high-end memory chips for AI semiconductor applications.
Major memory suppliers like Samsung Electronics, SK Hynix, and Micron have faced supply shortages after reducing production of older dynamic random-access memory products. These manufacturers have concentrated their capacity on high-bandwidth memory production, which is essential for the advanced accelerators driving the global AI boom.
Counterpoint pointed out that supply tightness in the low-end market risks being transmitted upstream. Chip manufacturers are weighing whether to allocate more factory capacity to LPDDR production to meet NVIDIA's demand, which may further tighten supply of other memory products.
Price Surge Pressure Passed to End Users
Counterpoint predicts that server memory chip prices will double by the end of 2026. This price increase will directly raise the operating costs for cloud service providers and AI developers.
For data center operators, rising memory costs will add new pressure to already strained budgets. These companies are spending at record levels on graphics processing unit and power infrastructure upgrades, and a doubling of server memory prices may further squeeze profit margins.
This cost pressure may ultimately be passed on to cloud computing service pricing and AI application development costs, affecting the return on investment for AI across the entire technology industry.

