Betting on the "AI Memory Supercycle," SK Hynix's 10nm DRAM production will increase eightfold next year

Wallstreetcn
2025.11.20 13:00

SK Hynix plans to significantly expand its sixth-generation 10-nanometer DRAM capacity, aiming to raise monthly production from 20,000 wafers to 160,000-190,000 wafers next year to meet demand from AI inference applications. The company has completed HBM4 supply negotiations with NVIDIA and plans to add 140,000 wafers of monthly capacity at its Icheon campus. Industry insiders expect its facility investment next year to exceed 30 trillion won, with operating profit potentially exceeding 70 trillion won, setting a record high.

SK Hynix is aggressively expanding its advanced memory chip production capacity, betting on market opportunities arising from the shift of artificial intelligence applications from training to inference.

On November 20, South Korean media reported that the country's memory chip giant SK Hynix plans to increase the monthly production capacity of its sixth-generation 10-nanometer DRAM (1c DRAM) from roughly 20,000 300mm wafers today to 160,000-190,000 wafers, an increase of eight to nine times, which would account for more than one-third of its total DRAM capacity.

Reports citing semiconductor industry insiders indicate that the expanded 1c DRAM capacity will primarily be used to produce products such as GDDR7 and SOCAMM2, meeting orders from large tech companies such as NVIDIA. The strategic shift reflects a surge in demand for cost-effective general DRAM driven by AI inference applications, as the company broadens its focus from high-bandwidth memory (HBM) to the wider AI memory market.

At the same time, as Wallstreetcn previously reported, SK Hynix recently completed HBM4 supply negotiations with NVIDIA, raising prices by over 50% to more than $500 per unit. According to media reports, the company has already sold out next year's production capacity in advance, securing a favorable pricing position in both the HBM and general DRAM markets.

Industry insiders expect that SK Hynix's capital investment next year will easily exceed 30 trillion won, a significant increase from the estimated 25 trillion won this year. Market forecasts suggest that the company's operating profit next year is expected to exceed 70 trillion won, setting a new historical high.

A Leap in Advanced-Process Capacity

SK Hynix's capacity enhancement plan focuses on the most advanced 1c DRAM technology node.

Reports citing industry insiders indicate that the company plans to add 140,000 wafers of monthly capacity through process upgrades at its Icheon plant next year, a figure regarded as the minimum increase. Some insiders say SK Hynix is also considering raising monthly capacity to 160,000-170,000 wafers.

Based on SK Hynix's current average monthly input of 500,000 DRAM wafers, more than one-third of the capacity will be allocated to advanced 1c DRAM production.
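As a rough, purely illustrative check of the figures above, using only the numbers cited in the report, the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the 1c DRAM capacity figures cited in the report.
# All values are the report's numbers, in 300mm wafers per month.
current_1c = 20_000                          # current 1c DRAM input
target_low, target_high = 160_000, 190_000   # planned 1c DRAM input range
total_dram = 500_000                         # average monthly DRAM wafer input

print(f"Expansion multiple: {target_low / current_1c:.0f}x to {target_high / current_1c:.1f}x")
print(f"Share of total DRAM input: {target_low / total_dram:.0%} to {target_high / total_dram:.0%}")
# Output: roughly 8x to 9.5x, and about 32% to 38% of total capacity,
# consistent with "an increase of eight to nine times" and "more than one-third".
```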

The company has improved the yield rate of 1c DRAM to over 80%, and this process is primarily used to manufacture the latest general DRAM products such as DDR5, LPDDR, and GDDR7.

This aggressive expansion plan reflects SK Hynix's strong confidence that AI-driven memory demand will remain robust. Analysts note that, compared with HBM, which requires complex stacking processes, 1c DRAM can be produced more efficiently, allowing a quicker response to explosive market demand.

AI Inference Applications Reshaping Market Demand Structure

The core logic behind SK Hynix's strategic adjustment lies in the shift of focus in AI applications.

Reports indicate that the company previously concentrated capacity on HBM, anticipating that HBM demand would grow faster than general DRAM. However, as AI models move from training to inference, demand for general DRAM is now expected to grow at a pace comparable to HBM. In AI inference applications, advanced general-purpose DRAM, which is more energy-efficient and cost-effective than HBM, has become the mainstream choice.

NVIDIA's recently unveiled Rubin CPX AI accelerator uses GDDR memory instead of HBM, placed directly alongside the processor. Major tech companies such as Google, OpenAI, and Amazon Web Services are also developing custom AI accelerators that integrate large amounts of general-purpose DRAM.

The SOCAMM2 memory module used by NVIDIA also employs 1c DRAM. SOCAMM is an NVIDIA-led memory module standard for AI servers and PCs that offers lower bandwidth than HBM but higher energy efficiency.

NVIDIA plans to deploy SOCAMM2 alongside its proprietary Vera CPU, and industry insiders expect SK Hynix to win a share of the supply orders.

According to DRAMeXchange data, the fixed transaction (contract) price of DDR4 broke through $7 in September, its highest level in six years and ten months. The rise stems from major memory chipmakers concentrating on expanding HBM production lines in recent years, creating a supply bottleneck for general-purpose DRAM.

An industry insider said that as the AI inference market expands rapidly, memory chip supply will not be able to keep up with demand in the short term.

HBM4 Price Increase of Over 50% Strengthens Profitability

As Wallstreetcn reported earlier, media reports indicated that SK Hynix raised the price of HBM4 in negotiations with NVIDIA by over 50% to more than $500 per unit. The HBM4 will be used in NVIDIA's next-generation AI chip, Rubin, set to be released in the second half of next year.

The significant price increase of HBM4 rests on technical foundations. The product provides 2,048 data transmission channels, double that of the previous-generation HBM3E.

In addition, the base die that connects the graphics processor and the HBM stack now incorporates logic processes for compute efficiency and power management. Given the cost increases these advances bring, SK Hynix has outsourced the HBM4 base die, which it previously produced in-house, to TSMC.

NVIDIA initially resisted the steep increase, expecting that Samsung Electronics and Micron would supply HBM4 at scale, and negotiations between the two sides stalled for a time.

However, the final supply price was settled at the level proposed by SK Hynix. An executive from SK Hynix stated that considering the process advancements and input costs, HBM4 has factors that justify a significant price increase.

In March this year, SK Hynix delivered the world's first 12-layer HBM4 stack samples to NVIDIA, and it began supplying the initial batch in June.

The company recently told institutional investors that it has locked in prices and supply volumes for products meeting NVIDIA's specifications, preserving its current profitability. This means that even if Samsung Electronics and Micron enter the HBM4 market, SK Hynix's performance will not be adversely affected.

Dual-Engine Drive Boosts Next Year's Performance Expectations

The report also pointed out that while expanding general DRAM capacity, SK Hynix has not slowed down its HBM layout.

The company is expanding its 1b DRAM capacity in order to begin official HBM4 supply to NVIDIA next year. The new M15X fab at its Cheongju campus is scheduled to start production by the end of this year, and its 1b DRAM line can produce an average of 60,000 wafers per month.

Industry estimates suggest that SK Hynix's HBM4 profit margin is around 60%. The market expects the company's HBM sales next year to be approximately 40 trillion to 42 trillion Korean won.

If the profit margin remains the same as this year, SK Hynix's HBM business alone will generate about 25 trillion Korean won in operating profit, an increase of nearly 50% compared to this year's 17 trillion Korean won.
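As a simple sanity check of the profit arithmetic above, here is a sketch using only the report's own figures, with the roughly 60% margin taken as the industry estimate cited:

```python
# Illustrative check of the HBM profit estimate, in trillions of Korean won.
hbm_sales_low, hbm_sales_high = 40, 42   # expected HBM sales next year (report's range)
hbm_margin = 0.60                        # estimated HBM4 operating margin (industry estimate)
profit_this_year = 17                    # this year's HBM operating profit

profit_low = hbm_sales_low * hbm_margin
profit_high = hbm_sales_high * hbm_margin
print(f"Implied HBM operating profit: {profit_low:.1f} to {profit_high:.1f} trillion won")
print(f"Growth vs. this year: {profit_low / profit_this_year - 1:.0%} to "
      f"{profit_high / profit_this_year - 1:.0%}")
# Output: roughly 24 to 25.2 trillion won, about 41% to 48% above this year's
# 17 trillion won, broadly matching the "about 25 trillion won, up nearly 50%" figure.
```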

Driven by the global boom in AI infrastructure investment, general DRAM prices have surged as well. Analysts estimate that as DRAM prices rise, SK Hynix's operating margin on general DRAM next year may also approach 50% to 60%. An industry insider said that SK Hynix has already pre-sold next year's capacity before the products are even manufactured, allowing it to maintain high margins.

Considering HBM4's high margins and rising general DRAM prices, the market expects SK Hynix's operating profit next year to exceed 70 trillion won, a record high. This expectation rests on HBM4's secured profitability, capacity that has been sold out in advance, and sustained strong demand for AI memory.

Risk Warning and Disclaimer

The market has risks, and investment should be cautious. This article does not constitute personal investment advice and does not take into account individual users' specific investment objectives, financial conditions, or needs. Users should consider whether any opinions, views, or conclusions in this article align with their specific circumstances. Investment based on this is at the user's own risk.