
Nvidia supplier SK Hynix reports high-bandwidth memory chips used in AI processors almost sold out for 2025

SK Hynix reports that its high-bandwidth memory (HBM) chips used in AI processors are almost sold out for 2025. The company will begin sending samples of its latest 12-layer HBM3E chip in May and plans to start mass production in Q3. Demand for HBM chips is expected to continue growing as data and AI model sizes increase.

This comes as businesses aggressively expand their AI services, driving up the demand for these chips.


The South Korean company, the world's second-largest memory chip maker, will begin sending samples of its latest HBM chip, the 12-layer HBM3E, in May and plans to start mass-producing it in the third quarter.


SK Hynix CEO Kwak Noh-Jung said at a news conference that the HBM market is expected to keep growing as data and AI model sizes increase, with annual demand growth projected at around 60% over the mid to long term.


Until March, SK Hynix was the sole supplier of HBM chips to Nvidia. It competes with US rival Micron Technology and domestic giant Samsung Electronics in the HBM market, and analysts suggest that major AI chip buyers are keen to diversify their suppliers to better maintain operating margins. Nvidia currently holds approximately 80% of the global AI chip market.


Micron Technology has also reported that its HBM chips are sold out for 2024, with most of its 2025 supply already allocated. The company plans to provide samples of its 12-layer HBM3E chips to customers in March.


Jeff Kim, head of research at KB Securities, explained that demand for ultra-high-performance chips such as the 12-layer parts is growing faster than demand for 8-layer HBM3Es, driven by the rapid upgrading of AI functions and performance.


Samsung, another major player in the HBM market, is set to produce its 12-layer HBM3E chips in the second quarter. The company recently announced that its HBM chip shipments this year are expected to increase more than threefold, though it did not provide further details.


In a bid to strengthen its position in the market, SK Hynix revealed plans to invest $3.87 billion in building an advanced chip packaging plant in Indiana, USA, which will include an HBM chip line. Additionally, the company will invest 5.3 trillion won (approximately $3.9 billion) in a new DRAM chip factory in South Korea, with a focus on HBMs.


SK Hynix CEO Kwak highlighted that the company's investment in HBM chips differs from past patterns in the memory chip industry: instead of expanding production capacity first, it secures demand before scaling up. According to Justin Kim, SK Hynix's head of AI infrastructure, AI chips, including HBMs and high-capacity DRAM modules, are expected to account for 61% of all memory volume by value by 2028, up from just 5% last year.


During a recent post-earnings conference call, SK Hynix also warned of a potential shortage of regular memory chips for smartphones, personal computers, and network servers by the end of the year if demand for tech devices exceeds expectations.

 


Source: SCMP

As Asia becomes the fastest-growing region for tech adoption, biz360tv is committed to keeping readers up to date on the latest developments in business technology news in Asia and beyond.

While we use new technologies such as AI to improve our storytelling capabilities, our team carefully selects the stories and topics to cover, and every article goes through fact-checking, editing, and oversight before publication. Please contact us at editorial@tech360.tv if you notice any errors or inaccuracies. Your feedback helps ensure that our articles remain accurate for all of our readers.
