Offering some rare insight into the scale of HBM memory sales – and their growth in the face of unprecedented demand from AI accelerator vendors – SK hynix recently disclosed that it expects HBM sales to make up "a double-digit percentage of its DRAM chip sales" this year. Which, if it comes to pass, would represent a significant jump in sales for the high-bandwidth, high-priced memory.

As first reported by Reuters, SK hynix CEO Kwak Noh-Jung has commented that he expects HBM sales to constitute a double-digit percentage of the company's DRAM chip sales in 2024. This prediction aligns with estimates from TrendForce, which believes that, industry-wide, HBM will account for 20.1% of DRAM revenue in 2024, more than doubling HBM's 8.4% revenue share in 2023.

And while SK hynix does not break down its DRAM revenue by memory type on a regular basis, a bit of extrapolation indicates that they're on track to take in billions in HBM revenue for 2024 – having likely already crossed the billion-dollar mark in 2023. Last year, SK hynix's DRAM revenue totaled $15.941 billion, according to Statista and TrendForce. So SK hynix only needs 12.5% of its 2024 revenue to come from HBM (assuming flat or positive revenue overall) in order to pass $2 billion in HBM sales. And even this is a low-ball estimate.
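The arithmetic behind that threshold can be sketched quickly; the 12.5% figure here is the hypothetical share being tested, not a reported number:

```python
# Back-of-the-envelope check, using figures quoted in the article.
dram_revenue_2023 = 15.941  # SK hynix 2023 DRAM revenue, $ billions (Statista/TrendForce)
hbm_share = 0.125           # hypothetical HBM share of 2024 DRAM revenue

# Assuming 2024 DRAM revenue is at least flat versus 2023:
hbm_revenue = dram_revenue_2023 * hbm_share
# ≈ $1.99 billion at exactly flat revenue, so any year-on-year growth
# pushes the figure past the $2 billion mark.
```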

Overall, SK hynix currently commands about 50% of the HBM market, having largely split the market with Samsung over the last couple of years. Given that share, and that DRAM industry revenue is expected to increase to $84.150 billion in 2024, SK hynix could earn as much as $8.45 billion on HBM in 2024 if TrendForce's estimates prove accurate.
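That upper bound follows directly from TrendForce's projections; a quick sketch of the math (the 50% market share is an approximation, as noted above):

```python
# Upper-bound estimate of SK hynix's 2024 HBM revenue from TrendForce figures.
industry_dram_2024 = 84.150  # projected industry-wide DRAM revenue, $ billions
hbm_revenue_share = 0.201    # TrendForce: HBM share of DRAM revenue in 2024
sk_hynix_hbm_share = 0.50    # approximate SK hynix share of the HBM market

industry_hbm = industry_dram_2024 * hbm_revenue_share  # ≈ $16.9 billion industry-wide
sk_hynix_hbm = industry_hbm * sk_hynix_hbm_share       # ≈ $8.46 billion for SK hynix
```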

It should be noted that with demand for AI servers at record levels, all three leading makers of DRAM are poised to increase their HBM production capacity this year. Most notable here is a nearly-absent Micron, which was the first vendor to start shipping HBM3E memory to NVIDIA earlier this year. So SK hynix's near-majority share of the HBM market may erode somewhat this year, though with a growing pie they'll have little reason to complain. Ultimately, if sales of HBM reach $16.9 billion as projected, then all memory makers will be enjoying significant HBM revenue growth in the coming months.

Sources: Reuters, TrendForce

6 Comments

  • Diogene7 - Friday, March 29, 2024 - link

I am wondering if down the line, in a few years' time, when the HBM manufacturing fabs and tools will have depreciated, it would be realistic to hope to see HBM compete with GDDR or LPDDR DRAM memory in premium consumer devices?

Would it be possible to make a low-power-consumption HBM memory for premium mobile devices in order to further lower memory power consumption in those devices? (Ex: a 32GB low-power HBM stack for premium smartphones / tablets).

    Then one step further, a Non-Volatile-Memory (NVM) SOT-MRAM HBM stack: imagine what new opportunities a 32GB HBM NVM SOT-MRAM stack could enable, especially in mobile devices!
  • meacupla - Friday, March 29, 2024 - link

    That would be a dream come true, but I doubt we will see HBM on CPUs for at least 5 years.
    Cost-wise, it is: HBM >>>>>>>>> GDDR >>> LPDDR/DDR

    On-package LPDDR5x seems to be "fast enough" for the next few years.
    Apple uses it on their M-series chips, and Intel will use it on Lunar Lake MX.
    I'm not sure if AMD or Snapdragon has an on-package design in the works.
    I know AMD is planning to use quad-channel DRAM for their upcoming Strix Halo.
  • Diogene7 - Saturday, March 30, 2024 - link

    Thanks for the feedback.

    Yes, unfortunately, realistically if HBM would come to consumer devices, I wouldn’t expect it before 2028/2030 (I wish it would be much sooner than that).

    But really, one of my dreams is to see spintronics Non-Volatile-Memory (NVM) MRAM be integrated in computing devices, especially mobile devices, as it would enable / unlock so many new opportunities.

    Spintronics is really disruptive, and there should be much more of the US CHIPS Act funding allocated to scaling up MRAM to High-Volume-Manufacturing (HVM) (ex: Avalanche Technology's MRAM on 300mm wafers): it would help kickstart the next evolution in Beyond CMOS computing technology.
  • BaronMatrix - Monday, April 1, 2024 - link

    It has been years since I've posted here... The most interesting thing about the growth of HBM and Hynix's share is that AMD partnered with Hynix to create HBM for Radeon Fiji GPUs, then passed it on to JEDEC, making it royalty-free for Micron and Samsung to make and for Nvidia and Intel to use...
  • lemurbutton - Monday, April 1, 2024 - link

    It wouldn't have been developed further if AMD didn't make it royalty free. Someone else would have invented an alternative.
  • Oxford Guy - Monday, April 8, 2024 - link

    It depends upon the size of the royalty and other variables.
