SK Hynix has completed development of its next-generation high-bandwidth memory (HBM) chip for AI workloads ahead of rivals, with the HBM4 now ready for mass production.
The South Korean chipmaker expects HBM4 to improve AI service performance by up to 69 per cent, which it said could significantly reduce data centre power costs. Compared with the previous generation, bandwidth has doubled and power efficiency has improved by 40 per cent, it noted.
Justin Kim, head of AI Infra at SK Hynix, said HBM4 will be a core product for overcoming technological challenges.
The company said it used its advanced mass reflow moulded underfill (MR-MUF) process and fifth-generation 10nm technology to minimise risks in mass production.
It shipped samples of its sixth-generation, 12-layer HBM4 to customers in March, ahead of rivals Samsung and Micron Technology.
In April SK Hynix, a major supplier of HBM to Nvidia, forecast demand to double this year.
Samsung has invested aggressively in sixth-generation DRAM capacity for HBM4, but Chosun Biz reported it is about two months behind competitors in testing and is working to accelerate its timeline. It is preparing samples to send to clients for testing.
Micron is reportedly entering the final testing phase of HBM for Nvidia.
Earlier in the week, SK Hynix began shipping its latest high-performance mobile NAND product, the ZUFS 4.1, to smartphone customers.