On Thursday, Nvidia supplier SK Hynix reported its best quarterly earnings since 2018 and predicted continued strong AI chip demand, including for the high bandwidth memory (HBM) used in generative AI chipsets.
“The demand for AI is exceeding expectations,” Kim Kyu Hyun, head of DRAM marketing, said on an earnings call, adding that HBM shipments next year are expected to surpass this year’s.
Nevertheless, SK Hynix shares fell as much as 8.4% in morning trade, tracking declines in U.S. tech shares such as Nvidia, as tech firms’ results failed to meet investors’ high expectations.
Thanks to its early entry and heavy investment in the segments, the world’s second-largest memory chipmaker has benefited more than its peers from AI-driven demand for high-end memory chips and enterprise solid-state drives (SSDs).
The operating profit for April-June was 5.47 trillion won ($3.96 billion), the company’s highest since the third quarter of 2018.
The figure was in line with the LSEG SmartEstimate, which is weighted toward analysts who are more consistently accurate, and compared with a loss of 2.9 trillion won a year earlier.
Revenue increased by 125% to a record-breaking 16.4 trillion won.
Artificial Intelligence Demand
Demand for high-end DRAM chips, including HBM, has pushed prices sharply higher. These chips are used in data center servers and in devices that run on-device AI services.
“DRAM prices are rising even though traditional buyer demand has not fully recovered, as chipmakers concentrate production capacity on HBM,” Chief Financial Officer Kim Woohyun said.
SK Hynix, the main supplier of HBM chips to Nvidia, leads the HBM market ahead of rivals Samsung Electronics and U.S.-based Micron. Nvidia controls roughly 80% of the AI chip market.
Sources previously told Reuters that SK Hynix began mass production of fifth-generation HBM chips, known as HBM3E, in March, with initial shipments going to Nvidia.
The chipmaker said on Thursday that it plans to ship the next iterations of its HBM chips: 12-layer HBM3E in the fourth quarter and HBM4 in the second half of 2025.
Nvidia has approved Samsung’s fourth-generation HBM, HBM3, for use in its less sophisticated H20 graphics processors developed for the Chinese market, but Samsung’s HBM3E chips have yet to meet Nvidia’s standards, sources told Reuters.
“SK Hynix is asserting its commitment to maintaining its technical leadership and remaining ahead of its competitors,” said analyst Lee Min-hee of BNK Investment & Securities. “But investor expectations are so high they may be hard to meet, and in the short term, the stock price may not rise as much.”
Analysts have predicted that HBM could account for 20% of SK Hynix’s DRAM profit by the end of 2024, up from nearly zero in the first half of 2023, as Nvidia is expected to speed up development of next-generation graphics processors to meet demand from the generative AI surge.
Kwak Noh-Jung, CEO of SK Hynix, said in May that the company’s HBM chips were sold out for this year and nearly sold out for 2025.
SK Hynix shares had risen 47% this year as of Wednesday, making the stock one of South Korea’s top picks for the AI boom.