A man walks past an SK Hynix logo in the lobby of the company’s Bundang office in Seongnam on January 29, 2021.
Jung Yeon-Je | AFP | Getty Images
SK Hynix, one of the world’s largest memory chip makers, said on Thursday that second-quarter profit reached its highest level in six years as the company maintains its leadership in advanced memory chips vital to artificial intelligence.
Here are SK Hynix’s second-quarter results compared to the LSEG SmartEstimate, which is weighted by analyst forecasts that are more consistently accurate:
- Revenue: 16.42 trillion Korean won (about $11.86 billion), versus 16.4 trillion Korean won expected
- Operating profit: 5.47 trillion Korean won, versus 5.4 trillion Korean won expected
Operating profit in the June quarter hit the highest level since the second quarter of 2018, recovering from a loss of 2.88 trillion won in the same period a year ago.
Revenue from April to June rose 124.7% from the 7.3 trillion won recorded a year earlier. That was the highest quarterly revenue in the company’s history, according to LSEG data going back to 2009.
SK Hynix said on Thursday that the continued rise in overall prices of its memory products — thanks to strong demand for AI memory, including high-bandwidth memory — led to a 32% increase in revenue compared to the previous quarter.
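As a quick arithmetic check on the reported figures (the won amounts above are rounded, so the computed year-over-year growth only approximates the reported 124.7%), the revenue numbers can be cross-checked as follows:

```python
# Figures as reported in the article (trillion Korean won; rounded)
revenue_q2 = 16.42       # April-June 2024 revenue
revenue_yr_ago = 7.3     # same quarter a year earlier
qoq_growth_pct = 32.0    # reported quarter-on-quarter revenue increase

# Year-over-year growth implied by the rounded figures
yoy_pct = (revenue_q2 - revenue_yr_ago) / revenue_yr_ago * 100
print(f"YoY growth: {yoy_pct:.1f}%")  # ~124.9%, in line with the reported 124.7%

# Prior-quarter revenue implied by the 32% sequential rise
implied_q1 = revenue_q2 / (1 + qoq_growth_pct / 100)
print(f"Implied Q1 revenue: {implied_q1:.2f} trillion won")  # ~12.44
```

The small gap between the computed 124.9% and the reported 124.7% comes from rounding in the published won figures.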
The South Korean giant supplies high-bandwidth memory chips that power AI chipsets for companies such as Nvidia.
Shares of SK Hynix fell as much as 7.81% on Thursday morning.
The decline came as South Korea’s Kospi index lost as much as 1.91% after U.S. tech stocks sold off overnight following disappointing Alphabet and Tesla earnings. Those reports marked investors’ first look at how the megacaps fared in the second quarter.
“In the second half of this year, strong demand from AI servers is expected to continue, as well as a gradual recovery in conventional markets with the launch of AI-enabled PCs and mobile devices,” the company said in its earnings call on Thursday.
Capitalizing on strong AI demand, SK Hynix plans to “continue its leadership in the HBM market with the mass production of 12-layer HBM3E products.”
The company will begin mass production of the 12-layer HBM3E this quarter after providing samples to major customers and expects to ship to customers by the fourth quarter.
Tight supply
Memory leaders such as SK Hynix have aggressively expanded HBM capacity to meet the growing demand for AI processors.
HBM requires more wafer capacity than conventional dynamic random access memory, or DRAM, a type of computer memory used to store data. SK Hynix said it also faces tight supply of conventional DRAM.
“Investment needs are also increasing to meet demand for conventional DRAM as well as HBM that requires more wafer capacity than conventional DRAM. Therefore, this year’s capex level is expected to be higher than what we expected at the beginning of the year,” SK Hynix said.
“While spare capacity is expected to increase next year due to increased industry investment, a significant portion of it will be used to increase HBM production. Therefore, the tight supply situation for conventional DRAMs is likely to continue.”
Daiwa Capital Markets’ SK Kim said in a June 12 note that he expects tight supply of HBM and memory to persist into 2025 amid a bottleneck in HBM production.
“Consequently, we expect a favorable price environment to continue and SK Hynix to post strong earnings in 2024-25, benefiting from its competitiveness in HBM for AI GPUs and high-density enterprise SSD (eSSD) for AI servers, leading to a re-rating of the stock,” Kim said.
Demand for high-bandwidth memory chips has surged thanks to the explosive adoption of artificial intelligence powered by large language models such as ChatGPT.
The AI boom is set to continue, and supply of high-end memory chips will remain tight this year, analysts warn. SK Hynix and Micron said in May that they are sold out of high-bandwidth memory chips for 2024, while inventory for 2025 is also nearly depleted.
Large language models require many high-performance memory chips, as these chips allow these models to remember details of past conversations and user preferences in order to generate human-like responses.
SK Hynix has largely led the high-bandwidth memory chip market, having been the sole supplier of HBM3 chips to Nvidia before rival Samsung reportedly cleared tests for the use of its HBM3 chips in Nvidia processors for the Chinese market.
The company said it expects to ship the next-generation 12-layer HBM4 from the second half of 2025.