

Samsung, Nvidia Discussing HBM4, Next Level of Memory Chips

Samsung’s earnings have recovered and it’s now in ‘close discussion’ with Nvidia on HBM4 – the next generation of high-bandwidth memory chips


People walk past a Samsung Electronics booth during CES 2024, an annual consumer electronics trade show, in Las Vegas (Reuters 2024 image).

 

Samsung Electronics said on Friday it is talking to Nvidia about supplying its next-generation high-bandwidth memory (HBM) chips, known as HBM4.

The news was no surprise, given it was splashed in the local press with a photo of Samsung chairman Jay Y Lee sharing a beer and fried chicken with Nvidia CEO Jensen Huang during the latter’s trip to Korea to attend the Asia-Pacific Economic Cooperation CEO Summit.

Samsung has recently been rushing to catch up with rivals in the AI chip production race, but the outlook for the South Korean chipmaker has improved.

 

ALSO SEE: Trump Cuts US Tariffs to 47%, Xi Vows to Ease Rare Earth Curbs

 

Samsung, which plans to market the new chip next year, did not specify when it aims to start shipping HBM4, a key building block of artificial intelligence chipsets.

SK Hynix, Nvidia’s top HBM chip supplier, on Wednesday said it aims to start shipping its latest HBM4 chips in the fourth quarter and expand sales next year.

Nvidia, in a statement announcing cooperation with Samsung, said it is in “key supply collaboration for HBM3E and HBM4”, without elaborating.

 

AI-enhanced chip factory

In a separate deal, Samsung said it will purchase 50,000 high-end Nvidia chips to build an AI-enhanced semiconductor factory aimed at improving chip manufacturing speed and yields.

Samsung’s share price rose as much as 4.3% after the announcements.

Lee said Nvidia is a key customer and strategic partner and highlighted more than two decades of collaboration.

Jeff Kim, head of research at KB Securities, said HBM4 likely needs further testing but Samsung is widely seen to be in a favourable position given its production capacity.

“If Samsung supplies HBM4 chips to Nvidia, it could secure a significant market share that it was unable to achieve with previous HBM series products,” Kim said.

 

Samsung profit and shares jump

Samsung has been slower to capitalise on the AI-driven memory chip boom, leading to weaker earnings performance and a reshuffle of its chip division last year. Its earnings recovered in the latest quarter, driven by conventional memory chip demand.

This week, the company said it sells its current-generation HBM3E chips to “all related customers”, indicating it has joined rivals in supplying the latest 12-layer HBM3E chips to Nvidia.

The launch of HBM4 chips will be a major test of Samsung’s ability to regain its edge in the market, analysts said.

HBM – a dynamic random access memory (DRAM) standard first produced in 2013 – involves stacking chips vertically to save space and reduce power consumption, helping to process the large volumes of data generated by complex AI applications.

Investors are watching whether Samsung’s HBM4 can cut SK Hynix’s lead in advanced memory chips. Samsung, which is also a leading smartphone maker, said in July it had provided HBM4 samples to customers, with plans to begin supply next year.

Samsung’s share price has risen nearly 60% since July as investors expect the chipmaker to benefit from the current uptrend in memory prices and to advance in the AI race.

 

  • Reuters with additional editing by Jim Pollard

 

ALSO SEE:

China Steps up Checks on Nvidia AI Chips at Major Ports, FT Says

China Curbs Rare Earth Exports to Defence Firms, Chipmakers

Samsung, SK Hynix Building ‘Stargate Korea’ Using Open AI

Subsidies And Homegrown Tech: How China Plans To Rival Stargate

Nvidia’s $5bn Intel Investment Seen Hurting, Helping Asian Chipmakers

Chinese Tech Giants ‘Want Nvidia Chips’ Despite Beijing Pushback

TSMC Sees 30% Revenue Rise in Third Quarter Amid AI Boom

China Bought $40bn Advanced Chip Tools Despite Curbs: US Lawmakers

AI Data Centres Spark Fears on Memory Storage Devices – TH

Jim Pollard

Jim Pollard is an Australian journalist based in Thailand since 1999. He worked for News Ltd papers in Sydney, Perth, London and Melbourne before travelling through SE Asia in the late 90s. He was a senior editor at The Nation for 17+ years.