Samsung To Launch HBM3P Memory, Codenamed “Snowbolt”, With Up To 5 TB/s Bandwidth Per Stack
By: Jason R. Wilson
Samsung Electronics is set to reveal its next-generation high-bandwidth memory, HBM3P, codenamed “Snowbolt.” Snowbolt will join the ranks of previous generations of HBM products from Samsung, such as Flarebolt, Aquabolt, Flashbolt, and Icebolt.
Snowbolt Is The Name For Samsung’s Next-gen HBM3P Memory With Up To 7.2 Gbps Transfer Speeds By 2024
ZDNet Korea reports that Samsung filed a trademark application with the Patent Information Search Service (KIPRIS) for its latest DRAM, HBM3P Snowbolt, on April 26, 2023. The memory is expected to launch in the second half of this year.
The documents reveal that the Snowbolt codename will apply to high-bandwidth DRAM modules destined for HPC cloud systems, supercomputers, and artificial intelligence workloads.
Snowbolt is the brand name of Samsung Electronics’ next-generation HBM DRAM products, and it has not yet been decided which generation of products the name will be used for.
— an official representative from Samsung Electronics
Previous generations in Samsung Electronics’ HBM line were:
- Flarebolt: the company’s first-generation HBM2 memory
- Aquabolt: the second-generation HBM2 memory, released in 2018
- Flashbolt: the third-generation HBM2E memory (2020)
- Icebolt: HBM3 memory, initially released in prototype form and expected to enter mass production later this year
During a conference call with investors and the media last month, a spokesperson for Samsung Electronics was quoted stating,
We have already supplied HBM2 and HBM2E products to major customers, promptly providing the highest-performance and highest-capacity products that meet the needs and technology trends of the AI market, as well as HBM3 (16 GB and 12 GB). 24 GB products are also being sampled, and preparations for mass production have already been completed.
Not only the current HBM3 but also the next-generation HBM3P product, with the higher performance and capacity required by the market, is being prepared for the second half of the year with the industry’s best performance.
HBM3P is expected to deliver around 10% higher overall performance than its predecessor, and customers will also benefit from the advanced multi-layer stacking and wider memory interface implementations expected to feature in HBM3P next year. The roadmap also shows HBM3P with PIM (Processing-In-Memory) by 2025 and HBM4 by 2026.
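Taken at face value, the 7.2 Gbps per-pin figure implies per-stack bandwidth just under 1 TB/s, so the headline 5 TB/s figure would presumably describe a multi-stack configuration. A minimal sketch of the arithmetic, assuming the standard 1024-bit HBM stack interface (the 6.4 Gbps HBM3 and 7.2 Gbps HBM3P per-pin speeds are the commonly cited figures, treated here as assumptions):

```python
# Rough per-stack HBM bandwidth from per-pin transfer speed.
# Assumes the JEDEC-standard 1024-bit interface per HBM stack.
BUS_WIDTH_BITS = 1024  # bits per HBM stack interface

def stack_bandwidth_gbps(pin_speed_gbps: float) -> float:
    """Per-stack bandwidth in GB/s: pin speed (Gbps) x bus width / 8 bits per byte."""
    return pin_speed_gbps * BUS_WIDTH_BITS / 8

for name, pin_speed in [("HBM3 (Icebolt)", 6.4), ("HBM3P (Snowbolt)", 7.2)]:
    print(f"{name}: {stack_bandwidth_gbps(pin_speed):.1f} GB/s per stack")
```

At 7.2 Gbps per pin this works out to roughly 921.6 GB/s per stack, which is why multi-TB/s figures generally refer to several stacks placed around a GPU or accelerator.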
More large tech giants are utilizing high-bandwidth memory modules to significantly speed up data processing for machine-learning AI. The marketplace for HBM modules has grown exponentially over the last few years, with HBM gradually gaining ground on conventional DRAM memory modules. Three companies dominate the high-bandwidth memory market: SK Hynix, Samsung Electronics, and Micron. SK Hynix holds roughly 50% of the market, Samsung Electronics follows at 40%, and Micron takes third place with the remaining 10%.