Samsung's New HBM-PIM Memory Doubles Performance, Packs AI Brainpower For Data Centers
How do you make high bandwidth memory (HBM) even better? Lower the price, for one. But aside from that, infusing it with artificial intelligence (AI) processing power is a surefire way to make HBM even more attractive for certain market segments, and that is precisely what Samsung has done. It has developed the world's first HBM with AI processing power, and is calling it HBM-PIM (processing-in-memory).
"Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training and inference. We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications," said Kwangil Park, senior vice president of memory product planning at Samsung.
Samsung's nifty solution integrates a DRAM-optimized AI engine inside each memory bank to enable parallel processing while minimizing data movement. This is in contrast to how most of today's systems shuttle data between separate processor and memory units. The constant movement of data back and forth can be a bottleneck when dealing with large volumes of data, and that is what HBM-PIM intends to address.
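To see why moving the compute into the memory banks helps, consider a toy model. This is purely an illustrative sketch, not Samsung's actual HBM-PIM design; the bank counts, word sizes, and functions here are hypothetical, and it only models how much data crosses the memory bus in each approach.

```python
# Toy model of the data-movement bottleneck that processing-in-memory (PIM)
# targets. Hypothetical illustration only -- not Samsung's actual design.

def conventional_sum(banks):
    """Conventional model: every word crosses the bus to the processor."""
    bytes_moved = sum(len(b) for b in banks) * 4  # assume 4-byte words
    total = sum(sum(b) for b in banks)            # all math done processor-side
    return total, bytes_moved

def pim_sum(banks):
    """PIM model: each bank reduces its data locally; only results move."""
    partials = [sum(b) for b in banks]  # computed inside each memory bank
    bytes_moved = len(banks) * 4        # one 4-byte partial result per bank
    return sum(partials), bytes_moved

banks = [list(range(1000)) for _ in range(16)]  # 16 banks of toy data
t1, m1 = conventional_sum(banks)
t2, m2 = pim_sum(banks)
assert t1 == t2  # same answer either way
print(f"conventional: {m1} bytes moved; PIM: {m2} bytes moved")
```

Both paths compute the same result, but in the PIM model only the per-bank partial results travel across the bus, which is the intuition behind the bandwidth and energy savings Samsung claims.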
"When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to deliver over twice the system performance while reducing energy consumption by more than 70%. The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems," Samsung explains.
Samsung introduced its HBM2 Aquabolt memory chip back in 2018. According to Samsung, four Aquabolt packages in a system can deliver 1.2 TFLOPs of performance. That kind of speed and integration paves the way for HBM-PIM to take on tasks that are typically doled out to other pieces of hardware, including specialized ASIC and GPU components.
The first application for HBM-PIM will be the data center, and it remains to be seen whether it trickles down into the consumer space. Gaming GPUs could potentially benefit, though by and large, GDDR has been the go-to memory in the consumer space (with a few exceptions).
Samsung says its HBM-PIM solution is currently being tested inside AI accelerators at leading AI solution partners. It expects to finish validation by mid-year.