Samsung's New HBM2 Memory Thinks for Itself: 1.2 TFLOPS of Embedded Processing Power
Today, Samsung announced that its new HBM2-based memory has an integrated AI processor that can deliver up to 1.2 TFLOPS of embedded computing power, allowing the memory chip itself to perform operations that are usually reserved for CPUs, GPUs, ASICs, or FPGAs.
The new HBM-PIM (processing-in-memory) chips embed an AI engine inside each memory bank, thus offloading processing operations to the HBM itself. The new class of memory is designed to alleviate the burden of moving data between memory and processors, which often costs more in power consumption and time than the actual compute operations.
[...] As with most in-memory processing techniques, we expect this tech will press the boundaries of the memory chips' cooling limitations, especially given that HBM chips are typically deployed in stacks that aren't exactly conducive to easy cooling. Samsung's presentation did not cover how HBM-PIM addresses those challenges.
HBM: High Bandwidth Memory.
ASIC: Application-Specific Integrated Circuit.
FPGA: Field-Programmable Gate Array.
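The data-movement argument above can be illustrated with a toy sketch. This is purely illustrative, not Samsung's architecture: bank counts, word sizes, and the reduction workload are all made-up assumptions, and it only counts bytes crossing a hypothetical memory bus rather than modeling real energy or latency.

```python
# Toy model (illustrative only): compare bytes crossing the memory bus
# when a reduction runs on the host vs. inside each memory bank.
# BANK_COUNT / WORDS_PER_BANK are arbitrary, not real HBM-PIM parameters.

BANK_COUNT = 16
WORDS_PER_BANK = 1024
BYTES_PER_WORD = 4

banks = [[i % 7 for i in range(WORDS_PER_BANK)] for _ in range(BANK_COUNT)]

def host_side_sum(banks):
    """Conventional path: every word travels over the bus to the CPU."""
    bytes_moved = sum(len(b) for b in banks) * BYTES_PER_WORD
    total = sum(sum(b) for b in banks)
    return total, bytes_moved

def pim_sum(banks):
    """PIM-style path: each bank reduces locally; only one partial
    result per bank crosses the bus to the host."""
    partials = [sum(b) for b in banks]   # computed "inside" each bank
    bytes_moved = len(partials) * BYTES_PER_WORD
    total = sum(partials)                # host combines tiny partials
    return total, bytes_moved

host_total, host_bytes = host_side_sum(banks)
pim_total, pim_bytes = pim_sum(banks)
assert host_total == pim_total
print(host_bytes, pim_bytes)  # 65536 vs 64 bytes in this toy setup
```

Both paths produce the same answer, but the in-bank version moves three orders of magnitude less data over the bus in this toy setup, which is the trade PIM designs aim to exploit.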
(Score: 5, Funny) by crm114 on Thursday February 18 2021, @01:13AM (1 child)
user: give me access to address 0x1225817701, for 128 bits
HAL^HMemory: I'm sorry, Dave. I'm afraid I can't do that.
(Score: 1, Funny) by Anonymous Coward on Thursday February 18 2021, @01:43AM
You see this? It's my Kill-O-Zap assault rifle. You see that? That's your memory bank.
What was it you said again?