Jawed
Legend
Wow, literally in the memory:
Samsung's New HBM2 Memory Has 1.2 TFLOPS of Embedded Processing Power | Tom's Hardware
Today, Samsung announced that its new HBM2-based memory has an integrated AI processor that can push out (up to) 1.2 TFLOPS of embedded computing power, allowing the memory chip itself to perform operations that are usually reserved for CPUs, GPUs, ASICs, or FPGAs.

Now it can be argued that this much compute is barely worth bothering with (it's only FP16 FLOPS being counted).
The cost/packaging/thermal constraints of HBM and the 20nm node being used here seem to indicate this is a prototyping sample for partners and I suppose there'll be a couple of years of testing...