When will we see it? (And what will it mean?)
When asked, AMD stated that Kaveri (due in the second half of 2013) will be the first chip to use these second-generation HSA features. The G-series embedded parts announced last week, based on Kabini, will not. I'll go out on a limb here: I'd be surprised if this technology doesn't show up in the Xbox Durango and PS4, even if the graphics cores in those products are otherwise based on GCN.
Why? Because it makes perfect sense for Microsoft and Sony to adopt this technology. The ability to share data and maintain coherency between CPU and GPU is a major benefit for console workloads. A recent interview with Mark Cerny at Gamasutra seems to confirm that the PS4, at least, will employ AMD's hUMA tech.
AMD even spoke, at one point, about the idea of using an embedded eDRAM chip as a cache for GPU memory, which maps closely to the Xbox Durango's expected memory structure. The following quote comes from AMD's HSA briefing/seminar:
“Game developers and other 3D rendering programmers have wanted to use extremely large textures for a number of years, and they’ve had to go through a lot of tricks to pack pieces of textures into smaller textures, or split the textures into smaller textures, because of problems with the legacy memory model… Today, a whole texture has to be locked down in physical memory before the GPU is allowed to touch any part of it. If the GPU is only going to touch a small part of it, you’d like to only bring those pages into physical memory and therefore be able to accommodate other large textures.
With a hUMA approach to 3D rendering, applications will be able to code much more naturally with large textures and yet not run out of physical memory, because only the real working set will be brought into physical memory.”
This is broadly analogous to hardware support for the MegaTexturing technology that John Carmack debuted in Rage.
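What AMD is describing is essentially demand paging, which CPUs have relied on for decades, extended to GPU accesses. The CPU-side behavior is easy to demonstrate today. The sketch below is my own illustration, not code from AMD's briefing: it assumes Linux, and the 1GB buffer and 16-page working set are arbitrary stand-ins for a huge texture. The mapping reserves far more virtual address space than it ever backs with physical pages; only the pages actually touched get faulted in, which is exactly the economy hUMA would give the GPU.

```python
import mmap
import resource

# 1 GiB of virtual address space standing in for a "huge texture".
TEXTURE_BYTES = 1 << 30

# An anonymous mapping reserves address space, but physical pages are
# only allocated on first touch (demand paging) -- the behavior AMD
# describes extending to GPU texture accesses under hUMA.
buf = mmap.mmap(-1, TEXTURE_BYTES)

# Touch only a small "working set": one byte in each of 16 regions.
stride = TEXTURE_BYTES // 16
for i in range(16):
    buf[i * stride] = 0xFF  # each write faults in a single page

# Peak resident memory stays far below the 1 GiB we mapped.
# (On Linux, ru_maxrss is reported in KiB.)
resident = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss * 1024
print(resident < TEXTURE_BYTES // 4)  # expected True on Linux

buf.close()
```

The point of the sketch is the gap between virtual and resident memory: the process "uses" a gigabyte of texture-sized address space while consuming only a handful of physical pages. Pre-hUMA GPUs can't do this because, as the quote notes, the whole texture must be pinned in physical memory before the GPU touches any of it.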