The rumored high amount of eDRAM IS in the GPU die... the CPU has a lot less, as a cache.
So now it's thought that both the GPU and CPU have a separate pool of eDRAM?
mmhm... well, FWIW, the new Tekken Tag 2 trailer still exhibits that dynamic scaling implementation.
Sounds more like this chip has squat to do with POWER7. It's a tri-core PPC with 2MB of eDRAM cache.
Do we know if the GPU can access the eDRAM in any way? If the rumored amount is true, wouldn't that be a lot for a CPU?
Considering the GC's CPU (the 3MB) and the Wii's CPU (both the 24MB and the 3MB) could, I don't see that changing.
16-bit HDR = 2 bytes x 4 channels = 8 bytes per pixel; x ~1 million pixels ≈ 8 MB for a non-AA framebuffer. x4 = 32 MB for 4 samples per pixel.
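If anyone wants to sanity-check that arithmetic, here's a minimal sketch. It assumes a 1280x720 FP16 RGBA target and treats MSAA as a straight multiply by sample count (no compression); the post above rounds 1280x720 up to ~1 million pixels, hence the slightly larger figures.

```python
# Back-of-the-envelope framebuffer sizing, matching the numbers above.
# Assumes a 1280x720 FP16 RGBA colour buffer; MSAA cost is approximated
# as a straight multiply by sample count.

BYTES_PER_CHANNEL = 2      # 16-bit (FP16) HDR
CHANNELS = 4               # RGBA
WIDTH, HEIGHT = 1280, 720  # ~0.92 million pixels

def framebuffer_mb(samples_per_pixel=1):
    bytes_per_pixel = BYTES_PER_CHANNEL * CHANNELS          # 8 bytes
    total_bytes = WIDTH * HEIGHT * bytes_per_pixel * samples_per_pixel
    return total_bytes / (1024 * 1024)

print(f"no AA  : {framebuffer_mb(1):.1f} MB")   # ~7.0 MB (post rounds to ~8 MB)
print(f"4x MSAA: {framebuffer_mb(4):.1f} MB")   # ~28.1 MB (post rounds to ~32 MB)
```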
You also don't just want the backbuffer in there. A full 32 MB eDRAM scratchpad would be superb for many graphics tasks. You could put your particle textures in there and read/write to your heart's content. You can render to multiple render targets and have direct access to those buffers with no need to export/import from system RAM. I would anticipate Wii U being strong in graphical special FX, if not raw polygon and pixel power. Making the most of that bandwidth would also require a specialist engine, meaning ports not doing as well on the hardware at first.
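Purely as an illustration of how quickly multiple render targets eat into a 32 MB pool, here's a hypothetical 720p render-target budget. The target formats are made up for the example and are not anything confirmed about Wii U.

```python
# Hypothetical 720p render-target budget inside a 32 MB eDRAM scratchpad.
# The formats below are illustrative only, not confirmed Wii U behaviour.

WIDTH, HEIGHT = 1280, 720
EDRAM_MB = 32

# bytes per pixel for each hypothetical target
targets = {
    "colour (FP16 RGBA)":    8,
    "normals (RGBA8)":       4,
    "depth/stencil (D24S8)": 4,
    "motion/misc (RG16)":    4,
}

used_mb = 0.0
for name, bpp in targets.items():
    mb = WIDTH * HEIGHT * bpp / (1024 * 1024)
    used_mb += mb
    print(f"{name:24s} {mb:5.1f} MB")

# ~17.6 MB used, leaving roughly 14 MB for particle textures, scratch
# surfaces and whatever else you want resident in the eDRAM.
print(f"{'total':24s} {used_mb:5.1f} MB of {EDRAM_MB} MB")
```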
Is it confirmed that the 16-bit HDR and the other buffers are in eDRAM, or is that just supposition?
Supposition, but what else are you going to put in there?
3MB total: 2MB for the main core, 512KB each for the other two.
Is it confirmed that the 16-bit HDR and the other buffers are in eDRAM, or is that just supposition?
It's perhaps worth recapping why this is the current belief. To state things again: we now believe that the CPU has a small amount of eDRAM and the GPU has a large amount of eDRAM, used for different purposes. The possible explanations were:
1) The rumours are wrong, and there's a load of eDRAM on the CPU
2) The PR is wrong and there's no eDRAM at all on the CPU
3) The eDRAM is being used as cache
Point 1 makes little sense. Lots of eDRAM on a low power CPU is pointless, and the die is tiny.
Point 2 requires the PR to be all-out wrong. Where it was just tweets I could believe that, but I don't believe a full press release would be that inaccurate, so I'm happy to accept there is eDRAM on the CPU.
Point 3 seems a bit odd, as replacing the tried-and-tested SRAM with slower eDRAM is unusual, but it has precedent with IBM (POWER7 uses eDRAM for its L3 cache). There are cost and power benefits there, and that seems to be Nintendo's focus with Wii U, so this explanation fits all the info so far.
Or rather, the core would be larger. The Wii U CPU is tiny, probably 40mm² or less. Assuming 10mm² per MB of SRAM on IBM's 45nm process, there would be almost no room left for the 3 CPU cores if it were to use 3MB of SRAM.
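Plugging that post's figures into a quick calculation makes the point concrete. The 40mm² die and 10mm² per MB of SRAM are the post's own assumptions; the ~3x eDRAM density advantage is a ballpark guess, not a confirmed number.

```python
# Rough die-area check using the figures from the post above.
DIE_MM2 = 40.0               # assumed total CPU die size
SRAM_MM2_PER_MB = 10.0       # assumed SRAM density on IBM's 45nm process
EDRAM_DENSITY_FACTOR = 3.0   # assumed ~3x denser than SRAM, ballpark only
CACHE_MB = 3.0               # 2MB + 2 x 512KB

sram_area  = CACHE_MB * SRAM_MM2_PER_MB
edram_area = sram_area / EDRAM_DENSITY_FACTOR

print(f"3MB as SRAM : {sram_area:4.1f} mm^2, leaving {DIE_MM2 - sram_area:4.1f} mm^2 for the cores")
print(f"3MB as eDRAM: {edram_area:4.1f} mm^2, leaving {DIE_MM2 - edram_area:4.1f} mm^2 for the cores")
```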
DRAM and logic processes are optimized differently: for DRAM you want very low static power, for logic you want very high dynamic performance. A DRAM process typically has less than 1% of the leakage power of a typical logic process.
Integrating the DRAM, you end up compromising both DRAM and logic performance. You also get a more costly process: steps are needed to create the trench capacitors for the DRAM cells, and all the metal-layer steps needed for the logic area are wasted on the DRAM cells.
The compromised performance has extra consequences in a console, where you cannot bin and sell slower units at a discount. In order to maximize yield you'll need to provision for the higher power consumption of your lower quality bins. This impacts the cost of the entire system (cooling, reliability, PSU). I'm guessing that's why MS hasn't opted for integrating the eDRAM of Xenos; they don't need the performance, so it is cheaper overall to have a separate die and spend a few dozen cents on adding a substrate to connect the CPU/GPU to the eDRAM die.
Cheers
And still, Nintendo apparently thought it worthwhile to include eDRAM on the CPU as well as the GPU. On the CPU it allows a smaller die, lower power draw and lower cost. If the core clocks aren't very high, the timing of IBM's 45nm eDRAM is just fine for L2 cache, no excuses needed.
it must be capable of achieving roughly E6760 levels of performance in order to justify its existence.
Sure, there have been a couple of theories about that particular level of specs for a long time now.
I'm far more interested in the data path between the CPU and the GPU and the justification of eDRAM on the GPU.
Justification, same as always... simpler/cheaper external memory subsystem; more bandwidth, lower latency, higher efficiency. The CPU-GPU interconnect is probably the simplest possible, but details will probably be very scarce. I wonder if Nintendo would even release details to their devs... after all, they pretty much refused to document microcode for the Reality Coprocessor in the N64. If the CPU really is based on good ol' Gekko from the GameCube, it very well might be the same CPU bus as used there, just modified (if necessary) to handle multiple cores.
Going the eDRAM route isn't a given by any means, but apparently Nintendo is confident in the benefits or they would have saved themselves the cost and risk.
Not sure what "risks" you're talking about; eDRAM has been used in consumer GPUs for a decade and a half soon. There's just more in the Wuu GPU than has been the case previously, but such is life in the semiconductor industry; it's always more, faster, better (well, not always, with Nintendo...).
Short answer: no.
Slightly longer answer: absolutely not.
Satisfied?