Emulating the 360's eDRAM

Hazuki Ryu

The Xbox 360's eDRAM has huge bandwidth even by today's high-end GPU standards, so I'd like to get an idea of what it would take to emulate something like that. Do you need GPU memory that exceeds 256 GB/s, or are there ways around it? I'm not a programmer, so I'd appreciate it if the response were somewhat simplified.
 
New GPUs have much more advanced bandwidth saving techniques than 2005 GPUs. We now have (lossless) depth compression and MSAA color compression. Early depth culling (hi-z) has also improved. Games also tend to use longer shaders (both more ALU and TEX) while the fill rate requirements haven't increased that much. Basically we can now survive with (relatively) lower backbuffer bandwidth.
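To give a feel for how the compression part helps, here's a toy sketch of the idea (my own illustration, not the actual hardware scheme): if all the samples in a pixel came out identical because no triangle edge crossed it, the hardware can store one color plus a control flag instead of four copies, so interior pixels cost roughly a quarter of the color traffic.

// Toy sketch of the *idea* behind MSAA color compression -- not the real
// hardware scheme. If all 4 samples of a pixel ended up the same color
// (no triangle edge crossed the pixel), store the color once plus a flag
// instead of writing 4 identical values.
#include <array>
#include <cstdint>
#include <cstdio>

struct CompressedPixel {
    bool     allSamplesEqual;          // control bit kept on-chip
    uint32_t color;                    // representative color if equal
    std::array<uint32_t, 4> samples;   // only needed when samples differ
};

CompressedPixel compress(const std::array<uint32_t, 4>& samples) {
    CompressedPixel p{};
    p.allSamplesEqual = (samples[0] == samples[1] &&
                         samples[1] == samples[2] &&
                         samples[2] == samples[3]);
    if (p.allSamplesEqual) {
        p.color = samples[0];          // 4 bytes of traffic instead of 16
    } else {
        p.samples = samples;           // edge pixel: pay the full cost
    }
    return p;
}

int main() {
    // Interior pixel: all samples identical -> much less color bandwidth.
    auto interior = compress({0xFF336699, 0xFF336699, 0xFF336699, 0xFF336699});
    std::printf("interior compressed: %d\n", interior.allSamplesEqual);
    // Edge pixel: samples differ -> stored uncompressed.
    auto edge = compress({0xFF336699, 0xFF336699, 0xFFFFFFFF, 0xFFFFFFFF});
    std::printf("edge compressed: %d\n", edge.allSamplesEqual);
}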
 
From what I understand, the eDRAM bandwidth figure pretty much comes from 4xMSAA alone, and it's nowhere near as high in situations where you aren't using MSAA, like in most deferred engines. The bandwidth between the GPU and the eDRAM is "only" 32 GB/s, which is several times less than what modern GPUs have.
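For a rough picture of where the two numbers come from, here's a back-of-the-envelope calculation (my sketch, using the commonly quoted Xenos figures: 8 ROPs, a 500 MHz daughter die, 4xMSAA, 32-bit color plus 32-bit depth, each handled as a read-modify-write):

// Back-of-the-envelope sketch of where the usual 256 GB/s figure comes from
// (assuming the commonly quoted Xenos numbers: 8 ROPs at 500 MHz, 4xMSAA,
// 32-bit color + 32-bit depth, blended/Z-tested as a read-modify-write).
#include <cstdio>

int main() {
    const double rops           = 8;        // pixels per clock
    const double clock_hz       = 500e6;    // daughter die clock
    const double samples        = 4;        // 4xMSAA
    const double bytes_per_samp = 4 + 4;    // 32-bit color + 32-bit depth
    const double rd_wr          = 2;        // read-modify-write (blend + Z test)

    double edram_bw = rops * clock_hz * samples * bytes_per_samp * rd_wr;
    std::printf("internal eDRAM bandwidth: %.0f GB/s\n", edram_bw / 1e9);  // ~256

    // The link from the GPU die to the daughter die is much narrower: it only
    // carries the pixels the shader core produces, not the expanded MSAA
    // read-modify-write traffic, hence the quoted ~32 GB/s figure.
    std::printf("GPU -> daughter die link: ~32 GB/s (quoted figure)\n");
}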
 
I don't think that 32 GB/s is even touched when doing alpha blending, which is all done on the daughter die, IIRC.
 
It is also good to remember that the eDRAM was used only for framebuffer operations and is pretty much write-only.
This should make the actual 'emulation' very easy compared to the GS. ;)
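As a very rough sketch of why that helps an emulator (hypothetical names, not how any real emulator is actually written): you can back each eDRAM render target with an ordinary host render target and only do real work at resolve time, when the game copies the finished buffer out to main memory, so the host never has to mirror the 256 GB/s internal traffic at all.

// Very simplified sketch (hypothetical names, not any real emulator's API) of
// treating the 10 MB eDRAM as a set of tile-addressed render targets backed by
// ordinary host memory, with actual copying only happening at resolve time.
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

constexpr std::size_t kEdramSize = 10 * 1024 * 1024;  // 10 MB of eDRAM

struct HostRenderTarget {
    uint32_t width, height;
    std::vector<uint32_t> pixels;  // stands in for a real GPU texture
};

class EdramEmulator {
public:
    // Game binds a render target at some eDRAM offset: back it with an
    // ordinary host render target instead of modelling the daughter die.
    void BindRenderTarget(uint32_t edram_offset, uint32_t w, uint32_t h) {
        targets_[edram_offset] = HostRenderTarget{
            w, h, std::vector<uint32_t>(std::size_t(w) * h, 0)};
    }

    // All drawing goes to the host target; the eDRAM itself is never touched.
    HostRenderTarget* Target(uint32_t edram_offset) {
        auto it = targets_.find(edram_offset);
        return it == targets_.end() ? nullptr : &it->second;
    }

    // "Resolve": the one moment data actually leaves the framebuffer.
    // Copy the finished image into emulated main memory for the game to use.
    void Resolve(uint32_t edram_offset, std::vector<uint32_t>& main_memory) {
        if (HostRenderTarget* rt = Target(edram_offset)) {
            main_memory.assign(rt->pixels.begin(), rt->pixels.end());
        }
    }

private:
    std::unordered_map<uint32_t, HostRenderTarget> targets_;
};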
 
That clears things up for me a bit. Now for a little comparison: would it be harder to emulate the 360's hardware architecture than it is to emulate a Wii or a PS2? Disregarding the drastic gap in performance, of course.
 
Emulating the PS2 is a nightmare. Emulating the Wii is relatively trivial, considering we already have decent Wii emulators that work better than the PS2 ones. I would say emulating the XB360 is nothing complicated, but it will need pretty fancy hardware to pull off, considering the overhead of converting code from to-the-metal to something more generic. I would dare say that the eDRAM is the last thing to worry about when emulating the XB.
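To illustrate what that overhead means in the simplest possible terms (a toy interpreter for a made-up two-instruction guest ISA, nothing to do with the real PowerPC core): every guest instruction the console would have executed directly becomes a fetch, a decode, a branch and then the actual work on the host, which is where much of the "fancy hardware" requirement comes from, even before a smarter JIT recompiler narrows the gap.

// Toy illustration of emulation overhead: a tiny interpreter for a made-up
// two-instruction guest ISA. Each guest instruction that real hardware would
// execute directly costs the host a fetch, a decode, a dispatch and the work
// itself. Real emulators recompile (JIT) hot code to shrink this overhead.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

enum class Op : uint8_t { AddImm, Halt };

struct Instr {
    Op      op;
    uint8_t reg;
    int32_t imm;
};

int main() {
    int32_t regs[4] = {0, 0, 0, 0};   // toy guest register file

    // Guest "program": r0 += 5; r1 += 7; stop.
    std::vector<Instr> program = {
        {Op::AddImm, 0, 5},
        {Op::AddImm, 1, 7},
        {Op::Halt,   0, 0},
    };

    // Interpreter loop: fetch, decode, dispatch -- all of this is pure
    // overhead compared to the console running the same code natively.
    for (std::size_t pc = 0; pc < program.size(); ++pc) {
        const Instr& i = program[pc];
        switch (i.op) {
            case Op::AddImm: regs[i.reg] += i.imm; break;
            case Op::Halt:   pc = program.size();  break;
        }
    }

    std::printf("r0=%d r1=%d\n", regs[0], regs[1]);  // prints r0=5 r1=7
}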
 
It does have it, but from what I've gathered from developer interviews, most developers aren't using it so that they can skip some of the OS/driver overhead.
 