How to calculate ROP BW use?

Shifty Geezer

Tying in to the Wii U BW discussion, this is a question I'd like clarification on. How does one calculate the peak BW consumption of the consoles' ROPS? Taking RSX as a known quantity, 8 ROPS at 550 MHz = 4.4 gigapixels per second?

My understanding says 4.4 billion 32-bit (4-byte) writes per second, maxing out at ~17.6 GB/s (~16 GiB/s), but I'm sure I'm missing something. Does MSAA multiply this BW accordingly?
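The arithmetic in the question can be sketched as follows. This is just a back-of-the-envelope helper (the function name is made up, not from any SDK): peak fill is ROP count times clock, and write-only bandwidth is fill times bytes per pixel.

```python
def rop_write_bandwidth(num_rops, clock_hz, bytes_per_pixel=4):
    """Peak colour-write bandwidth in bytes/s for plain 32-bit writes."""
    fill_rate = num_rops * clock_hz      # pixels per second
    return fill_rate * bytes_per_pixel   # bytes per second

# RSX as the known quantity: 8 ROPs at 550 MHz
fill = 8 * 550e6                                  # 4.4 Gpixels/s
bw = rop_write_bandwidth(8, 550e6)
print(fill / 1e9, "Gpix/s")                       # 4.4
print(bw / 1e9, "GB/s", "=", bw / 2**30 / 1e0)    # 17.6 GB/s, ~16.4 GiB/s
```

So the "~16" figure comes out when the 17.6 GB/s is expressed in binary gibibytes.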
 
For 32-bit colour + 32-bit Z/stencil, the peak bandwidth requirement (with alpha blending and Z test enabled) is:
32-bit colour read + 32-bit colour write + 32-bit Z/S read + 32-bit Z/S write.
Remember the PS2's eDRAM bandwidth: 1024-bit read + 1024-bit write (+ 512-bit texture buffer). That's exactly the maximum bandwidth the GS requires (the GS has 16 ROPs).
For MSAA, different hardware has different peak bandwidth requirements: on hardware without Z/colour compression, like the X360, the requirement simply scales by the MSAA multiplier. But on desktop PC GPUs, as well as RSX, the scale factor is obviously smaller than the MSAA multiplier.
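The worst case described above can be put into numbers. A minimal sketch, assuming 4 bytes per colour access and 4 bytes per Z/S access, and modelling the no-compression case where bandwidth scales linearly with the MSAA factor (the function name is hypothetical):

```python
def peak_rop_bandwidth(num_rops, clock_hz, msaa=1):
    """Worst-case ROP bandwidth with alpha blending + Z test:
    colour read + colour write + Z/S read + Z/S write, 4 bytes each."""
    bytes_per_pixel = 4 * 4                  # 4 accesses x 4 bytes
    # Linear scaling models hardware WITHOUT colour/Z compression;
    # compressing GPUs (desktop parts, RSX) need less than this.
    return num_rops * clock_hz * bytes_per_pixel * msaa

# RSX-style numbers: 8 ROPs at 550 MHz
print(peak_rop_bandwidth(8, 550e6) / 1e9, "GB/s")          # 70.4
print(peak_rop_bandwidth(8, 550e6, msaa=4) / 1e9, "GB/s")  # 281.6 (naive 4x)
```

The 4x MSAA figure shows why uncompressed framebuffer traffic is impractical over an external bus, and why compression (or PS2/X360-style eDRAM) matters.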
 
And one should not forget that the ROPs of somewhat recent GPUs have dedicated colour/Z caches (the de-/compression is probably done when transferring framebuffer tiles between memory and the ROP caches). If everything works out well (it doesn't during fillrate tests), they may reduce the needed memory bandwidth. But information about the size of these caches is scarce to nonexistent, so it's difficult to estimate whether the reduction is relevant. It's probably very dependent on the exact situation.
 