RSX: memory bottlenecks?

thekey

Newcomer
The PlayStation 3 features, as I assume most of you know, an Nvidia graphics processor by the name of RSX (a custom version of the 7800 GTX). It's said to have 302 million transistors, 24 shading pipes and 6/4 vertex pipes running at 550 MHz.
Now, none of that would be surprising if the card's memory weren't running at a mere 22 GB/s - which, on a 128-bit bus, gives us 1400 MHz DDR3, no more and no less than the memory speed of the 6800 or 6800 LE, both cards that are roughly three times slower than RSX.
How is it supposed to give us, at 1080p (not to mention dual HDMI), HDR (a terrible frame killer), 4xFSAA and 16xAF? I myself own a 6600 GT with memory at 1 GHz and simply can't play with HDR at any resolution (yes, not even 640x480) - I hate the fact that cards "support" features they can't use in real games.
Is RSX going to use a compression technique or some other method to make do with 22 GB/s? If so, would this be available for the PC? And if there are techniques that could save bandwidth and they could be used on the PC, why does Nvidia keep releasing different cards whose memory clocks only differ by about 400 MHz?
For instance: believe it or not, the 6800 LE is supposedly better than the 6600 GT for the mere fact that 1400 > 1000 (memory speed) - yet the 6600 GT is, ignoring memory bandwidth, even more powerful than the plain 6800 (500x8 > 325x12). So although the 6800 LE has 8 shader pipes at 320 MHz, it's considered more powerful than the 6600 GT, which also has 8 pipes but at 500 MHz, all because of a 400 MHz increase in memory speed. If memory bandwidth is so important, then why does the 7800 GTX - a 550 MHz card on the market right now - come with 51 GB/s of bandwidth, while RSX, which runs at the same speed, only gets 22 GB/s? None of this makes any sense.
Any answers appreciated.
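The arithmetic in the post above can be sanity-checked directly - peak DRAM bandwidth is just bus width times effective data rate. A quick sketch (the clocks are the ones quoted in the post, not confirmed specs):

```python
def peak_bandwidth_gbps(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s (1 GB = 10^9 bytes)."""
    return (bus_width_bits / 8) * (effective_mhz * 1e6) / 1e9

# RSX's quoted config: 128-bit bus, 1400 MHz effective GDDR3
print(peak_bandwidth_gbps(128, 1400))   # -> 22.4 GB/s
# The same memory clock on a 256-bit bus would simply double that:
print(peak_bandwidth_gbps(256, 1400))   # -> 44.8 GB/s
```

So the 22 GB/s figure really does follow straight from a 128-bit bus at 1400 MHz effective; the 7800 GTX gets its ~51 GB/s from a 256-bit bus at a higher memory clock.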
 
Well, first of all, it is highly unlikely that the RSX will support multisampling AA in conjunction with HDR (though I suppose there is still a glimmer of hope...). I also doubt that 1080p will be the target resolution for most games, but rather 720p. This card should have more than enough memory bandwidth, even at 128-bit, to drive 720p at 4x FSAA and 16x anisotropic. Or, if the game developers so desire, to drive 720p with HDR and 16x anisotropic.

Additionally, since I think we're all expecting pretty much all graphics-intensive next-gen games to make use of a high math to texture operation ratio, the RSX should still be able to make use of its rather significant shader processing power even without the memory bandwidth of its desktop counterpart.
 
thekey said:
The PlayStation 3 features, as I assume most of you know, an Nvidia graphics processor by the name of RSX (a custom version of the 7800 GTX). It's said to have 302 million transistors, 24 shading pipes and 6/4 vertex pipes running at 550 MHz.
302M? I thought they said 300M+, no?
 
zidane1strife said:
302M? I thought they said 300M+, no?
Yeah, and they haven't said how many pipes. It has two memory buses, one for GDDR3 and one for XDR RAM, for a total of around 48 GB/s. Don't know how that compares to a 256-bit bus though.
 
weaksauce said:
Yeah, and they haven't said how many pipes. It has two memory buses, one for GDDR3 and one for XDR RAM, for a total of around 48 GB/s. Don't know how that compares to a 256-bit bus though.

One for GDDR3 and one for Cell you mean? ;)
RSX has to go through Cell to get to XDR. Still, the 35GB/s link to Cell will be an interesting aspect of PS3.
 
It's only my opinion, but I'm sure the RSX won't be just a customized G70... sure, it shares similarities, but it will be a totally new GPU, much MUCH more powerful than a G70!
Sony/Nvidia would never release just a customized version of a GPU that will be a year old when PS3 ships - they never would!

We should wait for final specs... and speculate about the RSX architecture! :D
 
Bliss said:
It's only my opinion, but I'm sure the RSX won't be just a customized G70... sure, it shares similarities, but it will be a totally new GPU, much MUCH more powerful than a G70!
Sony/Nvidia would never release just a customized version of a GPU that will be a year old when PS3 ships - they never would!

Not saying they would, but... why not? If it gets the job done well enough, it would surely be more cost-efficient for them to use an older (but still VERY powerful) GPU - which we all know will produce amazing graphics no matter what - and cut their already large per-unit losses.
I don't see why people get so hung up on PS3 needing the best of the best of the best hardware, when Sony has shown for the last 10 years that you can win and destroy the competition with lower specs than your competitors!
Whatever happens, a PS3 with even a "normal" G70 would make everyone (except internet forum geeks) happy. There is no real need for PS3 to have a GPU that has just been released on the PC market, because it would just be a money sucker.
 
Chalnoth said:
Well, first of all, it is highly unlikely that the RSX will support multisampling AA in conjunction with HDR.
Oooo, that statement's just asking for trouble! HS features 'a nice amount' of MSAA with HDR. You don't need FP16 for HDR. People have to stop thinking of FP as HDR and vice versa.
 
Shifty Geezer said:
Oooo, that statement's just asking for trouble! HS features 'a nice amount' of MSAA with HDR. You don't need FP16 for HDR. People have to stop thinking of FP as HDR and vice versa.
Next question would then be, why bother wasting transistors for FP framebuffers (for HDR and more), when one can get "similar" results using other methods... But that has been answered already.
 
I thought I'd better clarify for those who were uncertain what you were referring to.
 
In the end, nAo's way of doing HDR+MSAA will probably be the most popular way of doing that (or slight variations of it, of course). And in the end, G70 or RSX or G80 or Live8 or whatever the GPU ultimately is, will produce graphics that will keep everyone happy. Everyone except internet forum freaks who feel the need to put down companies they don't like for any reason.
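For anyone wondering how HDR and MSAA can coexist: the trick nAo described (NAO32) stores HDR colour in an ordinary 8-bit-per-channel RGBA target using a LogLuv-style encoding - log luminance across two channels, chromaticity in the other two - and since the target is a plain RGBA8 buffer, multisampling keeps working. A rough Python sketch of the general idea; the matrices and scale constants here are illustrative, not the actual Heavenly Sword shader:

```python
import math

def encode_logluv(r, g, b):
    """Pack linear HDR RGB into 4 bytes, LogLuv-style."""
    # Linear RGB -> CIE XYZ (sRGB primaries)
    X = 0.4124*r + 0.3576*g + 0.1805*b
    Y = 0.2126*r + 0.7152*g + 0.0722*b
    Z = 0.0193*r + 0.1192*g + 0.9505*b
    d = X + 15*Y + 3*Z + 1e-10
    u, v = 4*X/d, 9*Y/d                  # CIE u'v' chromaticity
    # 16-bit log luminance, split across two 8-bit channels
    Le = int(max(0, min(65535, 256*(math.log2(Y + 1e-10) + 64))))
    return (Le >> 8, Le & 0xFF, int(u*410), int(v*410))

def decode_logluv(hi, lo, ue, ve):
    """Unpack the 4 bytes back to linear RGB."""
    Y = 2.0**(((hi << 8 | lo) / 256.0) - 64.0)
    u, v = ue/410.0, ve/410.0
    X = Y * 9*u / (4*v)
    Z = Y * (12 - 3*u - 20*v) / (4*v)
    # CIE XYZ -> linear RGB
    r =  3.2406*X - 1.5372*Y - 0.4986*Z
    g = -0.9689*X + 1.8758*Y + 0.0415*Z
    b =  0.0557*X - 0.2040*Y + 1.0570*Z
    return r, g, b
```

The point is that the hardware only ever sees an ordinary 32-bit integer render target, so the usual MSAA resolve path works; the tone-mapping pass decodes back to linear light afterwards.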
 
Bliss said:
it's only my opinion, but I'm sure the RSX won't be only a customized G70...sure it shares similarities, but it will be a totally new GPU, much MUCH more powerful than a G70 !
Sony/Nvidia would never release just a customized version of a 1 year old GPU(when PS3 will be released) they would never !

we should wait for final specs...and speculate about the RSX architecture !:D

I agree - when they announced RSX, you got the feeling they were basically announcing the G70 specs, only at a faster clock. Probably to save face on the PC side of the equation.
london-boy said:
Not saying they would, but... why not? If it gets the job done well enough, it would surely be more cost-efficient for them to use an older (but still VERY powerful) GPU - which we all know will produce amazing graphics no matter what - and cut their already large per-unit losses.
I don't see why people get so hung up on PS3 needing the best of the best of the best hardware, when Sony has shown for the last 10 years that you can win and destroy the competition with lower specs than your competitors!
Whatever happens, a PS3 with even a "normal" G70 would make everyone (except internet forum geeks) happy. There is no real need for PS3 to have a GPU that has just been released on the PC market, because it would just be a money sucker.
True, but they've always tried to deliver cutting-edge hardware that's as powerful as, or more powerful than, what's out there. I expect a bandwidth boost and a speed boost on RSX, plus a beefier solution overall. It's in their best interest to show as big a difference as possible from the Xbox 360 in the graphics department, and it shouldn't cost an arm and a leg to release a more cutting-edge GPU - it doesn't have to beat the top products at the end of this year to deliver improved performance.
 
Well, G70 taped out a full 8 months before the RSX (based on nVidia statements and financials). I do believe we should expect something similar to the G70, but with tweaks to allow higher yields. Which, for a home console, is still a very powerful GPU.

Memory bottlenecks? Like any system, try to do enough stuff and it'll choke. Whether you would call that a bottleneck or a limitation... I doubt it will be memory bandwidth that holds up the system. The bandwidth is not bad at all, comparatively. Of course, delivering 1080p/60x2 is a pipe dream (for anything other than the dash/OS features, anyway).
 
The framebuffer will probably be bound by what's available locally to it (the 22.4 GB/s to VRAM). You're not going to see 1080p with 4xMSAA - I certainly don't think anyone is expecting that!

In terms of bandwidth for "the rest", RSX should be relatively well fed. It can access XDR too. Even if the framebuffer consumed all of VRAM's bandwidth - which only a silly dev would let happen ;) - there's still 25.6GB/s to split between the GPU (for texture/vertex fetch etc.) and CPU, which is more than a certain other system has for extra-framebuffer activity. In the absence of further detail, that may be a bit of a simplification, but that's how things are on paper anyway.
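To put rough numbers on the framebuffer-versus-"the rest" split, here's a back-of-envelope traffic estimate. The overdraw factor and per-sample byte counts are assumptions for illustration, and real hardware does considerably better thanks to Z/colour compression - so treat these as worst-case uncompressed figures:

```python
def fb_traffic_gbps(width, height, fps, msaa=1, overdraw=2.5,
                    bytes_color=4, bytes_z=4):
    """Uncompressed framebuffer traffic estimate in GB/s:
    colour write + Z read + Z write per covered sample."""
    bytes_per_sample = bytes_color + 2 * bytes_z
    samples = width * height * msaa
    return samples * bytes_per_sample * overdraw * fps / 1e9

print(fb_traffic_gbps(1280, 720, 60))           # 720p, no AA: ~1.7 GB/s
print(fb_traffic_gbps(1280, 720, 60, msaa=4))   # 720p, 4xAA: ~6.6 GB/s
print(fb_traffic_gbps(1920, 1080, 60, msaa=4))  # 1080p, 4xAA: ~14.9 GB/s
```

Even with these pessimistic assumptions, 720p with 4xAA leaves most of the 22.4 GB/s of VRAM bandwidth free for texture and vertex fetch.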
 
Nicked said:
Of course, delivering 1080p/60x2 is a pipe dream (for anything other than the dash/OS features, anyway).
Not really. In a game with simple enough (relatively speaking) GFX, I'm sure it could be done and still get great results. Think F-Zero on the N64, for example - not that I suggest the game would have to be THAT spartan to run well.

It's overdraw and smoke/fire effects that are usually the big framerate killers. Avoid both - possibly by doing some transparent blends on Cell, for example - and opaque fillrate isn't an issue, even at 2x 1080p @ 60 Hz. Not for a chip with the suggested performance of RSX, anyway.
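A quick opaque-fillrate check backs this up. The ROP count below is an assumption (it hadn't been confirmed for RSX at this point), but even a conservative figure leaves plenty of headroom over what dual 1080p at 60 Hz actually requires:

```python
# Pixels that must be produced per second for two 1080p displays at 60 Hz
pixels_per_frame = 1920 * 1080
displays, fps = 2, 60
required = pixels_per_frame * displays * fps    # ~249 Mpix/s

# Assumed peak fillrate: 550 MHz core x 8 ROPs (illustrative figure)
peak_fill = 550e6 * 8                           # 4.4 Gpix/s

print(required / 1e6)            # ~249 Mpix/s needed
print(peak_fill / required)      # ~17x headroom for overdraw etc.
```

So as the post says, it's overdraw and blended effects that eat the budget, not the raw opaque pixel count.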
 
Guden Oden said:
Not really. In a game with simple enough (relatively speaking) GFX, I'm sure it could be done and still get great results. Think F-Zero on the N64, for example - not that I suggest the game would have to be THAT spartan to run well.

True. Looking at Anandtech's benchmarks for the 512 MB 7800 GTX, it can run Black & White 2 at 1600x1200 (nearly the same resolution as 1080p) AND with 4xSSAA at ~30 fps. Sure, RSX has less available bandwidth than that card, but it suggests that 1080p with little or no AA, at least, should be possible for some games. And B&W2 isn't the visually simplest of games by any stretch, either. There are actually a number of other examples in there too - most games they bench, including the likes of FEAR, run at 30 fps or more at 1600x1200 with 4xAA on that card, if that's any indication.
 
I think RSX, with its 128-bit interface, will be able to manage 1080p with 2xFSAA - that would be a great deal for all gamers.

(Off topic: if I'm not mistaken, Xenos needs roughly 32 MB of RAM for 1080i with 4xFSAA! RSX -> 1080p...)
 
First, I'm all about NAO32, but with that said, since I imagine the RSX will most likely derive from the G71 rather than the G70: G71 is rumoured to improve Nvidia's AA performance, and will supposedly allow for both HDR and MSAA at the same time. So NAO32 aside, I still think HDR and AA will be possible concurrently on RSX.

To the original poster, though: memory bandwidth is definitely something RSX could stand to have more of in its present configuration. Some have pondered whether we might see the final version use XDR rather than GDDR3, as that would have the potential of greatly increasing the bandwidth, pin for pin.
 