nvidia:rsx complete.

Alas, you can guess user names and passwords till you're blue in the face.

Even with a valid username and password, the site will only accept logins from a range of IPs that the developer must pre-register with Sony.
 
function said:
N.8732/f - is that the one where RSX has 8 extra ROPs, or another 2 vertex shaders?
More ROPs?
With a 128-bit memory bus, what would be the point?
Actually, if anything, they should disable 8 ROPs, since RSX only has half the memory bandwidth of the 7800GTX; it could also help with yields.
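For anyone who wants to sanity-check the "half the bandwidth" claim, here's a quick Python sketch. The RSX memory clock was not official at the time of this thread, so the 700 MHz (1400 MHz effective) figure below is an assumption:

```python
# Peak GDDR3 bandwidth: bytes per transfer times effective transfer rate.
# The RSX clock here (700 MHz, 1400 effective) is speculation, not official.
def gddr3_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Peak bandwidth in GB/s for a double-data-rate memory interface."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

gtx = gddr3_bandwidth_gbs(256, 1200)  # 7800GTX: 256-bit, 600 MHz (1200 effective)
rsx = gddr3_bandwidth_gbs(128, 1400)  # RSX: 128-bit, 700 MHz (1400 effective, rumoured)
print(gtx, rsx)  # 38.4 GB/s vs 22.4 GB/s
```

By these numbers RSX actually lands a bit above half of the 7800GTX figure, thanks to the higher rumoured memory clock.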
 
psp111 said:
Alas, you can guess user names and passwords till you're blue in the face.

Even with a valid username and password, the site will only accept logins from a range of IPs that the developer must pre-register with Sony.

Dammit! Well, three more months of waiting for me until I hear 1% of anything from Sony about the vaporware PS3. :devilish:
 
Vysez said:
More ROPs?
With a 128-bit memory bus, what would be the point?
Actually, if anything, they should disable 8 ROPs, since RSX only has half the memory bandwidth of the 7800GTX; it could also help with yields.

If you review my post, I think you'll find it (I hope) in the spirit of London Boy's quoted post ...
 
Edge said:
It just amazes me also all this talk with a 550 MHz G70 not being good enough. Such a chip is a monster, and to see what developers will get out of it over the years will be interesting to say the least.

Flagship PC GPU "enough"? Are you smoking crack?! We need Quad SLI! ;)

I think it says a LOT about Sony's commitment to developers and the quality of the PS3 that they would contract Nvidia for their very best GPU. Until the X1900XT(X) was released a month or two ago, the 7800GTX 512MB was arguably the best GPU on the market (and even now it still has some advantages, like TMUs and ROPs). Best as in you could not get anything better in a single chip regardless of how much money you had.

I could rant on this specific point, covering everything from the original press release last year to the fact that at some point Sony is going to need general consumers to be able to afford the PS3. Sticking in CELL plus a G80-ish GPU is just nuts. Consoles are as much about install base and developer support as they are about the hardware.

I would hope people would be pretty stoked with a G70, and if it were not for some of the advanced features in Xenos (which are important for MS's Vista plans, as they view DX10 as a "platform") I think there would be no doubt people would be ecstatic. Or not.

I'd be happy with an extra Quad and the ability to do HDR+AA at the same time.

While AA+HDR is a great feature, the problem is that even if the ROPs support it, the question then becomes bandwidth. Unless NV incorporated something like ATI's FP10 blending (woot!), it would be a pretty useless feature in most scenarios due to the significant impact HDR has on performance.

But FP16 blending with shader based effects like DOF and motion blur should be more than adequate in most scenarios. Yeah, it would be nice to have it like Xenos does, but that is how the cookie crumbles sometimes when dealing with tradeoffs.
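To put some rough numbers on why HDR is so bandwidth-hungry, here's a back-of-envelope sketch. Every figure in it is an illustrative assumption (720p, 60 fps, overdraw of 3, blending counted as one read plus one write per pixel), not a measured workload:

```python
# Illustrative framebuffer traffic for FP16 HDR vs plain 32-bit colour.
# Assumptions (not measurements): 720p at 60 fps, overdraw of 3,
# alpha blending counted as one read plus one write per pixel.
WIDTH, HEIGHT, FPS, OVERDRAW = 1280, 720, 60, 3

def framebuffer_gbs(bytes_per_pixel, accesses_per_pixel=2):
    pixels_per_second = WIDTH * HEIGHT * FPS * OVERDRAW
    return pixels_per_second * bytes_per_pixel * accesses_per_pixel / 1e9

ldr = framebuffer_gbs(4)    # RGBA8: 4 bytes per pixel
hdr = framebuffer_gbs(8)    # FP16 RGBA: 8 bytes per pixel
print(ldr, hdr)  # FP16 exactly doubles the colour-buffer traffic
```

The absolute numbers look small because this ignores texture reads, Z traffic and AA, but the ratio is the point: FP16 doubles every colour-buffer access, which is exactly where a 128-bit bus hurts.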
 
stuart_r said:
does anyone know what the TFLOP performance of the G70 is?

G70 @ 550MHz is something like 1.8 TFLOPs (if memory serves; that figure includes fixed function), or ~255 GFLOPs for 32-bit programmable shaders. That is what my memory tells me, but I could be mistaken.
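The ~255 GFLOPs figure can be reconstructed with the usual back-of-envelope counting: a MADD as 2 flops per component, 24 fragment pipes with two vec4 MADD ALUs each, and 8 vec4+scalar vertex units. These are the commonly cited G70 numbers, not an official breakdown:

```python
CLOCK_HZ = 550e6  # the rumoured RSX clock

# Fragment shaders: 24 pipes, each with two 4-wide MADD ALUs.
# A MADD (multiply-add) counts as 2 flops per component.
fragment_flops_per_cycle = 24 * 2 * 4 * 2   # 384

# Vertex shaders: 8 units, each a vec4 + scalar MADD (5 components).
vertex_flops_per_cycle = 8 * 5 * 2          # 80

total_gflops = (fragment_flops_per_cycle + vertex_flops_per_cycle) * CLOCK_HZ / 1e9
print(total_gflops)  # ~255 GFLOPs of programmable shading at 550 MHz
```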
 
Vysez said:
Actually, if anything, they should disable 8 ROPs, since RSX only has half the memory bandwidth of the 7800GTX; it could also help with yields.
Disabling ROPs would disable shading processors as well, since the two are not decoupled (at least in NV's design). I don't see how that would be beneficial to the console...
 
Guden Oden said:
Disabling ROPs would disable shading processors as well, since the two are not decoupled (at least in NV's design). I don't see how that would be beneficial to the console...

I thought ROPs were decoupled in the GF7 series, e.g. the 7800GTX has 24 pixel shaders, 24 TMUs, but only 16 ROPs. But maybe I flip-flopped a number there; that is what I remember remembering when I used to remember.
 
Guden Oden said:
Are you sure? I was certain 7800 had equal number of shaders/ROPs... Oh well, ignore my comment then! :D


Acert93 said it. 16 ROPs, 24 fragment pixel shaders


Since RSX has a 128-bit bus, I'm guessing only 8 to 12 ROPs and 12 to 16 fragment pixel shaders will be active?
That might account for the redundancy.
 
Megadrive1988 said:
Acert93 said it. 16 ROPs, 24 fragment pixel shaders


Since RSX has a 128-bit bus, I'm guessing only 8 to 12 ROPs and 12 to 16 fragment pixel shaders will be active?
That might account for the redundancy.

How big are ROPs? (transistor count wise)

Are they even a practical thing to bother with for redundancy?

And... doesn't the fact that it also has access to the XDR (making its total available bandwidth something like ~45 GB/s, best case) mean that its ROPs have to deal with more than just a 128-bit bus? I'm not sure, but it seems that limiting it to 8 ROPs because it only has 128 bits to the GDDR3 might hinder its ability to use its other path effectively (assuming the 8 ROPs get mostly consumed by just the GDDR3). It might not even be an issue, depending on how little the XDR ends up being used for GPU work; it's hard to predict future usage of such a fat pipeline between GPU and CPU/XDR.

Maybe 12 ROPs would be sufficient? (leaving 4 for redundancy, or just completely removing them?)
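On the ~45 GB/s figure, here's how the tally works out if you add the rumoured numbers together. Both the GDDR3 clock and the FlexIO figures below were speculation at the time; none of this is official:

```python
# Speculative peak-bandwidth tally for RSX; every figure here is a rumour
# from the time of this thread, not an official spec.
local_gddr3  = (128 / 8) * 1400e6 / 1e9  # 22.4 GB/s: 128-bit at 1400 MHz effective
flexio_write = 20.0                      # GB/s, RSX -> CELL/XDR (rumoured)
flexio_read  = 15.0                      # GB/s, CELL/XDR -> RSX (rumoured)

best_case = local_gddr3 + flexio_write   # local memory plus one FlexIO direction
print(best_case)  # ~42 GB/s, in the same ballpark as the ~45 GB/s quoted above
```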
 