RSX redundancy (I included an option for everyone).

Given RSX's "136 shader ops per clock", what do you think its redundancy scheme is?

  • RSX is G70 with Pure Video and legacy hardware disabled for redundancy.

    Votes: 9 15.3%
  • RSX is actually based on nVidia's upcoming 32 pixel pipe part and has 2 quads disabled.

    Votes: 9 15.3%
  • RSX is only based on G70's [i]architecture[/i], so probably has a different layout and redundancy.

    Votes: 12 20.3%
  • FP16 normalize is not part of the "136 shader ops per clock" - RSX isn’t an 8+24 pipe configuration

    Votes: 3 5.1%
  • Sony’s counting MADDs (2ops/clk per component) as shader ops - only a handful of pipes are enabled

    Votes: 2 3.4%
  • *Shrug* Who knows?

    Votes: 18 30.5%
  • Other.

    Votes: 6 10.2%

  • Total voters
    59

ultimate_end

Newcomer
RSX redundancy

This is something that has bugged me for ages, ever since Ken Kutaragi made the comment that RSX would also include redundancy measures to improve yields - just as Cell does with its disabled SPE. I keep wondering what these redundancy measures are. The topic of RSX redundancy has been briefly touched on by people here before, but I've been waiting all this time for someone to make an actual thread of its own (I'm not a very pro-active person). So I thought, what better way to discuss this than with a poll?

Just for the record, I've always thought option 1 was a good idea, but I'm unsure of how effective it would be. But I really want to know what everyone thinks, even if it's very much just speculation at this point. If I haven't been very clear or if you don't know what a FP16 normalize is (I don't really either), please vote anyway. Also, any further speculation is very welcome.

Where appropriate, for the purpose of this poll I have made the following assumptions:

1. RSX is directly based on an nVidia G7x GPU.

2. KK isn't BS-ing about RSX redundancy :smile:.

3. The FP16 normalize operation has been counted as part of RSX's "136 shader ops per clock" (E3 presentation), giving a maximum of (2x vec2) + FP16nrm + (2x vec2), or (vec3 + scalar) + FP16nrm + (vec3 + scalar), operations per clock, i.e. 5 ops/cycle per pixel shader.

4. Vertex shaders represent 2 shader ops per clock each.

5. I don't know what a normalize operation is, or what it is used for, so I'm not sure whether it should be classified as a "shader op" or not. I only know that it runs in parallel with the main shader ALU(s), on the first shader unit in each pixel pipe.

6. AFAIK the "mini-ALUs" are there for shader model backwards compatibility and as such do not run in parallel with the main ALUs they are attached to (thereby not adding to maximum shader ops per clock). Please Dave or somebody correct me if I'm wrong about what these mini-ALUs do.
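For what it's worth, assumptions 3 and 4 do reproduce the "136 shader ops per clock" figure if you assume a full G70/7800GTX-style layout of 24 pixel pipes and 8 vertex shaders (the pipe counts are my assumption, not anything Sony has confirmed). A quick sanity check:

```python
# Sanity check of the "136 shader ops per clock" figure under
# assumptions 3 and 4, assuming a G70/7800GTX-style layout.
PIXEL_PIPES = 24      # assumed, as on 7800GTX
VERTEX_SHADERS = 8    # assumed, as on 7800GTX

# Assumption 3: per pixel pipe, 2 ops (first shader unit) +
# 2 ops (second shader unit) + 1 FP16 normalize = 5 ops/clock.
OPS_PER_PIXEL_PIPE = 2 + 2 + 1

# Assumption 4: each vertex shader counts as 2 shader ops per clock.
OPS_PER_VERTEX_SHADER = 2

total = (PIXEL_PIPES * OPS_PER_PIXEL_PIPE
         + VERTEX_SHADERS * OPS_PER_VERTEX_SHADER)
print(total)  # 120 + 16 = 136
```

Drop the FP16 normalize from the count (option 4 in the poll) and you get 112 instead, which is why whether it's a "shader op" matters so much here.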

---

Here is a diagram of G70's pixel shader for reference (hijacked from Dave's G70 article :D):

[Image: ps.gif - G70 pixel shader pipeline diagram]



Disclaimer: Option 5 is just a little joke. Otherwise, why would Sony put an SLI 6800 Ultra setup in the PS3 devkits when a single 6200 or 6600GT would do? :oops:
 
Last edited by a moderator:
Voted "other" because then it can't be wrong, methinks..

EDIT

Oh and btw, for the hundredth time, the "136 instructions" and all that was just nVidia MEGA PR for the 7800GTX!

Mark those words
 
Last edited by a moderator:
overclocked said:
Voted "other" because then it can't be wrong, methinks..
Well it's not like there's money riding on this or anything...:D
Oh and btw, for the hundredth time, the "136 instructions" and all that was just nVidia MEGA PR for the 7800GTX!

Mark those words
Oh, I definitely wouldn't take anything nVidia says as gospel either. But sadly it's all we've got to go on for now...
 
If I recall, some nVidia PR guy said that by the time the PS3 gets released, PC GPUs would have only nearly closed the gap.
Meaning RSX would still be more powerful than the upcoming GPUs (meaning the GeForce 8800 range).
 
weaksauce said:
How about it being 128-bit?
Do you mean cutting support for 128-bit HDR etc.? I guess they could do things like reducing FP32 support in parts of the pipeline. In light of recent developments, there seems to be some unnecessary stuff in there. Not being an expert, I don't know how it would be done though. In any case, it wouldn't be very good for the old PR machine, as RSX would seem less capable than the 7800GTX, for example.
 
Last edited by a moderator:
ultimate_end said:
Do you mean cutting support for 128-bit HDR etc.? Yes they could do all sorts of things like reducing FP32 support throughout the pipeline etc. Particularly in light of recent developments, there seems like there's a lot of unnecessary stuff in there. But not being an expert I don't know how easy this would be for them to do (unless RSX really is customised - as opposed to just being a G70). Not very good for the old PR machine though, as RSX would seem less capable than 7800GTX for example. But overall, I think it's a good idea.

No the RSX only has a 128-bit memory bus.
 
Does anyone actually know what the mini-ALUs do? I can't seem to find any info on them. And how do they help?

thanks
 
They're for PS1.4 modifiers, if my brain serves me correctly.
 
Guilty Bystander said:
If I recall some nVidia PR guy said by the time the PS3 gets released the PC GPU's would have nearly closed the gap.
Meaning the RSX would be still more powerfull than the upcoming GPU's (meaning the GeForce 8800 range).

Yes I vaguely remember something along those lines. I also remember that David Roman said that at the time of release of RSX, it will be the most powerful, most feature rich GPU nVidia has ever produced. That has always puzzled me, especially now with the release of the 7800GTX512, which is already running at 550MHz. RSX is still 3+ months from release. The optimist in me wants to believe that RSX will be fully customised, using nVidia tech. If it is, then it's very hard to predict what redundancy there will be. That's why I want people's speculation, even if it's wild speculation.
 
weaksauce said:
No the RSX only has a 128-bit memory bus.
Oh I get it. Yes, half of the crossbar memory interface disabled like in the low/mid range desktop parts. Good point.
 
Last edited by a moderator:
The external memory interface isn't likely a candidate for redundancy, only transistor budget. If they needed 256-bit, they'd build it in there.

The redundancy is likely just the same as on the desktop NVIDIA parts. There'll likely be one more VP and one more FP quad built in, and that's it.
 
ultimate_end said:
Yes I vaguely remember something along those lines. I also remember that David Roman said that at the time of release of RSX, it will be the most powerful, most feature rich GPU nVidia has ever produced. That has always puzzled me, especially now with the release of the 7800GTX512, which is already running at 550MHz. RSX is still 3+ months from release. The optimist in me wants to believe that RSX will be fully customised, using nVidia tech. If it is, then it's very hard to predict what redundancy there will be. That's why I want people's speculation, even if it's wild speculation.

Well, think about the first Xbox: wasn't it a GeForce 2 at first, and then they got something like a Ti 4200 in the end? I'm not saying it wouldn't be nice with G70@550MHz as a minimum, but it's already been dissected to death, so we more hardcore geeks want something
at least a "little" different from G7x! Even if the chip were worse in real apps! :)
 
SCE is never one to take the easy solution. If RSX is G70@550MHz and PS3 is due in March, nVidia's G71@700MHz due in January will beat RSX. That does not tie in with the hints given.

"The two products share the same heritage, the same technology. But RSX is faster," said Kirk.

But for how much longer, we wonder? With the PlayStation 3 not due until March 2006, won't the next generation of PC graphics be here by then? "At the time consoles are announced, they are so far beyond what people are used to, it's unimaginable," David comments. "At the time they're shipped, there's a narrower window until the next PC architecture." In other words, RSX looks incredible now, but when it launches, there'll be a smaller time until PC looks better.

"However, what consoles have is a huge price advantage." And 'huge' is the appropriate word: pricing is still to be announced by Sony, but Playstation 3 could debut at £399 - the price of a 7800GTX board, yet offering so much more.

Whilst their relationship with Microsoft has become publicly tenuous, what about NVIDIA's relationship with their new console partner?

"So far our relationship with Sony has been great. We have a much closer relationship and share a much broader vision for the future of computing and graphics.

"When we came together a few years ago, we found a vision and experience that we shared. It sounds cliched, but Japanese companies are often trying to create a vision and make the technology follow that, not the other way round. We believed in that."
 
Rys said:
The redundancy is likely just the same as on the desktop NVIDIA parts. There'll likely be one more VP and one more FP quad built in, and that's it.
But the top of the range desktop parts don't have any redundancy, per se. The entire die must be working - at least all the vertex and fragment pipes. That's what keeps the cost up. And why the 550MHz+ 7800GTX is so rare.

7800GTX minus a fragment quad is 7800GT.

Jawed
 
That's what I meant. Making a 7800 GT from a full G70 is the analogy to RSX and its production. There's only one product to be made from the die, and the extra VP and FP quad will likely be all the redundancy that's built in, given NVIDIA's desktop chip history. Just because there's only one SKU doesn't mean it's done any differently, IMO.
 
tema said:
SCE is never the one to take the easy solution. If RSX is G70@550mhz and PS3 due in March, nvidia G71@700mhz due in Jan will beat RSX. It does not tie in with the hints given.

Aren't the G71 and the G72 like the 7600GTs and 7200s of the G70 family?
 
Whether G71 is the true 32-pipe (rumored) chip designation or not, that arrangement is what everyone's referring to when they say G71. I've heard it could be the 7600 as well, but I'm still referring to it as G71 because hey, gotta call it something. ;)
 