Could be more RSX info...

Xen said:
Jen-Hsun Huang
"the heart and soul of the rsx, the programmable shading processors, the rsx can process 136 shader operations simultaneously in one clock"

THIS IS QFS: "the rsx can process 136 shader operations simultaneously in one clock"

WTF more do you need? Listen, the RSX is what it is, but don't try to go down a track that will not match what it will actually be just because you can't parse the English language.

This is exactly the same as the G70 in this regard.

Now when I read this sentence it's obvious he's trying to get across the importance of these shading processors, going as far as to say they are "the heart and soul of the RSX." Then an idea of performance: 136 shader operations simultaneously in one clock.

No, he's just filling space. They are important, though; without them the RSX is useless.
Jen-Hsun Huang
"we want to achieve that level of realism in order to do that we've incorporated a farm of programmable shading processors."


To me this means there will be a plurality of said shading processors. It's up for debate, but as these are "the heart and soul of rsx" I feel this is related to the first sentence. If you don't agree, that's up to you; the first sentence implies a plurality of them anyway.

The G70 has either 2 or 32 shading processors depending on how you want to count.
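For reference, the breakdown usually offered for that 136 figure (the per-unit op counts are NVIDIA's marketing math, and pairing them up this way is my assumption, not something from the keynote):

```python
# Commonly cited decomposition of G70's "136 shader ops per clock" claim.
# The ops-per-unit figures are marketing math, not measured throughput.
pixel_units, ops_per_pixel_unit = 24, 5    # 24 pixel-shader pipelines
vertex_units, ops_per_vertex_unit = 8, 2   # 8 vertex-shader units

total_units = pixel_units + vertex_units   # the "32 shading processors" count
total_ops = (pixel_units * ops_per_pixel_unit
             + vertex_units * ops_per_vertex_unit)  # the "136 ops/clock" count
print(total_units, total_ops)  # -> 32 136
```

Counted that way, "2 or 32" just depends on whether you count the two types of unit or the 32 individual units.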

So there you have it: two opposing views, one offering much more performance than the other (closer to the performance needed for Killzone, Motorstorm et al.). Anyway, it should be fun to see the final specs; hope all this heated debate was worth it.

No, there are not 2 opposing views. There is one correct interpretation of the data and one incorrect interpretation of the data. You have decided to go gangbusters on the incorrect interpretation. Your interpretation can be proven wrong in many ways: the actual text of the speech and slides, engineering analysis of area and power, etc. Give it up already.

Aaron Spink
speaking for myself inc.
 
aaaaa00 said:
Rambus is not an alternative to embedded memories. After all, it is still an external memory bus, so it won't give you the same absurd performance advantage that on-die memory can give you.

With eDRAM you are trading logic for memory unless you go with a secondary chip like Xenos has. With Rambus you get to have all your GPU die size dedicated to logic.
 
Brimstone said:
With eDRAM you are trading logic for memory unless you go with a secondary chip like Xenos has. With Rambus you get to have all your GPU die size dedicated to logic.

Which is no different to GDDR3.
 
aaronspink, why are you being so rude? Xen is not being rude, so I wonder what the source of your frustration is. You can't handle that someone is in disagreement with you?

This is not the first time your reaction has been out of line for this forum, where most people do debate things in a reasonable manner. Questioning people's intelligence is also out of line.

Moderator?
 
Edge said:
aaronspink, why are you being so rude? Xen is not being rude, so I wonder what the source of your frustration is. You can't handle that someone is in disagreement with you?

I'm not being rude, I'm being blunt. Many of us have nicely told xen he's out in left field. He hasn't taken the hint, so I'm telling him directly. He's been corrected seven ways to Sunday and still keeps going out to left field.

Aaron Spink
speaking for myself inc.
 
Edge, all of us enjoy beating our heads against a brick wall once in a while. :) You have to admit that arguing against this kind of evidence can be a bit frustrating, especially if you work in the industry (as I believe aaron does) and the person you're debating doesn't (as Xen appears not to). The first three paragraphs in that post basically negate themselves, and the fourth expects RSX to offer whole integer multiples of G70 performance.

Yeah, "keeping it real" doesn't make for the best learning environment, but that's a two-way street.

To expect RSX--a console GPU with probably greater affordability and heat concerns than a high-end PC chip--to basically offer multiples of performance seems a bit far-fetched, no matter how well NV's been executing lately. I really don't expect either IHV to be too far off the other with any given GPU generation.

Xen, those Jen-Hsun quotes you posted seem to remove all doubt that RSX isn't hiding a whole other GPU in its muscley folds. RSX's rumored "32 pixel pipes" and this mysterious(ly puzzling) "farm of FP shaders" sound interchangeable: "32" for "farm," "pixel pipe" for "FP shaders." Voila.

n*136 ops/cycle on top of whatever PS & VS units the supposedly conventional part of RSX will be packing, all drawing from a 128-bit bus, sounds like, "Bottleneck off the port bow!"
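To put a rough number on that worry, here's the usual back-of-envelope for a 128-bit GDDR3 interface, using the 700MHz clock mentioned later in the thread (figures are illustrative, not confirmed RSX specs):

```python
# Back-of-envelope bandwidth of a 128-bit GDDR3 bus (illustrative figures only).
bus_width_bits = 128
clock_hz = 700e6          # the 700MHz GDDR3 clock mentioned later in the thread
transfers_per_clock = 2   # DDR signalling

bandwidth_gb_s = (bus_width_bits / 8) * clock_hz * transfers_per_clock / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")  # -> 22.4 GB/s
```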
 
Jov said:
Didn't a dev on this board hint at another 128-bit bus' worth of bandwidth reserved for the OS? Maybe I misread the post, but it sure sounded like a hint.
Faf was just kidding when he said that half the graphical BW would be reserved for the OS. ;)
 
Vysez said:
Faf was just kidding when he said that half the graphical BW would be reserved for the OS. ;)

Fair enuff! It sounded weird to have an OS taking up such a resource, but given it was Sony, anything was possible.

Could the official 128-bit bus BW statement from Sony be a decoy to throw the competition off guard?
 
Jov said:
Could the official 128-bit bus BW statement from Sony be a decoy to throw the competition off guard?

Well, everything is possible, but what good would that do? It's not as if MS can change the specs now; hell, even when they showed the RSX specs, MS could hardly have made any changes to their design without MAJOR delays...
 
Jov said:
Could the official 128-bit bus BW statement from Sony be a decoy to throw the competition off guard?
256-bit busses limit your ability to scale costs down. Although the process scales down, the pads (where the chip connects to the flip-chip connections) don't, and as 256-bit busses require lots of pins the chips quickly become pad limited, hence cannot scale down that far - the smallest 256-bit graphics chips so far have been R430 & NV42, which are by no means "small".
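A toy illustration of that pad-limit effect (every number below is invented purely to show the trend; these are not real die or pad figures):

```python
# Toy model of pad limiting: logic area shrinks with each process node, but the
# pad/PHY floor set by the bus width does not. All numbers are invented.
logic_mm2_at_130nm = 150.0
pad_floor_mm2 = {128: 40.0, 256: 80.0}  # assumed minimum die area per bus width

for node, shrink in [("130nm", 1.0), ("90nm", 0.5), ("65nm", 0.25)]:
    logic = logic_mm2_at_130nm * shrink
    for width, floor in pad_floor_mm2.items():
        print(f"{node}: {width}-bit bus, logic {logic:.0f} mm^2, "
              f"die floor {max(logic, floor):.0f} mm^2")
```

In this made-up scenario the 128-bit part keeps shrinking with the logic, while the 256-bit part hits its pad floor and stops getting cheaper.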
 
Dave Baumann said:
256-bit busses limit your ability to scale costs down. Although the process scales down, the pads (where the chip connects to the flip-chip connections) don't....

Speaking of which, someone (I suppose this would be chip fabricators and/or chip packagers) really needs to come up with a technology that DOES allow 256-bit bus chips to scale down to smaller sizes... I'm very disappointed that 256-bit PC cards seem unable to break the $199 barrier in any meaningful way. (And that 128-bit chips have not overtaken 64-bit chips in the value segment...)
 
My Views...

At the moment it's all guesswork; we have pieces of evidence that contradict each other. Now I believe, IMO, that RSX has got something magic up its sleeve for a few reasons:

1. Sony doesn't know the word "standard"; everything they do is custom.
2. They have already stated they have been working with Nvidia for over 2 years now, and I doubt it would take longer than 9 months to remove the crap from G70 and add in the Cell interface.
3. If they have removed all the crap from G70, that leaves some spare transistors to play with.

I say we all go to DeanoC's house and make him tell us about RSX. ;)

I say we wait till February to hopefully get some real info.
 
Brimstone's right, though, that the RSX seemingly has much to gain from going XDR vs GDDR-3. We've been told that GDDR-3 is what it gets, but if there were any way to have changed that in the last six months, I would hope that Sony would have pursued it.

I've personally taken Sony's announcements at face value and am expecting GDDR-3 (though perhaps clocked higher than the original 700MHz), yet on a Rambus bus the pin-for-pin value of XDR seems so decisively clear that I wonder what the constraining factors in not going that route would be.

Sony seems comfortable enough with Rambus memory, now and in the past, that I'd be surprised if it were simply the expected cost differential of XDR vs GDDR-3, but who knows...
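Rough numbers behind that pin-for-pin point. The per-pin rates here are assumptions for illustration: XDR at the 3.2GHz effective data rate quoted for the main memory pool, GDDR-3 at the 700MHz (1.4 Gbps/pin effective) figure above:

```python
# Rough pin-for-pin comparison of XDR vs GDDR-3 (assumed data rates).
def bandwidth_gb_s(data_pins, gbps_per_pin):
    return data_pins * gbps_per_pin / 8  # bits/s -> bytes/s

print(bandwidth_gb_s(128, 1.4))  # 128-bit GDDR3 @ 700MHz (1.4 Gbps/pin) -> 22.4 GB/s
print(bandwidth_gb_s(64, 3.2))   # 64-bit XDR @ 3.2 Gbps/pin             -> 25.6 GB/s
print(bandwidth_gb_s(128, 3.2))  # hypothetical 128-bit XDR              -> 51.2 GB/s
```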

PS - since it's 'public' info might as well take a shot... can any of the PS3 devs here deny or confirm that RSX is still on GDDR-3?
 
a new nvidia patent
[attached image: rs.JPG]
 
!eVo!-X Ant UK said:
1. Sony doesn't know the word "standard"; everything they do is custom.
Wasn't PSX very generic?
2. They have already stated they have been working with Nvidia for over 2 years now, and I doubt it would take longer than 9 months to remove the crap from G70 and add in the Cell interface.
That RSX is more than just a 7800 chip with a different interface is plausible and IMO likely. That it's insanely more powerful than the top-of-the-line GPU SLI'd 4 ways isn't.
3. If they have removed all the crap from G70, that leaves some spare transistors to play with.
A few million 'spare' transistors isn't going to account for a 4x increase over G70's performance. Extra features are likely, but super-uber performance isn't. A quad for redundancy, adding nothing to performance but helping yields, is one probability, given we've been told redundancy will feature in RSX for that very purpose.
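For anyone wondering how a spare quad buys yield without buying performance, here's a toy Poisson-defect sketch. The defect density, areas, and quad count are all made up; the setup just mirrors the "one extra quad for redundancy" speculation:

```python
# Toy Poisson yield model: one spare quad helps yield but adds no performance.
# Defect density and areas are invented for illustration only.
from math import comb, exp

def die_yield(quads, spares, quad_area_cm2, shared_area_cm2, defects_per_cm2):
    p_quad_ok = exp(-defects_per_cm2 * quad_area_cm2)      # P(a given quad is clean)
    p_shared_ok = exp(-defects_per_cm2 * shared_area_cm2)  # P(shared logic is clean)
    # A die is sellable if the shared logic is clean and at most `spares` quads are bad.
    p_quads_ok = sum(comb(quads, k) * (1 - p_quad_ok) ** k * p_quad_ok ** (quads - k)
                     for k in range(spares + 1))
    return p_shared_ok * p_quads_ok

args = dict(quads=7, quad_area_cm2=0.2, shared_area_cm2=1.0, defects_per_cm2=0.5)
print(die_yield(spares=0, **args))  # all 7 quads must work
print(die_yield(spares=1, **args))  # one quad is redundant -> noticeably higher yield
```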
 