ATI - PS3 is Unrefined

MrWibble said:
XGPU may have been in design for longer and be more "consoleish" than RSX, but I think it's just going to mean differences in how developers target it - and not all of the effects will be positive. For those targeting cross-platform, it may come off worse (just as people are arguing that Cell will for PS3 - though I might argue that going multi-core PPC is also not exactly like writing for a P4).

Maybe the fact that MS writes the API can help; there is some convergence between the Xenos-specific DirectX API and DirectX 10.
I agree with you about the CPU, anyway.
 
MrWibble said:
You're right - 12 to 24 months after any part launches, it'll be eclipsed by the new kids on the block. The difference is that in the PC space things get replaced, whereas with a console we're stuck with it for 5 years, and comparing it to PC tech makes it look increasingly obsolete.

But I'm going to stick my neck out here and suggest that the XGPU is going to suffer exactly the same fate at pretty much the same time. While it may be more radically different from a PC GPU than RSX is (and please note, I really don't know whether that's true or not), even if it is, it's not using some kind of magic technology that's going to keep it ahead of the curve for any length of time.

Of course all console parts will be eclipsed by their PC counterparts in fairly short order, but that's not the point - the point is that parts designed for the PC are designed for that short lifespan, not for the long haul, and that has ramifications on what is designed in there. Although I doubt you'll hear many say it now, prior to their involvement with PS3 NVIDIA openly stated that they design in such a fashion. However, is that necessarily the best approach when designing a part that will last 4-5 years?

Xenos doesn't really fit anywhere as a PC part; it clearly sits well in a closed-box environment, and I think that, given its time-to-market and cost targets, they have made choices and compromises based on that target. Its architecture is a year ahead of when ATI will implement something similar in the PC space, but once it does come to the PC it will need to be altered quite significantly to meet the demands of the API it's been designed for. Conversely, while, from a theoretical standpoint, the dynamic branching granularity of Xenos would appear to be fairly small, it is still slightly less efficient than the PC part that was supposed to appear prior to its release.

MrWibble said:
XGPU may have been in design for longer and be more "consoleish" than RSX, but I think it's just going to mean differences in how developers target it - and not all of the effects will be positive. For those targeting cross-platform, it may come off worse (just as people are arguing that Cell will for PS3 - though I might argue that going multi-core PPC is also not exactly like writing for a P4).

Multicore will be the way of life everywhere; AMD X2s and dual-core Pentiums will be the order of the day next year from the CPU vendors.

I would suggest that the biggest issues with cross-platform portability will come not from the different platforms but from specific differences between the vendors' capabilities that manifest themselves in the PC, as is already an issue (the DX "backdoor" for NVIDIA's shadowing). It may be the case that if devs start targeting large-scale vertex processing on Xenos, this could cause issues on non-unified processors.

Entropy said:
It seems to me the factual statements of Richard Huddy were flat-out wrong, or lies, depending on your view of him. Lies, I'd say; I think it can be assumed that he knew full well about the E3 demonstrations.
Richard's European Dev Rel, and as such wasn't even at E3 - I doubt he's followed the minutiae of the undertakings there, and the subsequent "who did what" that eked out over several weeks after the event.
 
liolio said:
Maybe the fact that MS writes the API can help; there is some convergence between the Xenos-specific DirectX API and DirectX 10.
I agree with you about the CPU, anyway.

Of course Sony also have a standard API this time around - an OpenGL variant - and also, being a fixed platform, they can expose the hardware features much more directly than is possible on a PC, where things have to be abstracted.

This is another reason sticking a PC-like GPU in a console would not mean entirely predictable or comparable behaviour. In a PC, most GPUs are actually limited slightly by not having all their features exposed. To a certain degree, what you see in PC graphics is the lowest common denominator (albeit amongst only a few high-performing targets). Lose the need for retargetable applications and it's a different ballgame.
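The lowest-common-denominator point can be sketched as a toy capability intersection. The vendor names and feature strings below are purely hypothetical placeholders, not real capability bits from any driver or API:

```python
# Toy model: hypothetical per-vendor feature sets (names are illustrative only).
vendor_features = {
    "vendor_a": {"sm3", "vertex_texture_fetch", "fp16_blend", "depth_bounds"},
    "vendor_b": {"sm3", "fp16_blend", "fetch4", "hdr_msaa"},
}

# A retargetable PC renderer effectively codes to the intersection...
common = set.intersection(*vendor_features.values())

# ...while a closed box exposes the full feature set of its one GPU.
closed_box = vendor_features["vendor_a"]

print(sorted(common))               # what a portable code path can rely on
print(sorted(closed_box - common))  # extras a fixed platform can exploit
```

In this sketch the portable path can only count on the two shared features, while the fixed platform gets to use everything its single GPU offers - which is the gap the post is describing.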

Overall MS may have a platform which is slightly easier to work on than Sony's (and that's far from proven anyway), but this time around the margin is much closer. It didn't really work out for them last time anyway, so why would it be important now? I really think the only true advantage X360 has is the "x" months between its launch and PS3 coming out. What they do with those is important - and if all they're going to do is trash-talk the opposition, maybe they don't have a lot of cards left to play. Fortunately this is just an ATI tech-guy coming out with this. If it was Allard it'd be a lot more significant.
 
I said this once before - I think the best chance that there are goodies in RSX we don't know about yet is that there are goodies in G70 we don't know about yet.
 
Goodies are, well, good. The name is a tip-off, you see. :D

Edit: For instance, I seem to recall a certain someone who at one time thought that G70 was hiding a whole 'nother two quads. And of course we've never seen a die shot worth a damn. So there could be some unexposed goodness there, waiting for its unveiling as RSX. Though I would agree this theory would still tend to limit somewhat just how large a transistor-wise investment whatever theorized goodies we're talking about could be.
 
Xbox 1 was "unrefined" as well, and look how its games turned out. PS3 will be fine, as will the 360. I am going to laugh when, all said and done, games on both systems are pretty much even.
 
I think one of the biggest differences between RSX and Xenos is cost.

Sony made an unexpected switch to a conventional GPU that was not entirely planned. They bought in the best GPU technology they could get their hands on in the quickest time.

If people expect extra goodies in RSX then they don't realise what an undertaking it is even to do a respin (3-month turnaround?)... even taking unneeded functions like PureVideo out of RSX is a massive undertaking in such a short period of time. I am sure the hardware engineers do not go, "where is the scalpel, Harvey?" and voilà... PureVideo removed.
 
Tahir said:
Sony made an unexpected switch to a conventional GPU that was not entirely planned. They bought in the best GPU technology they could get their hands on in the quickest time.

Oh please.
 
Just one question for those who have knowledge: what kind of work could be done (given the time and resources) to make RSX more "refined"?

I guess fewer pixel pipes would be one, for bandwidth reasons; taking out legacy HW and video HW, and improving HDR performance, would be the ones they would try first.
 
That word "aspects" looks like a classic misdirection hedge to me, though. It could, for instance, be a reference to licensing tools, wedged into a statement to make it look like the cooperation was broader from the beginning.

Or not. But frankly that statement doesn't do much for me, one way or the other, on how long RSX was an actively cooperative venture before the announcement.
 
MrWibble,

nobody is arguing RSX will not do the job for PLAYSTATION 3, or that it will not manage to paint 2-3 pixels on screen per time-slice; the question is whether, in a forward-looking architecture such as PLAYSTATION 3 (the XDR<->CELL Broadband Engine<->FlexIO very high-bandwidth set-up is to me something very exciting to think about), RSX will be holding the console back or not.

Talking about PSTwo, the forward-looking elements (VU1, VU0 to a somewhat lesser extent, the MMI extensions to the R5900i core, and the GS's e-DRAM, which guaranteed an awesome fill-rate to the GS - a part which in and of itself was exactly the opposite of forward-looking, yet still awesome in its own right... a graphics processor designed to be the absolute best in the evolutionary path that DX6-graphics/32-bit/64-bit-era 3D was tracing... in itself the GS is a marvel of engineering in a way; it is like the dream an engineer would have had designing a graphics processor in 1995-1996) did help a lot in keeping the platform very competitive with the more powerful consoles and PC hardware which showed up quite quickly (many of those features feel missing when using more PC-oriented designs... in the little time spent with PSP homebrew I surely did miss the ADC bit, for example).

Maybe it will not matter; there are plenty of PSTwo developers who will just shrug their shoulders if someone talks about the R5900i and say it is no biggie overall, that it could be worked around...

Will RSX be the same thing for PLAYSTATION 3 as the GS is for PSTwo (some other parts are fulfilling their legacy, well maybe with some improvements... ah, I'll let others do the rightful hands-on bitching I do not really have ;))? Will it be the technical marvel engineers were dreaming about in the years 2000-2001? If so, has the market evolved in 2006 as it did from 1995-1996 to 2000-2001?

Will Xenos help keep Xbox 360 more competitive as time goes by (as I think the CBBE will do for PLAYSTATION 3) than what RSX will do for PLAYSTATION 3?

Yes, we do not know all the details about RSX yet, so it is early to speak, but I will be frank: I do not expect HUGE changes from G70. I am talking about pixel batch size, which I do not see shrinking to Xenos' nor to R520's levels, to give one clear example.

It is true that, peak-performance-wise, every 12 months you see big changes, but sometimes risking some forward-looking features (such as an emphasis on dynamic branching latency and fill-rate efficiency [see the eDRAM daughter die]), proper of what many call "console oriented" designs, might help you to be competitive as programmers familiarize themselves with the tricks and tools of your platform.

Would PSTwo have shown the same graphical progression over the years if it had used a customized/adapted PC GPU based on a design shipped and sold around mid-1999, like an NVIDIA GeForce 256++/NV1A without e-DRAM and with normal DDR-SDRAM-based VRAM?
 
NaMo4184 said:
It has been proven over and over that Sony and NVIDIA have been working together for 3 years now.

RSX is not a last-minute effort...

Believe what you want ;).

Choose to believe in an "RS" before "RSX" or not ;).
 
pc999 said:
Just one question for those who have knowledge: what kind of work could be done (given the time and resources) to make RSX more "refined"?

I guess fewer pixel pipes would be one, for bandwidth reasons; taking out legacy HW and video HW, and improving HDR performance, would be the ones they would try first.

Given G70 to start with, I would want TurboCache added and tweaked to allow reads/writes from/to XDR, a tweaked/improved cache hierarchy added to cover the latencies of accessing data in XDR memory compared to local VRAM, and the pixel batch size dramatically lowered (32*32 is still huge if you want good dynamic branching performance, as the R520 vs G70 GPGPU tests showed).
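The batch-size point can be illustrated with a crude divergence model: assume that if any pixel in a square batch takes the costly side of a branch, the whole batch pays the costly path. The screen size, cost weights, and diagonal branch pattern below are invented for illustration; this is a sketch of why smaller batches waste less work, not a measurement of any real GPU:

```python
def shaded_cost(width, height, batch, branch_pixel, cheap=1, costly=10):
    """Model per-pixel shading cost when a GPU shades in square batches.

    Within a batch, if any pixel takes the costly branch, the whole
    batch pays the costly path - a crude model of SIMD branch divergence.
    branch_pixel(x, y) -> True if that pixel takes the costly branch.
    """
    total = 0
    for by in range(0, height, batch):
        for bx in range(0, width, batch):
            pixels = [(x, y)
                      for y in range(by, min(by + batch, height))
                      for x in range(bx, min(bx + batch, width))]
            diverged = any(branch_pixel(x, y) for x, y in pixels)
            total += len(pixels) * (costly if diverged else cheap)
    return total

# A diagonal split: roughly half the screen wants the costly path.
costly_region = lambda x, y: x > y

ideal = shaded_cost(256, 256, 1, costly_region)    # perfect per-pixel granularity
coarse = shaded_cost(256, 256, 32, costly_region)  # 32x32 batches (G70-like)
fine = shaded_cost(256, 256, 4, costly_region)     # much smaller batches

print(f"overhead at 32x32 batches: {coarse / ideal:.2f}x")
print(f"overhead at 4x4   batches: {fine / ideal:.2f}x")
```

Even with a single, perfectly coherent branch boundary, the coarse batches pay noticeably more wasted work along the edge; with incoherent branching spread across the screen, the gap between batch sizes grows much larger.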
 
"TurboCache" will already be there - its standard from NV44 on (assuming something in FlexIO doesn't prohibit it, though, but I doubt that). But, wouldn't you also want some cache locking as well?
 
Dave Baumann said:
"TurboCache" will already be there - its standard from NV44 on (assuming something in FlexIO doesn't prohibit it, though, but I doubt that). But, wouldn't you also want some cache locking as well?

Yes, to write directly from the SPEs ;).

Unfortunately I do not see it happening; I see SPEs DMA-ing into VRAM or into XDR memory and RSX reading from there. I think RSX's caches will be transparent to the SPEs and the PPE.
 
There are three types of people in this thread:

1. The people who believe (wish) that the PS3 will include a unique, fully customized version of a graphics chip that NVIDIA and Sony have been secretly working on for two years. It's not a G70 or a G71, they hope, but a G80 or G81 with Sony's special GPU magic, because we all know how well Sony designs GPUs. Those Cell grid patents were just something scribbled on one of Ken Kutaragi's napkins, coffee stains and all.

2. The people who believe (wish) that the PS3 has a crappy graphics card. Why???

3. The people that really don't know, but have realistic expectations for a game console that has to be small and less than $500.

I like to put myself in with the last group.

dukmahsik said:
Xbox 1 was "unrefined" as well, and look how its games turned out. PS3 will be fine, as will the 360. I am going to laugh when, all said and done, games on both systems are pretty much even.
You're going to laugh your ass off then.

dukmahsik said:
I think PS4 will be very, very powerful with Cell matured.
Cell will go the way of the Emotion Engine.
 
4. People who have bothered both to read AND understand the public patents and the talks/interviews from Sony and PS3 developers.

Everything you need to know about the Cell/RSX graphics system is out there right now. The details that aren't public are of little overall consequence.

This desperate need to believe 'Sony had to put a PC graphics card in the PS3' is pathetic and tiresome.
 