ATI - PS3 is Unrefined

Would the PSTwo have shown the same graphical progression over the years if it had used a customized/adapted PC GPU based on a design shipped and sold around mid-1999, like nVidia's GeForce 256++/NV1A, without eDRAM and with ordinary DDR SDRAM-based VRAM?

I think we could have gained a lot more insight into this matter if we had seen something like it before. Perhaps if ATi had ended up producing a GPU for one of the last gen consoles and paired it with eDRAM and an IBM processor and nVidia had used a slightly modified PC based GPU and put it into another console we could have had something to gauge things better by. Maybe that would have clearly demonstrated how vastly superior going with a custom designed chip is over using slightly modified versions of high end PC hardware.

Certainly it would have silenced all of the rabid loyalists who are trying to insinuate that somehow going with off the shelf parts is going to end up being anything but an absolute thrashing.
 
geo said:
For instance, this article: http://news.com.com/2100-1040-935595.html which identifies Sony as one of the first adherents to CG when it was first announced in 2002.

Geo, this merely states that a development house (SOE) will be using CG in their games.

Not a collaboration of Sony and Nvidia on a customized chip.

I feel that the length of time it's taking Sony and Nvidia to announce further details of RSX, and to get the final or reference toolset into developers' hands, bodes well for a more "refined" and customized GPU in the PS3.

Speng.
 
liolio said:
How much bandwidth do current (PC) CPUs have to the GPU? (PCI Express is 2x2GB/s? My memory fails me again.)
The GPU in the Xbox 360 can read from the L2 cache.
We know that the bandwidth is 10.6GB/s up + 10.6GB/s down to the northbridge/GPU.
Yes, the consoles have far more bandwidth twixt CPU and GPU than PCs do.

For current rendering techniques that "limited" PC bandwidth isn't a limitation at all.

Looking into the future, these consoles are meant to be the dawn of the "procedural" era - where geometry and texture data is generated on the fly, hence vastly greater quantities of both are transmitted from the CPU to the GPU. As far as I can gather there have been several false dawns for procedural graphics, but this time around it looks viable.
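
To make "procedural" a bit more concrete, here's a toy sketch of the CPU side of that idea (an illustration of the concept only, not code from any engine or SDK): the mesh is regenerated from a formula every frame, so the only vertex data that exists is whatever gets pushed across the CPU->GPU link that frame.

```c
/* Toy illustration of per-frame procedural geometry on the CPU.
 * Purely hypothetical: a real engine would stream the generated
 * buffer to the GPU over the CPU->GPU link instead of printing. */
#include <math.h>
#include <stdio.h>

#define GRID 64                     /* 64x64 vertex heightfield */

typedef struct { float x, y, z; } Vertex;

static Vertex mesh[GRID * GRID];

/* Rebuild the whole mesh from a formula each frame, so no static
 * vertex data ever has to be fetched from memory.               */
static void generate_waves(float t)
{
    for (int j = 0; j < GRID; ++j) {
        for (int i = 0; i < GRID; ++i) {
            Vertex *v = &mesh[j * GRID + i];
            v->x = (float)i;
            v->z = (float)j;
            v->y = sinf(0.3f * (float)i + t) * cosf(0.3f * (float)j + t);
        }
    }
}

int main(void)
{
    for (int frame = 0; frame < 3; ++frame) {
        generate_waves((float)frame / 60.0f);
        /* This is the point where a console would DMA or stream
         * the freshly generated buffer across the CPU->GPU bus. */
        printf("frame %d: generated %zu bytes of vertices\n",
               frame, sizeof mesh);
    }
    return 0;
}
```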

In terms of Cell->RSX, you have 20GB/s of bandwidth, roughly twice Xenon->Xenos.

Additionally, the architecture of XB360 does not provide for an explicit data flow from Xenos->Xenon. All such data flows are routed via memory - i.e. Xenos writes to memory, and Xenon then reads it. So while RSX->Cell is supported by a specific 15GB/s link, Xenos's data is forced to go via memory, and capped at 10GB/s.

A key difference between Xenos and RSX is that the former is designed to accomplish (some) procedural geometry tasks, which it can perform without using any of the 10.8GB/s bandwidth Xenon->Xenos.

And, clearly, Xenos is designed to accomplish all graphics post-processing effects (massive performance in the EDRAM), as opposed to the talk of Cell helping RSX with post-processing (which I think will be severely limited functionality, anyway).

With regard to the bus clock, and a guessed proportion of L2 locked for streaming to the GPU, can somebody calculate how much data can be sent to the GPU given the 10.6GB/s of upstream bandwidth available?
Hard to say, as it depends on how much geometry shading (e.g. creation of particles, level-of-detail adaptive tessellation, skinning) is performed on Xenos itself. A whole load of work that's traditionally done on the CPU can be done directly within Xenos.
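
This doesn't answer the locked-L2 part of the question, but as a crude upper bound using only the figures quoted in this thread (10.6GB/s Xenon->Xenos upstream, 20GB/s Cell->RSX), rather than official specs, the per-frame ceilings come out as follows:

```c
/* Back-of-the-envelope CPU->GPU budget per frame, using only the
 * bandwidth figures quoted in this thread (not official specs). */
#include <stdio.h>

static void budget(const char *link, double gb_per_sec, double fps)
{
    double bytes_per_frame = gb_per_sec * 1e9 / fps;
    printf("%-13s %5.1f GB/s @ %2.0f fps -> %6.1f MB per frame (ceiling)\n",
           link, gb_per_sec, fps, bytes_per_frame / 1e6);
}

int main(void)
{
    budget("Xenon->Xenos", 10.6, 60.0);   /* figure quoted above */
    budget("Xenon->Xenos", 10.6, 30.0);
    budget("Cell->RSX",    20.0, 60.0);   /* figure quoted above */
    budget("Cell->RSX",    20.0, 30.0);
    return 0;
}
```

That's roughly 177MB per frame on the 10.6GB/s link at 60fps as an absolute ceiling, before any contention, overhead or CPU work is accounted for; achievable figures would be considerably lower.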

This thread has plenty of interest.

http://www.beyond3d.com/forum/showthread.php?p=555823#post555823

Xenos isn't DX10-compliant, but there appears to be a lot of shared capability.

Jawed
 
SubD said:
This desperate need to believe 'Sony had to put a PC graphics card in the PS3' is pathetic and tiresome.

I too don't understand why some people want to make it seem like Sony decided to put a Nvidia GPU in the PS3 in the summer of 2004.
 
speng said:
Geo, this merely states that a development house (SOE) will be using CG in their games.

Not a collaboration of Sony and Nvidia on a customized chip.

I feel that the length of time it's taking Sony and Nvidia to announce further details of RSX, and to get the final or reference toolset into developers' hands, bodes well for a more "refined" and customized GPU in the PS3.

Speng.

I understand. My point was that when the quote upstream said that NV and Sony have been cooperating on "aspects" of next-gen for three years, there is nothing there to require those "aspects" to have been RSX (which is what the quoter is representing to be the case). It could have been CG. We don't know.
 
geo said:
For instance, this article: http://news.com.com/2100-1040-935595.html which identifies Sony as one of the first adherents to CG when it was first announced in 2002.

There have been a lot of articles and instances where Sony worked jointly with Nvidia on things other than a GPU back then. The question is when Sony started looking at Nvidia for a GPU. That said, I personally don't think it was a spur-of-the-moment thing.
 
MrWibble said:
But I'm going to stick my neck out here and suggest that the XGPU is going to suffer exactly the same fate at pretty much the same time. While it may be more radically different to a PC GPU than RSX (and please note, I really don't know if that's true or not), even if it is, it's not using some kind of magic technology that's going to keep it ahead of the curve for any length of time.
No, but with DX10 looking likely to last a few years, having a console with a GPU that's reasonably close to DX10 is better than a console with a souped-up SM2a GPU.

Jawed
 
BenSkywalker said:
I think we could have gained a lot more insight into this matter if we had seen something like it before. Perhaps if ATi had ended up producing a GPU for one of the last gen consoles and paired it with eDRAM and an IBM processor and nVidia had used a slightly modified PC based GPU and put it into another console we could have had something to gauge things better by. Maybe that would have clearly demonstrated how vastly superior going with a custom designed chip is over using slightly modified versions of high end PC hardware.

Certainly it would have silenced all of the rabid loyalists who are trying to insinuate that somehow going with off the shelf parts is going to end up being anything but an absolute thrashing.

I am not saying that it will end up as an absolute thrashing.

If you took them side by side, the R5900i and the Xbox 1 CPU, you would clearly choose the XCPU any day of the week, weekends included, even though it lacks the MMI instructions, which are so nice to use. Does that mean that a console using the R5900i is doomed to be "anything but an absolute thrashing" ;)?
 
geo said:
I understand. My point was that when the quote upstream said that NV and Sony have been cooperating on "aspects" of next-gen for three years, there is nothing there to require those "aspects" to have been RSX (which is what the quoter is representing to be the case). It could have been CG. We don't know.

And the fact we don't know makes me question why people say it was a last-minute decision. It seems like some people want to make it a last-minute decision by default.
 
BlueTsunami said:
The question is when Sony started looking at Nvidia for a GPU. That said, I personally don't think it was a spur-of-the-moment thing.

Agreed. Do you have anything to point at to support that conclusion? Preferably something unambiguous, or at least less ambiguous than "aspects"? :smile:

Personally, I hope RSX is "all that and a bag of chips". I love great engineering, and awesome graphics. But even if it is not much more than G70 with a few tweaks, that is still a pretty impressive piece of hardware that can do good work.

What Wavey seems to be pointing at as a warning sign, if that is true, is not year 1 but years 3-5: that NV tends to be forward-looking on features, but not so much on performance for those features, so as devs catch up to that curve the performance won't be as good as you'd like. Balanced against that a bit is that G70 is a "second take" for NV, so it already includes some tweaks in that direction.
 
Jawed said:
No, but with DX10 looking likely to last a few years, having a console with a GPU that's reasonably close to DX10 is better than a console with a souped-up SM2a GPU.

Jawed....
 
Panajev2001a said:
Yes, to write directly from the SPE's ;).

Unfortunately I do not see it happening; I see SPEs DMA-ing into VRAM or into XDR memory and RSX reading from there. I think RSX's caches will be transparent to the SPEs and the PPE.
I expect, and hope, that RSX will appear to Cell like just another "client", practically like another Cell. This would mean that an SPE could initiate DMA directly to RSX without bothering with a trip to memory.

Jawed
 
mckmas8808 said:
I too don't understand why some people want to make it seem like Sony decided to put a Nvidia GPU in the PS3 in the summer of 2004.

Look, you might choose to believe that another company had the GPU contract basically in the bank before Sony/SCE announced the switch to the other contender.

It is even possible that, while courting SCE, nVIDIA was taking notes and planning some extra features (features that might have gone into G70 and into the plans they already had for G71: things not exposed through their current drivers that would receive their final tweaking in RSX... see how Jackson Technology/HyperThreading, and later x86-64 support, came to the Pentium 4: present in the silicon but not enabled, etc.). But I do think they received no official R&D funding until the deal between SCE and nVIDIA was finalized in 2004, when SCE switched away from the RS design to the nVIDIA solution. What that means is that the customization of G70 into RSX, aside from what nVIDIA was already planning, started later than the "2-3 years back" line they began using after the deal was officially announced.
 
Jawed said:
I expect, and hope, that RSX will appear to Cell like just another "client", practically like another Cell. This would mean that an SPE could initiate DMA directly to RSX without bothering with a trip to memory.

Jawed

Of course, I expect that to happen too; I was just mentioning the two ways of communicating between the CBEA and RSX chips that I had in mind:

1.) The SPE writes to XDR... RSX initiates a DMA to XDR and reads the data.

2.) The SPE sets up a DMA transfer to RSX's VRAM and executes the transfer with its MFC.
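
For what it's worth, here is a minimal SPE-side sketch of what option 2 might look like using the standard MFC intrinsics from the Cell SDK. The RSX-visible effective address (RSX_VRAM_EA) and the assumption that VRAM is simply mapped into the effective-address space are made up for illustration; nothing about the real PS3 memory map has been confirmed.

```c
/* SPE-side sketch of path 2: DMA straight from local store to a
 * (hypothetical) RSX-visible effective address via the MFC.
 * RSX_VRAM_EA is invented for illustration; the real PS3 memory
 * map and RSX aperture are not public.                          */
#include <spu_mfcio.h>
#include <stdint.h>

#define RSX_VRAM_EA  0x00000000C0000000ULL  /* hypothetical address */
#define DMA_TAG      3

/* 16KB of vertex data in local store, aligned for efficient DMA. */
static volatile uint8_t vertex_buffer[16384] __attribute__((aligned(128)));

void push_vertices_to_rsx(void)
{
    /* Queue the put: local store -> effective address ("VRAM").  */
    mfc_put(vertex_buffer, RSX_VRAM_EA, sizeof(vertex_buffer),
            DMA_TAG, 0, 0);

    /* Stall until every transfer in this tag group has completed. */
    mfc_write_tag_mask(1 << DMA_TAG);
    mfc_read_tag_status_all();
}
```

Option 1 would look much the same, except that the effective address would point into XDR and RSX would then have to fetch the data from there itself.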
 
I think the hidden goodies in an NVidia GPU will be all the OpenGL bells and whistles that NVidia likes to create.

With PS3 being a "closed OpenGL" platform, as it were, there should be plenty of stuff that's far in advance of what we'll see in OGL on PCs for a fair while.

Jawed
 
mckmas8808 said:
And the fact we don't know makes me question why people say it was a last-minute decision. It seems like some people want to make it a last-minute decision by default.

I wouldn't agree with that. There are indeed statements from NV that support the idea that there isn't much delta between G70 and RSX, which does point at the decision being late in the day. In fact, my impression is that, from a financials PoV, Jen-Hsun has been very happy to point at the Sony deal as being practically all margins. And you know how he loves his margins these days. :LOL:
 
To add to the notion that RSX isn't a completely new part: Nvidia is making chump change off the PS3 (compared to the Xbox). Sony basically paid them to use their current design and to take over the fabrication; Nvidia still gets a small royalty. So apparently Nvidia didn't spend much on RSX development, since they basically sold Sony their previous work (G70).

This idea can be reconciled with all the current Sony and Nvidia press comments. RSX and G70 were in design concurrently, with the former very closely related to the latter. So RSX isn't a "bolted-on PC part" but rather a G70-heritage GPU for the PS3. In the end they'll probably be 90% similar.
 
geo said:
I wouldn't agree with that. There are indeed statements from NV that support the idea that there isn't much delta between G70 and RSX, which does point at the decision being late in the day. In fact, my impression is that, from a financials PoV, Jen-Hsun has been very happy to point at the Sony deal as being practically all margins. And you know how he loves his margins these days. :LOL:
What exactly do you consider late in the day? Like what month and year?
 