PlayStation 3 GPU: NV40 and NV50 Hybrid with XDR DRAM...

More rumors. From what they say here, this sounds similar to the GeForce 3/4 tech that went into the Xbox GPU.

Anyway, I won't be surprised if NVIDIA supplies a complete graphics chip and not just "dumb raster functions".
 
Let me sum up this article for the people who followed the PS3 GPU minutiae and who don't have the time to read it:
Worthless reading. They just translated the speculation and educated guesses made in a PC Watch article.
"The PS3 GPU would be designed using current and next-gen technologies..." They couldn't be more precise than this; it's what one would call crystal-clear information...

Anyway, the only purpose of this article was, it seems, to talk about XDR and its possible implementation in PC graphics cards. The PS3 GPU was just a pretext for talking about rumored Nvidia parts using XDR.

In other words, worthless talk (with regard to the PS3 GPU discussion)...
 
leechan25 said:
I don't think so. Sony will not let them just pull a PC-based GPU off the shelf and slap it into the PS3.


agreed.


Anyway, I am hoping to see a Graphics Synthesizer 3 / NV50 hybrid. That is, a GPU that combines the best aspects of the GS with the best aspects of NVIDIA GPUs.

*Massive parallelism (Sony)
*Massive on-chip bandwidth thanks to eDRAM (Sony & Toshiba)
*Tremendous polygon & pixel performance (Sony)
*Lots of hardwired rendering features (NVIDIA)
*4th-generation pixel shaders (NVIDIA)
*High image quality (NVIDIA)
 
pc999 said:
So, you don't believe in this more or less than you used to believe in an NV GPU in the PS3 ;)

It is not that they do not believe in this; it is that the words from nVIDIA (50+ of nVIDIA's best engineers dedicated to taking the NV5X architecture and customizing it for use in PlayStation 3) and the fact that Sony/SCE will be involved in this customized media processor put the "this is only a slightly modified NV50" line of thinking in doubt.

The decision to go with nVIDIA was not THE decision until a few weeks ago: can you really blame people for wanting nVIDIA on board, but having almost lost hope?
 
Qroach said:
More rumors. From what they say here, this sounds similar to the GeForce 3/4 tech that went into the Xbox GPU.

GeForce 4/FX you mean.

leechan25 said:
I don't think so. Sony will not let them just pull a PC-based GPU off the shelf and slap it into the PS3.

Instead, you think Sony would rather dictate that NVIDIA scrap all their previous work and go down a path that they have no prior knowledge of? If Sony were to dictate a route, then I would suggest they would have their own graphics processor in there; there's little point in dictating the graphics route to a partner you have brought in because of their expertise on the graphics path they have been following.
 
DaveBaumann said:
Instead, you think Sony would rather dictate that NVIDIA scrap all their previous work and go down a path that they have no prior knowledge of? If Sony were to dictate a route, then I would suggest they would have their own graphics processor in there; there's little point in dictating the graphics route to a partner you have brought in because of their expertise on the graphics path they have been following.

Almost what MS did to ATI ;).

Not exactly, but...
 
It'll be interesting to see how the R520 (PC) and R400==>R500 (Xenon) compare, although we won't get to see a direct comparison because one is a PC part and the other is a console part.
 
I mean that, if they can put a next-gen chip in XB2, they could (if they had an API) put that in a graphics card ... at the same time.
 
The NV50 will be a damn fine partner for the Cell chip.

You have to figure it's at least 3 times the raw speed of the NV40, and its shader capabilities will most likely be enhanced.


I don't get why you people think NVIDIA is going to design a whole new part for Sony. They are just going to get a modified part from them.
 
jvd said:
The NV50 will be a damn fine partner for the Cell chip.

You have to figure it's at least 3 times the raw speed of the NV40, and its shader capabilities will most likely be enhanced.


I don't get why you people think NVIDIA is going to design a whole new part for Sony. They are just going to get a modified part from them.

My sentiments exactly. It's those hardcore Sony aficionados who wanted to believe in a radically different and completely unique architecture. This is more than sufficient coupled with the CPU, IMO.
 
DaveBaumann said:
Sorry? MS are getting a part that ATI developed two years ago (not exact, but...)
Wouldn't the R&D lead times for ATI and nVidia be roughly the same? If so, wouldn't both the R500 and NV50 have had similar periods of development? Or was the R400 (now R500), for all practical purposes, completed two years ago? Ready, but just not practical to produce?
 
jvd said:
You have to figure it's at least 3 times the raw speed of the NV40

I seriously doubt that. Memory bandwidth isn't climbing fast enough in external memories, and even if there's a massively wide and multi-ported eDRAM interface, there's no point in having such an enormous fillrate. That's roughly 20 Gpix/s or MORE, you know; what are you going to draw at TV resolution that needs that much filling???

Shader speed will likely be greatly increased, but I dunno about 3x. That would more than likely mean MORE than a 3x investment in transistors to get 3x the performance (perhaps much more than 3x, too), and the pixel shader blocks in NV40 are already very large consumers of transistors.

If I were to guess, I'd expect fewer pixel pipes than upcoming PC chips, perhaps running at a much higher clock speed, with more pixel shading execution units per pipe.
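
(A quick back-of-the-envelope check of that "roughly 20 Gpix" figure; the NV40 pipe count and clock, the literal 3x scaling, and the 640x480 @ 60 fps TV target are assumed numbers for illustration, not anything confirmed about the actual part.)

```python
# Sketch: what "3x NV40 raw speed" would mean in fillrate terms at TV resolution.
# All figures below are assumptions for illustration only.

nv40_pipes = 16                          # GeForce 6800 Ultra class pipe count
nv40_clock_hz = 400e6                    # ~400 MHz core clock
nv40_fill = nv40_pipes * nv40_clock_hz   # ~6.4 Gpix/s peak fillrate
hypothetical_fill = 3 * nv40_fill        # "3x NV40" -> ~19.2 Gpix/s ("roughly 20 Gpix")

tv_pixels = 640 * 480                    # one frame at standard TV resolution
fps = 60
display_rate = tv_pixels * fps           # ~18.4 Mpix/s actually shown on screen

overdraw_budget = hypothetical_fill / display_rate
print(f"Hypothetical peak fillrate: {hypothetical_fill / 1e9:.1f} Gpix/s")
print(f"Peak overdraw/multipass budget: ~{overdraw_budget:.0f} layers per pixel per frame")
```

Under those assumptions the peak budget works out to roughly a thousand layers of fill per displayed pixel per frame, which is the point being made about TV resolution.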
 
Guden Oden said:
jvd said:
You have to figure it's at least 3 times the raw speed of the NV40

I seriously doubt that. Memory bandwidth isn't climbing fast enough in external memories, and even if there's a massively wide and multi-ported eDRAM interface, there's no point in having such an enormous fillrate. That's roughly 20 Gpix/s or MORE, you know; what are you going to draw at TV resolution that needs that much filling???

Shader speed will likely be greatly increased, but I dunno about 3x. That would more than likely mean MORE than a 3x investment in transistors to get 3x the performance (perhaps much more than 3x, too), and the pixel shader blocks in NV40 are already very large consumers of transistors.

If I were to guess, I'd expect fewer pixel pipes than upcoming PC chips, perhaps running at a much higher clock speed, with more pixel shading execution units per pipe.


Fillrate is not only used for what you actually see on screen, you know, or else there would be little point in having the kind of fillrate we already have today, "since they have to run on normal TVs"...
I'm sure no developer in the world will ever complain about having "too much fillrate". If it's there, you can trust they will find ways to exploit it, like they did on PS2, with decent results.
 